Artificial Intelligence (AI) and the independent non-executive director

When carrying out board reviews, we have seen that artificial intelligence (AI) is playing an increasingly important role in the life of companies. However, while we see boards requesting information and challenging executives on how AI is being, or will be, deployed in the business, few boards seem to be considering how AI could be used to improve their own effectiveness.


NEDs cannot ignore AI.


NEDs can be forgiven for feeling uncertain about how they can use AI to support their governance roles. The term AI captures a wide range of concepts, from Assisted AI, which supports humans in making decisions, through Augmented AI, which can help in the creation of new ideas, to systems with the potential to make their own decisions without human intervention.


Many directors, even those with a technology background, are unlikely to be experts in AI, but it is incumbent on them to have a reasonable understanding of how AI works, the benefits it could bring, the risks it poses and how those risks can be mitigated.


Directors cannot ignore AI: legislators and others are already questioning whether a board’s failure to use AI may in future be seen as evidence of negligence.


What can AI do for NEDs?


Even basic forms of AI can help NEDs. A complaint we often hear from NEDs, particularly in regulated businesses, is the time spent at board meetings reviewing reported materials. Carefully calibrated AI could be used to manage this burden, streamlining the identification of issues and helping a board to be more focussed.


AI can aid non-executive directors in searching and analysing commercial and operational data, enabling them to gain a deeper understanding of the day-to-day activities of the business and enhance the support, challenge, and constructive feedback they provide to the executive team.


The potential uses of AI are many and varied. In a short article it is not possible to discuss them all, but examples include:


  • AI’s ability to review large amounts of data can support risk analysis, enabling a more effective assessment of both existing and future risks and supporting both the company’s risk function and its risk committee.
  • A board could use AI to undertake due diligence on a proposed strategic partner, searching public data sources ahead of a decision to agree a joint venture.
  • Autonomous Intelligence, where a machine makes decisions independently but operates only within a predefined range, could be used to complete certain tasks, such as researching, negotiating and agreeing utility contracts, more effectively, more efficiently and at a fraction of the cost of a human team.
  • AI could also support board committees, for example by searching audit data for the audit committee, or by enabling the nomination committee to search for, and, subject to legal requirements, carry out due diligence on, potential new board members.
  • AI could help boards get closer to the needs and aspirations of a company’s most important asset, its people, by assessing staff data and surveys.


AI risk for NEDs


Although AI offers NEDs many potential benefits, there are also significant risks for boards. Poor or biased data will lead to flawed outcomes; the old adage ‘rubbish in, rubbish out’ applies to AI just as it does to established technology. Boards need to assure themselves of the quality of any data used by the company’s AI capability.


Faulty or sub-standard AI technology could be disastrous for a company. A board will need to complete due diligence on any potential AI capability before it gets anywhere near adoption; this may mean an external assessment by a suitably qualified independent third party. Boards also need to consider how to manage AI acquisition risk. There is no set answer, but creating an AI sub-committee, appointing an AI lead or establishing an AI working party may all be options. Given the importance of AI and the risks associated with it, the board may also want to take a particular interest in the hiring of individuals with AI expertise.


Directors need to be able to explain their decisions. Some ‘closed’ AI technology teaches itself and can make decisions without human input. Directors may feel uncomfortable relying on such technology other than in very specific, lower-risk situations. One question a NED may want to ask is: ‘if, as a director, I rely on a black box capability, how would I explain my decision to a regulator if challenged?’


Boards should also consider establishing a protocol requiring directors to disclose when they have independently chosen to use AI to assist them in performing their role, and to use only AI systems that have been approved by the company.


The ability of NEDs to use AI to process and reassess data carries the risk that a NED may be tempted to second-guess decisions made by executives. NEDs will, more than ever, need to be mindful that their role is to support and constructively challenge executives, not to step into the executive arena.


AI development is in a state of permanent revolution; keeping abreast of it is likely to be a challenge for NEDs for the foreseeable future.


There has been some speculation that more complex AI will eventually remove the need for human directors, but we are not there yet. AI has no legal personality, and company law still expects to see human judgement when decisions are made and human liability if things go wrong. For the time being at least, directors will continue to be held accountable for board decisions.


Further reading:

FRC: Insight report: AI, Emerging Tech and Governance

House of Lords Library: Artificial Intelligence: Development, risks and regulation



Peter Snowdon is a legal and corporate governance expert, with a particular interest in issues affecting financial services firms, banks and investment firms. A former partner at Norton Rose, he also worked for the Financial Services Authority (FSA) prior to joining Bvalco.
