Leading the Future – New AI Governance Guidance for Australian Directors

Written By

Aaron Chan

Special Counsel
Australia

I am a Special Counsel in our Corporate Group, based in the Sydney office. I have experience in a broad range of corporate transactions, covering both mergers and acquisitions and equity capital markets.

Carl Ritchie

Associate
Australia

I am an Associate in the Sydney Corporate Group practising in M&A (public and private), capital markets, securities regulation and corporate advisory, with a focus on the energy & utilities and retail & consumer sectors.

With artificial intelligence (“AI”) on the agenda of countless boardrooms across Australia, the Australian Institute of Company Directors (in partnership with the Human Technology Institute at the University of Technology Sydney) has published the following suite of resources (“AI Governance Guidance”) to assist boards in navigating the use of this transformative technology in an ethical and informed manner:

1. A Director’s Introduction to AI, comprising three chapters aimed at familiarising directors with essential AI concepts and issues:

  • an introduction to AI, its applications and relevance for directors (Chapter 1);
  • a discussion of AI opportunities and risks (Chapter 2); and
  • an exploration of Australian and international regulatory obligations related to AI systems (Chapter 3).

2. An AI Governance Checklist, aimed at directors of small and medium-sized enterprises (SMEs) and not-for-profit organisations (NFPs), listing recommended steps to be taken at different stages of the AI governance development process.

3. A Director’s Guide to AI Governance, aimed at directors of ASX300 entities who are using (or considering using) AI within their organisations. The director’s guide comprises the following two sections:

  • Section 1, which focuses on insights and implications related to AI governance for directors; and
  • Section 2, which sets out case studies and the key questions directors should be asking of management (and themselves) with regard to existing organisational AI practices. It also sets out the following governance elements and steps identified as key to establishing an effective, safe and responsible AI governance framework:

For each governance element below, the guide sets out what boards should be doing.

Roles and Responsibilities

  • Identifying the management and board individual/body accountable for AI decision-making.
  • Identifying those involved in, and responsible for, AI system procurement, development and use.
  • Considering whether decision-making processes applied by key accountable persons incorporate consideration of AI risk and opportunity.

People, Skills & Culture

  • Verifying that management has assessed the organisation’s AI skills, capabilities and training needs, and implementing upskilling programs (including at the director level).
  • Discussing the potential for AI to impact the workforce and workforce planning.
  • Considering how AI governance structures can incorporate a diversity of perspectives, including expert views, to aid diversity of thought and avoid ‘group think’.

Governance Structures

  • Determining which existing or new board and management governance structure would most appropriately support AI oversight.
  • Reviewing board and management committee charters to determine whether and how they incorporate AI issues.
  • Considering how external experts can be leveraged within existing governance structures.
  • Considering the nature and frequency of management reporting to the board/relevant board committee.

Principles, Policies & Strategy

  • Requiring that AI is considered and, where appropriate, embedded within the organisation’s strategy. AI use should have a clear business value – ‘AI for AI’s sake’ should be avoided.
  • Engaging with management to discuss how safe and responsible AI principles have been incorporated into relevant policies (such as AI/IT use, privacy, confidentiality and cyber security).
  • Recognising that principles and policies need to be proactively implemented and enforced across the supply chain.

Practices, Processes & Controls

  • Working with management to understand what controls are in place for AI use (e.g. risk appetite statement and risk management framework).
  • Confirming with management that there are processes in place to assess supplier and vendor risk.
  • Monitoring and regularly reviewing the effectiveness of controls.

Stakeholder Engagement & Impact Assessment

  • Identifying and engaging with stakeholders to understand AI’s impact and stakeholder expectations of AI use and governance.
  • Confirming with management that AI system design and assessment processes incorporate accessibility and inclusion practices.
  • Considering whether AI-generated results/outcomes are explained to stakeholders and whether an appeal process is available.

Supporting Infrastructure

  • Confirming awareness of where, within the organisation, AI is currently being used.
  • Verifying that management is aware of, and has a robust data governance framework in place to manage, the data collected and stored by the organisation to train AI systems.
  • Focusing on increasing transparency to end users about how the organisation's AI systems use data.

Monitoring, Reporting & Evaluation

  • Confirming that a risk-based monitoring and reporting system for mission-critical and high-risk AI systems is in place.
  • Developing and implementing a monitoring and reporting framework with metrics and outcomes to track and measure progress.
  • Considering seeking internal and external assurance.

Key Takeaway

The AI Governance Guidance is not intended to “cover the field”; rather, it aims to provide boards with foundational knowledge of AI and a suggested framework for overseeing its use.

With approximately two-thirds of Australian organisations using (or planning to use) AI technology,[1] company directors must quickly become familiar with how their organisation incorporates AI, the risks involved and their evolving responsibilities in light of the common law and statutory duties (“Directors’ Duties”) they owe to their company to:

  • act with due care, skill and diligence in exercising their powers and carrying out their functions as a director (section 180 of the Corporations Act 2001 (Cth) (“Act”)); and
  • exercise and discharge their duties as director in good faith in the best interests of the company and for a proper purpose (section 181 of the Act).

As AI technology, policy and regulation continue to evolve at a rapid pace, the AI Governance Guidance is a valuable resource to assist directors in discharging their Directors’ Duties in a manner that balances the strategic and competitive benefits of employing AI technology in their organisations against the inherent risks of its use.

[1] Human Technology Institute, The State of AI Governance in Australia (Report, May 2023) 12 <https://www.uts.edu.au/sites/default/files/2023-05/HTI%20The%20State%20of%20AI%20Governance%20in%20Australia%20-%2031%20May%202023.pdf>.
