Use of AI within the energy sector – Ofgem’s proposals and call for input

Written By

Kathryn Parker

Associate
UK

I am an associate in our International Commercial Group based in London. Having spent six months training in our Energy & Utilities sub-group and six months in our Technology Transactions sub-group, I have experience across a broad range of commercial contracts and projects, ranging from international SaaS projects through to district heating supply projects.

Michael Rudd

Partner
UK

I am a projects and regulatory partner and Chair and Co-Head of our International Energy & Utilities Sector Group, focused on energy innovation. My work has taken me around the globe.

Ofgem has put forward its initial views on the existing regulatory framework for the use of AI within the energy sector in a Call for Input published on 5 April 2024. While Ofgem suggests that existing regulation is adequate to capture the use of AI, it believes that producing regulatory guidance on the risk-based use of AI in the energy sector will prove valuable. Ofgem is therefore seeking views on its proposals for how AI should be used safely and responsibly within the energy sector in a way that fosters innovation. Organisations in the energy sector that either use or are looking to use AI, technology companies, AI developers and those working on AI policy, amongst others, are invited to respond.

Integrating AI into the energy industry is expected to raise unique challenges for the regulation of energy. The regulatory implications are wide-reaching and will need to be considered in full: for example, supply chain duties, the explainability of AI decisions, data privacy, cybersecurity, governance and accountability. Overseeing and assessing violations of energy regulations linked to AI could also prove complex; in particular, detecting compliance and breaches may be difficult under current approaches.

The Department for Science, Innovation and Technology’s White Paper on AI regulation, published in March 2023, set out five principles to be applied by sector regulators. Ofgem has rooted its proposals in these principles:

  1. Safety, security, and robustness
  2. Transparency and explainability
  3. Fairness
  4. Accountability and governance
  5. Contestability and redress

The aim is to develop an outcome-based approach to the regulation of AI within the energy sector: one that is proportionate and grounded in the principles considered important to managing risk, rather than prescriptive rules on how AI is applied.

The Call for Input presents a suggested risk framework designed to help entities understand Ofgem’s perspective on risk. It aims to direct ‘dutyholders’, such as energy sector licensees and other organisations contemplating AI adoption, towards proportionate actions that prevent or reduce the likelihood of failures.

Areas considered by Ofgem and key identified risks, proposals, and recommendations

Consumer

Key identified risks: the potential for AI to cause discrimination, inequality and bias.

Key Ofgem proposals / recommendations: Ofgem emphasises the importance of the principles of fairness, transparency and explainability. Ofgem concluded that, in addition to an overarching definition of fairness, an analysis to assist in identifying unfairness associated with an AI application is important.
Market

Key identified risks: “tacit collusion” – the concern that algorithms, including AI, may provide a way for companies to attain and maintain collusion without any explicit agreement or human communication. Automated mechanisms using deep learning techniques enable companies to track prices, introduce common policies, send market signals and optimise joint profits. A toy illustration of this dynamic follows at the end of this entry.

Key Ofgem proposals / recommendations: Ofgem’s AI taskforce and the Competition and Markets Authority are exploring this issue further. Suggested solutions include:

  • traditional existing ex-post measures, such as market studies (which can lead to non-binding recommendations) and market investigations (which can lead to the imposition of structural and/or behavioural remedies); and
  • alternative, more interventionist approaches, such as: the adaptation of ex-ante merger control; audits of models, algorithms, data and decisions; and reconsideration of the legal approach to agreement and tacit collusion.
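To make “tacit collusion” concrete, here is a minimal sketch – our own illustration, not taken from the Call for Input or the CMA’s work. Two independent Q-learning pricing agents repeatedly set prices in a simulated duopoly, each observing only last round’s prices. The price grid, demand model and learning parameters are all hypothetical.

```python
# Toy illustration (not from the Call for Input): two independent Q-learning
# pricing agents in a repeated duopoly. The price grid, the linear demand
# model and the learning hyperparameters are all hypothetical.
import random
from collections import defaultdict

PRICES = [1.0, 1.5, 2.0, 2.5, 3.0]  # discrete price grid (hypothetical)
COST = 1.0                          # marginal cost
ALPHA, GAMMA = 0.1, 0.95            # learning rate and discount factor
eps = 1.0                           # exploration rate, decayed over time

def profits(p1, p2):
    """Per-round profits under a simple linear demand model (hypothetical)."""
    def demand(own, rival):
        return max(0.0, 10 - 4 * own + 2 * rival)
    return (p1 - COST) * demand(p1, p2), (p2 - COST) * demand(p2, p1)

# One Q-table per firm; the state is last round's pair of prices.
q = [defaultdict(float), defaultdict(float)]

def choose(firm, state):
    if random.random() < eps:                               # explore
        return random.choice(PRICES)
    return max(PRICES, key=lambda a: q[firm][(state, a)])   # exploit

state = (random.choice(PRICES), random.choice(PRICES))
for t in range(200_000):
    actions = (choose(0, state), choose(1, state))
    rewards = profits(*actions)
    for firm in (0, 1):
        best_next = max(q[firm][(actions, a)] for a in PRICES)
        key = (state, actions[firm])
        q[firm][key] += ALPHA * (rewards[firm] + GAMMA * best_next - q[firm][key])
    state = actions
    eps = max(0.01, eps * 0.99997)  # slowly stop exploring

# In runs of this kind, learned prices frequently settle above the one-shot
# competitive level, despite the agents never communicating.
print("final prices:", state)
```

The point of the sketch is that reward-maximising agents reacting only to each other’s prices can learn patterns that sustain prices above the competitive level with no communication or agreement, which is part of why detection under existing ex-post tools is challenging.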
Company

Key identified risks: effective oversight of AI systems by energy companies; and appropriate routes for customers to contest a negative AI outcome and seek redress.

Key Ofgem proposals / recommendations: Ofgem’s proposals include, amongst others:

  • introducing risk-based corporate governance principles, which include ensuring companies have in place: organisational AI competency; testing models and procedures; regular review and internal audit; and ways to educate users about the use of AI and to identify errors;
  • clarifying existing routes of contestability and redress and implementing new routes where necessary;
  • establishing clear lines of accountability across the AI life cycle;
  • ensuring good practice is adopted throughout the supply chain; and
  • ensuring effective data management, data architecture and risk management infrastructure.

Ofgem also set out a table which outlines the various aspects of accountability and governance for effective use of AI in relation to different responsible roles and groups, including board members, management, project teams, life cycle stakeholders, supply chain participants, sector representatives and regulatory bodies. 

Sustainability

Key identified risks: AI systems require significant energy and water resources due to their complex design, extensive training data and frequent user interactions. AI’s global demand may require 4.2 to 6.6 billion cubic metres of water withdrawal by 2027 – half of the UK’s annual use. Without sustainable practices, it has been warned that AI is likely to consume more energy than the human workforce by 2025.

Key Ofgem proposals / recommendations: Ofgem’s suggested solutions include:

  • choosing data centres and model training approaches that contribute to the sustainability of AI; and
  • controlling and reducing energy usage through load shifting away from peak hours (a simple illustration follows below).
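To illustrate the load-shifting point, the sketch below – our own example, not a method prescribed by Ofgem – schedules a flexible AI training job into the cheapest contiguous window of a hypothetical day-ahead half-hourly price curve. All prices and job parameters are invented for illustration.

```python
# Minimal load-shifting sketch (our illustration, not an Ofgem method):
# run a flexible AI training job in the cheapest contiguous window of a
# hypothetical day-ahead half-hourly price curve.

HALF_HOURS = 48  # one day of half-hourly settlement periods
# Hypothetical day-ahead prices in GBP/MWh: higher in the evening peak.
prices = [60] * 14 + [90] * 4 + [70] * 16 + [140] * 6 + [80] * 8

JOB_LENGTH = 8   # the job needs 8 consecutive half-hours
LOAD_MW = 2.0    # average draw while running

def cheapest_window(prices, length):
    """Return the start index of the contiguous window with the lowest total price."""
    return min(range(len(prices) - length + 1),
               key=lambda s: sum(prices[s:s + length]))

start = cheapest_window(prices, JOB_LENGTH)
cost = sum(prices[start:start + JOB_LENGTH]) * LOAD_MW * 0.5  # 0.5 MWh per MW per half-hour
print(f"run job in periods {start}..{start + JOB_LENGTH - 1}, cost GBP {cost:.2f}")
```

In practice the signal driving the schedule could equally be grid carbon intensity rather than price, shifting consumption towards periods of surplus renewable generation.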


Key takeaways

Although Ofgem provides reassurance that existing regulation is adequate, considerable effort still appears to be required to work through the various legal and regulatory complexities described, in particular with regard to AI collusion, liability and the AI supply chain, and sustainability. There are also a number of wider implications for the energy sector discussed in the Call for Input that deserve equally thorough consideration, for example: the collection and processing of large amounts of data by AI, raising concerns around data privacy and security; human rights and equality; the protection of critical systems from cyber attacks; health and safety; confidentiality; and the impact on the workforce, including possible influences/biases or errors in data, or miscorrelations due to insufficient training, data or coding mistakes. What stands out from the Call for Input is Ofgem’s cooperation with other regulators and key industry bodies, e.g. the Office for AI (DSIT), the CMA (see our article ‘Guiding the development of AI: the CMA’s initial report on AI Foundation Models’ here), the ICO, the EHRC, the National Cyber Security Centre and the AI Safety Institute, demonstrating a considered and collaborative approach.

The Call for Input closes at 11:59pm on 17th May 2024.
