The FCA publishes its expectations of UK financial services firms adopting or deploying AI

Financial services is an area where consumers, businesses and regulators alike are harnessing the power of digitisation and AI. Financial services firms are using automated processes in approving loans or in trading securities. Payments firms are using AI tools for transaction monitoring and to address the increasing problem of fraud. Regulators are using AI to tackle scams.

It is unsurprising, then, that innovation in AI is a key focus for the FCA – not only to manage risks within the industry surrounding accessibility and security, but also to improve its own use of AI to detect complex forms of market abuse. In light of this, on 22 April 2024, the FCA published its AI update.

The FCA’s focus

The FCA’s update welcomes the Government’s pro-innovation strategy on AI and outlines the FCA’s roles and objectives in relation to AI, its work so far, its existing approach, and its plans for the next 12 months.

Recognising that financial services are already at the forefront of digitisation and that AI could significantly transform the way firms serve their customers and clients in both retail and wholesale financial markets, the FCA emphasises that it is a technology-agnostic, principles-based and outcomes-focused regulator which does not typically mandate or prohibit specific technologies.

In particular, the FCA is focused on how firms can safely and responsibly adopt the technology as well as understanding what impact AI innovations are having on consumers and markets. It sees beneficial innovation as a vital component of effective competition, and is taking a proactive approach to understanding emerging technologies.

What has the FCA done so far?

Discussion Papers/Surveys: The FCA has already published a number of documents in relation to its approach to AI.

  • The AI Discussion Paper (2022) (published jointly with the Bank of England and the PRA) acknowledges the potential AI has to make financial services and markets more efficient, accessible and tailored, whilst still noting the regulatory risks and the need for the three organisations to collaborate to mitigate these risks.
  • The Feedback Statement (2023) acknowledges the largely supportive responses from stakeholders to the 2022 AI Discussion Paper.
  • The AI Public-Private Forum Final Report (2022)
  • The 2019 and 2022 machine learning surveys set out the increasing use of machine learning applications, along with the benefits, risks and constraints of machine learning from the perspective of financial services firms.

Collaboration with other regulators: The FCA collaborates closely with the ICO, CMA and Ofcom through the Digital Regulation Cooperation Forum (DRCF) to supervise and research the impact of digital technologies. 

FCA Innovation Hub: The FCA Innovation Hub includes support services to help firms launch innovative products and services. This includes:

  • Regulatory Sandbox: allows firms to test innovative propositions in the market with real consumers
  • Innovation Pathways: helps firms understand how regulation relates to their activities 
  • Digital Sandbox: offers GDPR-compliant datasets, mentorship from industry experts, and access to the FinTech Community to help enable experimentation and scaling for proofs of concept

Existing approach

Approach to the Government’s five principles: The Government identified the following five principles as key to the regulation of AI in the UK. The FCA’s AI update discusses each principle in turn, highlighting how existing FCA rules and guidance are relevant to complying with it.

  1. Safety, security, robustness
    The FCA refers to its Principles for Businesses (such as due skill, care and diligence), its Threshold Conditions, and its Senior Management Arrangements, Systems and Controls (SYSC) sourcebook.

    The FCA also highlights the relevance of its work on operational resilience, outsourcing and critical third parties (CTPs), with reference to the SYSC 15A requirements, which aim to ensure relevant firms are able to respond to, recover and learn from, and prevent future operational disruptions.

    The FCA is engaging with the CMA on its market investigation into cloud services and is carrying out joint consumer research on generative AI with the CMA through the DRCF.

  2. Fairness
    The FCA states that its regulatory approach to consumer protection is particularly relevant to fairness in firms’ use of AI systems, and reminds firms to consider their obligations under the Consumer Duty.

    The FCA refers to its Guidance for firms on the fair treatment of vulnerable customers, which expects firms to take vulnerable consumers into account at all stages of the product and service design process, including the potential outcomes for those consumers. For instance, while AI chatbots can help customers understand products or services, a risk to this principle is that some customers could be excluded from the market altogether as a result of biases embedded through the use of AI in risk assessments.

    Furthermore, the FCA reminds firms of their obligations under other regulations and laws, such as the GDPR and Equality Act.

  3. Appropriate transparency and explainability
    Whilst the FCA’s regulatory framework does not specifically address the transparency or explainability of AI systems, the FCA refers once again to its consumer protection approach (such as the Consumer Duty to act in good faith, which is characterised by honesty, fair and open dealing with retail consumers).

  4. Accountability and governance
    The FCA refers again to its Principles for Businesses, Threshold Conditions, and SYSC sourcebook, as well as the Senior Managers and Certification Regime (SM&CR). These all emphasise the importance of accountability and of having a clear organisational structure with well defined, transparent and consistent lines of responsibility; effective processes to identify, manage, monitor and report risks; and internal control mechanisms such as sound administrative and accounting procedures.

  5. Contestability and redress
    The FCA reminds firms that they remain responsible for ensuring compliance with the FCA’s rules, including in relation to consumer protection. Where a firm’s use of AI results in a breach of such rules (e.g. because an AI system produces decisions or outcomes which cause consumer harm), there are a range of mechanisms through which firms can be held accountable and through which consumers can get redress.

    For example, firms are required to maintain their own complaints handling procedures to ensure that complaints (including those about AI decisions relating to financial services) are handled fairly and promptly.

    The FCA points to Chapter 1 of the Dispute Resolution: Complaints sourcebook (DISP), which contains rules and guidance detailing how firms should deal with complaints.

The FCA’s own use of AI

In its Data Strategy (2022), the FCA laid out its aim of becoming a digital and data-led regulator. According to the FCA, AI has already transformed the speed with which it can monitor and tackle scam websites, money laundering and sanctions breaches. For instance, the FCA currently uses web scraping and social media tools that are able to detect, review and triage potential scam websites.

The FCA also has an Advanced Analytics unit, which is using AI to develop additional tools to protect consumers and markets, such as tools to monitor scam websites and an in-house synthetic data tool for Sanctions Screening Testing that has transformed the FCA’s assessment of firms’ sanctions name screening systems.

The FCA is particularly interested in how AI can help identify more complex types of market abuse that are currently difficult to detect, such as cross-market manipulation.

FCA’s plan for the next 12 months

Over the next 12 months, the FCA plans to:

  1. Further its understanding of AI deployment in UK financial markets to ensure any potential future regulatory adaptations are proportionate to the risks and to create a framework for beneficial innovation. The FCA is currently involved in diagnostic work on the deployment of AI across UK financial markets and will run a third edition of the machine learning survey with the Bank of England.

  2. Build on existing foundations – particularly as LLMs are making regulatory regimes, such as those relating to operational resilience, outsourcing and critical third parties, even more central to the FCA’s analysis.

  3. Collaborate with the PSR, the Bank of England, regulated firms, civil society, academia and international peers to build its empirical understanding and intelligence and create consensus on best practice and potential future regulatory work.

  4. Further prioritise international engagement on AI. The FCA is closely involved in the work of the International Organization of Securities Commissions (IOSCO), including the AI working group, and supports the work of the Financial Stability Board (FSB). The FCA is also a core participant in other multilateral forums on AI, including the Organisation for Economic Co-operation and Development (OECD), the Global Financial Innovation Network (GFIN) and the G7. 

  5. Develop the pilot AI and Digital Hub with DRCF member regulators.

  6. Invest further in AI technologies to proactively monitor markets, including for market surveillance purposes, and explore potential further use cases, such as using Natural Language Processing to aid triage decisions, assessing AI to generate synthetic data, and using LLMs to analyse and summarise text.

  7. Carry out further research. For example, as part of the DRCF Horizon Scanning & Emerging Technologies workstream in 2024-2025, the FCA will conduct research on deepfakes and simulated content following engagement with stakeholders.

Our Financial Regulation team will be monitoring next steps and will keep you up to speed with the latest developments on how the financial services sector is facing the opportunities and challenges presented by AI.
