Online safety

Latest developments

The Online Safety Act is a new piece of UK legislation which received Royal Assent on 26 October 2023. The Act seeks to regulate user-to-user and search services and requires them to act against illegal content and content harmful to children. User-to-user services are services that allow users to post content and interact with each other online. Those in scope include a broad range of services available on the internet, such as social media, online platforms, online games and video-sharing apps.

The Act is overseen and enforced by Ofcom, which is in the process of publishing its initial draft codes of practice and guidance in a three-phase approach. Ofcom anticipates that final versions of its first codes of practice and guidance will be published and come into force from early 2025 onwards.

Summary

The Online Safety Act seeks to regulate online providers of user-to-user and search services, as well as providers of pornography services. It will impose a duty of care on them to conduct risk assessments and take proportionate measures to deal with risks.

The duty of care will require online providers to address various types of risks, including illegal content and content harmful to children. Providers will need to take steps to remove harmful content, prevent its re-upload, and ensure that users are protected from such content. They will also need to assess the likelihood that children will use their services and, where relevant, use measures to prevent children from accessing inappropriate or harmful content. As such, the Act places a strong emphasis on appropriate age assurance: providers must assess whether children use their services and, where they do, restrict children from coming into contact with inappropriate content or parts of a service.

How could it be relevant for you?

The Act has different tiers of regulation, with separate regimes for user-to-user and search services. It imposes additional duties on very large user-to-user services (Category 1 services), large search services (Category 2A services) and large user-to-user services (Category 2B services). Ofcom has published the advice it provided to Government with suggested thresholds for these categories based on scale of business and service functionality. Further legislation from Government will then set these thresholds in due course.

The core duties in the Act are supported by ‘voluntary’ codes of practice alongside additional guidance for providers. The codes of practice mirror the core duties within the Act including those in relation to: 

  • Illegal harms
  • Protecting children 
  • Additional duties applicable to categorised services

Ofcom’s codes are not mandatory and providers can determine their own means of compliance but, in practice, it may be difficult to demonstrate that a service’s own compliance measures are as effective as the measures found in the codes. As services plan their compliance, they should be aware that adhering to the codes will be treated by Ofcom as a way of meeting obligations under the Act, and that Ofcom’s guidance documents give examples of measures that Ofcom deems insufficient.

Various provisions of the Act can impose liability on directors and officers for failure to comply with the law, so senior-level engagement with compliance obligations will be an important consideration, as will deciding who in the organisation will be appointed to take responsibility for online safety issues.

Steps can then be taken to assess what compliance measures may be required and/or whether the business could operate differently so that the relevant activities fall outside the scope of the legislation. Alternatively, if necessary, you can be well prepared to challenge any classification by Ofcom with which you disagree.

The Online Safety Act is estimated to affect around 25,000 entities in the UK alone, from big tech companies to small independent platforms, so a close reading of the provisions to determine whether your organisation may be in scope will be important at an early stage.

Next steps

Whilst a large number of organisations are likely to be covered by the Online Safety Act, Ofcom will likely prioritise the big tech platforms in its efforts to ensure compliance. Ofcom will focus on companies that it believes are not attempting to comply, but it is important to note that the duties of care will not apply until the codes of practice come into force.

Public consultations are ongoing and will continue in the coming years, so the final requirements applicable to companies in scope remain uncertain. The outcome of the UK’s 2024 General Election could also affect the direction of travel, as the Labour Party has said that it plans to review parts of the Act. As such, it is crucial for companies to stay informed, contribute to consultations relevant to their services, and be prepared to adapt to any changes.

Ofcom is seeking confidence that compliance with the Act is being considered at senior levels of an organisation. The more an organisation communicates to Ofcom about its governance, the more likely Ofcom is to be satisfied with the organisation’s response and approach. It is worth organisations planning ahead and preparing thoroughly for compliance with the Act. To help with this process, it is recommended that companies within the Act’s scope proactively conduct risk assessments and demonstrate either that they have implemented the measures in the codes of practice or that they have prepared their own compliance steps.

Written by Emma Drake and Rory Coutts

*Information is accurate up to 1 July 2024
