Online Safety Bill under the microscope

Written By

Mihnea Dumitrascu

Associate
UK

As an associate in our London-based international Privacy & Data Protection practice, I advise UK and international clients across a variety of sectors on a wide range of international data and privacy issues. This includes core regulatory frameworks such as the General Data Protection Regulation (GDPR) and the ePrivacy Directive, as well as emerging EU data laws.

James Moss

Partner
UK

I am a partner in Bird & Bird's London-based international Privacy & Data Protection practice. My background with the UK Information Commissioner's Office combined with experience as a regulatory law specialist in private practice gives me unrivalled insight into contentious data protection work and enforcement action.

The Online Safety Bill is a new piece of legislation that is currently at its third reading stage in the House of Lords, with Royal Assent expected around November 2023. It seeks to regulate online providers of user-to-user content and search, as well as providers of pornography. It will impose a duty of care on them to conduct risk assessments and take proportionate measures to deal with risks.

The duty of care will require online providers to address various types of risks, including harmful content, cyberbullying, and online grooming. Providers will need to take steps to remove harmful content, prevent its re-upload, and ensure that users are protected from it. They will also need to ensure that age verification and age assurance measures are in place to prevent children from accessing inappropriate content.

Here are some key aspects of the bill that were discussed during a recent talk as part of Bird & Bird’s Annual Data Protection Update:

1. ARE YOU IN SCOPE?

The bill will have different tiers of regulation for user-to-user platforms, with only one tier for search services. It will impose additional duties on very large user-to-user services, known as Category 1 services. Ofcom, working alongside the Government, will determine the threshold for Category 1. Research will be required to establish where the threshold should lie, which will probably be based on the scale of the business, its functionality, and a third criterion that is not yet known.

The rapidly evolving landscape of the digital world has created a need to be able to reclassify businesses. Platforms that did not even exist five years ago now have a massive user base and therefore a greater responsibility for online safety. To address this, a new subcategory will be created for providers that are on the brink of becoming Category 1.

2. IS THE BILL LIKELY TO CHANGE?

While the basic structure of the Online Safety Bill is unlikely to change, there will be amendments and clarifications to ensure its effectiveness. Definitions of ‘primary priority content that is harmful to children’, ‘priority content that is harmful to children’, ‘age assurance’ and ‘age verification’ have already been added at the report stage in the House of Lords, with Ofcom tasked with determining what constitutes adequate measures.

The House of Lords also introduced a chapter dealing with deceased children whose death has a social media connection, clarifying the position of the families of such children, with Ofcom tasked with producing guidance to help providers comply with their duties in this area.

3. HOW TO DO RISK ASSESSMENTS?

The Online Safety Bill will impact a vast number of entities, from big tech companies to small independent platforms, with an estimated 25,000 entities in scope in the UK alone. However, Ofcom is likely to prioritise the big tech platforms in its compliance efforts. Companies within the bill's scope are therefore advised to proactively conduct risk assessments and be able to demonstrate that they have considered potential risks. Ofcom will focus on companies that it believes are not attempting to comply, but it is important to note that the duties of care will not apply until codes of practice are published. Public consultations will take place in the coming months, and the direction Ofcom will take remains uncertain. As such, it is crucial for companies to stay informed and be prepared to adapt to any changes.

Regarding the practice of risk assessments, Ofcom is looking for confidence that the process is being considered at senior levels of an organisation. The more an organisation communicates to Ofcom about its governance, the more likely Ofcom is to be satisfied with the organisation's response and approach. It is worth organisations planning ahead and conducting thorough assessments.

Ofcom recently announced that it plans to release the initial draft codes of practice soon after it acquires its powers, as opposed to the original plan of doing so within 100 days. Additionally, it anticipates publishing draft guidance on age verification in autumn 2023, followed by further codes in the coming months.

4. POLICING CONTENT ON PLATFORMS: LEGAL BUT HARMFUL?

The concept of ‘legal but harmful’ content for adults has been somewhat watered down from previous proposals. The bill now gives users the ability to choose whether they want to see harmful content (e.g., via a tick box), which in practice is not very different from previous proposals for platforms: they will still have to find a way to deal with this type of content.

5. POLICING CONTENT ON PLATFORMS: ILLEGAL CONTENT

It will be difficult in many cases for providers to identify content as illegal where illegality requires a mental element or the absence of a valid defence. Ofcom will have to do some work interpreting the relevant clause of the bill, which requires providers to make that judgment on the basis of ‘reasonably available’ information, and there is a good opportunity for businesses to influence the direction that Ofcom takes here.

6. CRIMINAL LIABILITY FOR DIRECTORS

The UK Government has stressed that the bill is intended to have extraterritorial effect, both in relation to penalty notices and individual criminal liability. Penalties and individual criminal liability could be imposed on companies that fail to comply with the bill's requirements. Criminal liability would only come into play if Ofcom asks a director or company to take action and they ignore the request or fail to respond positively. If this happens, Ofcom could issue a penalty notice, and the director could be held individually responsible for the failure to comply.

7. WHAT ABOUT END-TO-END ENCRYPTION?

A particular issue to note is the power Ofcom will have to issue notices requiring the use of ‘accredited technology’ to identify, take down and prevent users from encountering child sexual exploitation and abuse (CSEA) content, whether it is communicated publicly or privately by means of the relevant service. A new requirement was added at Report Stage for a skilled person's report before Ofcom can issue such a notice. This power would seem to encompass the use of technology that is incompatible with end-to-end encryption, though Ofcom may only issue such notices if it considers it ‘necessary and proportionate’ to do so. As will be the case elsewhere in the legislation, the meaning of ‘proportionate’ will therefore be important in this context.

8. WHAT ABOUT INTERNAL USER-TO-USER FUNCTIONALITIES?

If an organisation makes use of a wholly internal user-to-user functionality, it will not be in scope of the bill. Organisations with user-to-user platforms that are mostly internal but include some external functionality may wish to switch off the external parts of the platform to remain out of scope.

The Online Safety Bill is a significant piece of legislation that will impact a wide range of online providers. Companies should take proactive steps to comply with the bill's regulations, including conducting risk assessments and staying informed about potential changes. There is still time for companies to influence the direction of the codes of practice that will need to be published by Ofcom.

If you have a question about the Online Safety Bill, please do get in touch with Jeremy Wright, James Moss or Mihnea Dumitrascu. You can watch some of the other recordings from Bird & Bird’s Annual Data Protection Update here.
