The Online Safety Act: Latest Guidance on Illegal Content

The Online Safety Act is now law, and it will affect far more than social media companies and search engines. An estimated 20,000 to 25,000 entities in the UK alone will be in scope. That is because if you operate online and enable people to interact through what you do, you are likely to be a ‘user-to-user’ service covered by the Act and of interest to Ofcom, the regulator in this space.

If you are in scope, the foundation of the new regulatory architecture is a duty to conduct risk assessments, particularly assessing the risks your service presents to children and the risk that users will encounter illegal content when they access it. Largely, of course, this will be illegal content posted or shown by other users of the service. Once you have assessed the risks, you are expected to put in place proportionate measures to mitigate them.

So, what does all that mean for your business? Ofcom are trying to provide some of the answers in the extensive guidance, draft codes of practice and other materials they are publishing for consultation. They have not yet done this in relation to potential harms to children online, but they have published various documents on illegal content – over 1,500 pages of them, in fact. Just in case you don’t have time to read them all, here are some points worth highlighting:

  1. Illegal content is content which ‘amounts to a relevant offence’, and the most serious and relevant of these offences are listed in the Act and the guidance.
  2. However, working out whether content is illegal may still not be as easy as you might think. Some criminal offences are committed automatically by the existence of the content itself (child sexual abuse material for example) but many offences require a mental element on the part of the alleged offender, and whether a defence can be made out is also a factor. How are you supposed to discover what was in the mind of the individual in question or whether they may be able to rely on a defence?
  3. The answer to that, according to the Act and expanded on by Ofcom in the materials it has published, is that you should ask yourself whether you have ‘reasonable grounds to infer’ that content is illegal. Specifically, do you have reasonable grounds to infer that all elements necessary for the commission of an offence, including mental elements, are present, and no reasonable grounds to infer that a defence may be successfully relied upon? This is not the standard you would apply as a member of a jury, deciding whether you are sure ‘beyond reasonable doubt’ that a defendant is guilty, and reaching these conclusions may be difficult in some cases. Ofcom’s guidance will not, we suspect, always help.
  4. It is also important to know how much detective work you have to do to identify illegal content, and again what we have so far will not answer every question. Judgments are to be made on the basis of ‘all relevant information that is reasonably available to the provider’, but does that mean only the material currently in hand, or also material that it is within the provider’s power to obtain? The Ofcom guidance doesn’t resolve this either. It does say that ‘services are not required to ask users posting and viewing content about their purposes, before making an illegal content judgment’, which will come as a relief. But on the question of how far services must investigate beyond information that is typically available to all services, or which requires significant resources to collect, Ofcom say services should have ‘reasonable regard to any other relevant information to which they have access’, which perhaps doesn’t help much unless you know what Ofcom consider to be ‘reasonable’.

Of course, it is difficult to be definitive about what should be done and about Ofcom’s assessment of it. Every case will have to be judged on its merits, and the variety of services in scope, the nature of their activities and the circumstances in which illegal content is found mean that the answer could be quite different in each instance. The concept of ‘proportionality’ is central and will mean different things in different contexts. It should also be recognised that Ofcom are finding their way through this new landscape too. The consultation period on the materials published to elucidate the new duties the Online Safety Act imposes with regard to illegal content is nonetheless an opportunity to reflect on what the Act will mean for your business, to seek further clarity, or to suggest that Ofcom adjust their intended approach where there is good reason to do so. If you believe you may be in scope and are considering how to respond or how to get that greater clarity, or just want to understand what the Act may mean for you, we are happy to help.

Written by: Rt Hon Sir Jeremy Wright KC MP
