The UK’s online privacy revolution for 2021 – why the Age-Appropriate Design Code is not just for child-focussed sites, and needs attention now

Written By

Ruth Boardman

Partner
UK

I am based in London and co-head Bird & Bird's International Privacy and Data Protection Group. I enjoy providing practical advice and solutions to complex legal issues.

Emma Drake

Partner
UK

I am a partner working on data and online safety compliance from our London office. I work with a wide variety of organisations, particularly in the media, sports and life sciences sectors. I also advise extensively on children's and employee privacy matters.

On 2 September 2020, the ICO’s Age Appropriate Design Code, also referred to as the Children’s Code (the “Code”), came into force. This set the clock ticking on a 12-month transition period for organisations offering information society services in or to the UK. The Code has far-reaching implications for operators of affected services, going far beyond the GDPR’s parental consent rules. This article examines which services will need to adapt to the Code’s 15 design standards, and explains why prompt engagement is needed.

A statutory code – effectively mandatory for those offering affected services

The ICO was required to prepare the Code under s.123 of the Data Protection Act 2018 (the “Act”). This section tells us that the Code must apply to “information society services which are likely to be accessed by children.” If a service provider’s offering is caught by the Code, it will be obliged to comply with the Code’s requirements or risk being found wanting under data protection or e-Privacy law. Although the Code is not itself law, the ICO is required – under s.127 of the Act – to take the Code into account in assessing compliance with the Act, the GDPR and the Privacy and Electronic Communications (EC Directive) Regulations 2003. Section 127 also requires courts to take the Code into consideration if it is relevant to any question being considered in proceedings.

A code for any online services children are likely to use – not just child-centric sites and services

As the Code itself explains, its scope goes beyond sites and services designed with a child audience in mind and extends to any service or site where a child is likely to venture. The Code tells us that the ICO will apply a “more probable than not” threshold in determining application of the rules.

The ICO expects organisations to take a “risk-based” approach to evaluating and verifying the age of their users. It suggests organisations should consider “market research, current evidence on user behaviour, the user base of similar or existing services and service types and testing of access restriction measures” as potential methods for this assessment. Organisations believing that their online services fall outside the Code are expected to document their decision, referring to the methods they used to reach that conclusion and any measures implemented (if needed) to exclude children from their services.

Consider older teens too – the Code applies up to 18 years of age

A child is defined as a person under 18 years of age. A service which is likely to be accessed by 16- or 17-year-olds will therefore need to comply with the Code. This will draw a large array of general audience sites and services into the Code’s purview, from mainstream news sites and social media to sports bodies, search engines and e-commerce sites.

Across a number of the Code’s design standards, the ICO has divided and differentiated its expectations by age:

  • 0-5: pre-literate and early literacy;
  • 6-9: core primary school years;
  • 10-12: transition years;
  • 13-15: early teens; and
  • 16-17: approaching adulthood.

Approaches to privacy notices, the level of parental input that should be encouraged and acceptable methods of offering children access to their rights are all areas where the ICO suggests different approaches for different age groups. Whilst the Code emphasises that operators do not need to design their services for age groups that do not access them, or to adopt these precise age bands in their methodology, the onus will again be on providers to justify their approach and document their decision-making.
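By way of illustration only, the sketch below shows one way an operator might map a known or estimated user age onto the Code’s age bands in order to select an age-appropriate privacy notice. The band labels, the ageBand function and the selectNoticeVariant helper are hypothetical assumptions made for the purposes of the example; they are not prescribed by the Code.

```typescript
// Hypothetical sketch: mapping a known or estimated age to the Code's age bands
// so that age-appropriate notices and settings can be selected.
type AgeBand =
  | "0-5"   // pre-literate and early literacy
  | "6-9"   // core primary school years
  | "10-12" // transition years
  | "13-15" // early teens
  | "16-17" // approaching adulthood
  | "adult";

function ageBand(age: number): AgeBand {
  if (age <= 5) return "0-5";
  if (age <= 9) return "6-9";
  if (age <= 12) return "10-12";
  if (age <= 15) return "13-15";
  if (age <= 17) return "16-17";
  return "adult";
}

// Example use: choose which version of a privacy notice to display.
// The variant names are placeholders.
function selectNoticeVariant(age: number): string {
  switch (ageBand(age)) {
    case "0-5":
    case "6-9":
      return "notice-pictorial";   // simple, visual explanation
    case "10-12":
    case "13-15":
      return "notice-plain-teen";  // short, plain-language version
    case "16-17":
      return "notice-near-adult";  // fuller notice, still written accessibly
    default:
      return "notice-standard";    // standard adult-facing notice
  }
}
```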

Few sites and online activities avoid being “information society services”

The definition of “information society service” comes under renewed scrutiny as operators assess whether the Code applies to their activities. The definition is drawn from the Technical Standards and Regulations Directive (EU) 2015/1535 (the “Directive”). It encompasses “any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services”. As the Code acknowledges, it is broad in application and applies to the “vast majority” of online services used by children.

For remuneration – indirect funding is sufficient for the Code

Importantly, this is not limited to paid-for services, and covers services that are free to the user but funded through other activities such as advertising. Widespread use of online advertising has the potential to draw a large number of “free” online services into the Code’s scope. Not-for-profit activities are also covered by the Code, as long as the services reflect those which are typically provided on a commercial basis – this might affect charities, sports governing bodies and religious organisations, where they offer online resources which children are likely to access. However, non-commercial services offered by public authorities, such as education resources offered by state schools, will be out of scope for those bodies. To the extent that private sector bodies substantially assist in this provision, the Code may apply to their involvement.

At a distance, by electronic means – truly offline activities are outside scope, but beware blended online services

“Real-world” services, requiring in-person appointments, are not caught by the Code’s rules, as they are not offered “at a distance”. Separating the privacy compliance approaches for these offline and online services may prove difficult to manage in practice. Although the ICO says that this exemption extends to websites that provide online information about these services, operators should take care: in 2017, in C-339/15 Vanderborght, the CJEU found that a website advertising dental services was an information society service, and so even websites that promote wholly offline services are still likely to be caught by the Code. Sites offering news or entertainment content will also be covered.

At individual request – a matter of clicking, not curation

This does not mean that services must be personalised – a general audience news site will still be caught by the definition. This is because the requirement under the Directive is that the service be provided by the “transmission of data on individual request” – this can include providing a website at the click of a button, but does not include simultaneous broadcast of data to an unlimited number of recipients. Scheduled broadcast services, such as television or radio broadcasts, are – therefore – excluded, even where they are provided online. On-demand services, however, are caught by the definition, as they are delivered at the individual request of the user. According to Recital 18 of Directive 2000/31/EC, this also includes provision of commercial communications by electronic mail, although this is not discussed in the Code itself and not all email marketing seems likely to meet this test.

A wide array of online services and websites are included

In summarising the application of the definition of information society service, the Code gives the following examples:

  • apps;
  • social media platforms and online messaging services;
  • content streaming services (such as video, music or gaming services);
  • online games;
  • news and educational sites;
  • websites offering any goods or services over the internet and online marketplaces; and
  • electronic services offering support or control to connected toys or other connected devices.

Although they may be information society services, “preventive or counselling services” are excluded from the Code under s.123(7) of the Act.

Non-UK services that target services to, or monitor the behaviour of, children in the UK also have to comply

The Code will apply to any processing that is covered by the Act. In practice, this means that it will apply to any online services offered in the context of the activities of a UK establishment, or to organisations outside the UK whose services are still caught by the UK GDPR through the application of Article 3(2) – namely, because it is apparent that the services are intended to be offered to children in the UK, or involve the monitoring of children’s behaviour in the UK. As a result of Brexit, this will include organisations established in the EEA. Again, the Code applies if these services are likely to be accessed by children – they need not be directly targeted at children.

The European Data Protection Board has produced detailed guidance on the factors to take into consideration when determining whether processing takes place in the context of the activities of an establishment, or whether the extra-territorial scope of Article 3(2) applies. Our article on this guidance can be found here.

Do not delay – affected service providers face substantial practical hurdles to timely compliance

The Code says that operators of affected sites and services must design and adapt their services “with the best interests of the child” as a “primary consideration”. Requirements include:

  • checking age, where necessary to determine the right protections to offer (where age cannot be established with sufficient certainty, the standards should be applied to all users);
  • ensuring that services are appropriate for children’s developmental needs, including effective and enforced safeguards from inappropriate content and excessive screen time;
  • offering choice over most processing which isn’t necessary for the core service, and avoiding nudges to lower-privacy options;
  • alerting users if geo-location, tracking or parental monitoring are turned on;
  • having privacy settings which default to a high level of privacy and allow different users of the same device to make different choices (see the illustrative sketch after this list); and
  • adopting privacy notices and media which are written and/or designed so that children can understand them, and which are offered at the right time.
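For illustration only, the minimal sketch below shows what “high privacy by default” might look like in practice for a user who is, or may be, a child, together with a placeholder signal when geolocation or parental monitoring is switched on. The field names, defaults and the showActiveIndicator hook are assumptions made for the example; the Code does not prescribe any particular implementation.

```typescript
// Hypothetical sketch of "high privacy by default" settings for a child user.
// Field names and defaults are illustrative assumptions, not taken from the Code.
interface PrivacySettings {
  geolocationEnabled: boolean;        // off unless the user actively turns it on
  personalisedAdvertising: boolean;   // off by default: not necessary for the core service
  profileVisibility: "private" | "friends" | "public";
  dataSharingWithThirdParties: boolean;
  parentalMonitoringEnabled: boolean;
}

// Defaults applied to any user who is, or may be, a child.
const childDefaults: PrivacySettings = {
  geolocationEnabled: false,
  personalisedAdvertising: false,
  profileVisibility: "private",
  dataSharingWithThirdParties: false,
  parentalMonitoringEnabled: false,
};

// The Code expects children to see an obvious sign when location tracking or
// parental monitoring is active; showActiveIndicator is a placeholder for
// whatever in-product signal the service uses.
function applySettings(
  settings: PrivacySettings,
  showActiveIndicator: (feature: string) => void
): void {
  if (settings.geolocationEnabled) showActiveIndicator("geolocation");
  if (settings.parentalMonitoringEnabled) showActiveIndicator("parental monitoring");
}
```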

Whilst there are some clear legal tasks, such as the completion of data protection impact assessments, operators of affected sites and services cannot hope to comply through legal work alone.

Online service providers should take action now to determine whether the Code applies, and to assess the impact of the Code’s obligations on their services. As many will have learned in applying privacy by design and default requirements during GDPR projects, this requires buy-in from multiple non-legal stakeholders and can suffer from significant design, test and implementation delays. Organisations have until 2 September 2021 to comply with the Code. With under four months left, many will already find that they are locked out of the technology budgets and development cycles that would allow this deadline to be met.

Click here to see our flyer.
