ICO issues new guidelines on the processing of children's data

Written By

Ruth Boardman

Partner
UK

I am based in London and co-head Bird & Bird's International Privacy and Data Protection Group. I enjoy providing practical advice and solutions to complex legal issues.

The ICO has released its draft code of practice on age-appropriate design. It imposes strict rules on information society service providers whose services are likely to be accessed by children. As drafted, compliance with the code may require substantial design and operational changes.

The code (available here) is open for public consultation until 31 May 2019.

 

On Wednesday 8 May, Bird & Bird will be hosting a round-table discussion on the code and will submit the output of those discussions to the ICO. Given the potential impact of the code, we would encourage you to attend; you can register here.

 

To whom does it apply?

 

The code applies to information society services, provided for remuneration, that are likely to be accessed by children. The code defines children as individuals under the age of 18. It is irrelevant whether the service is aimed at children; it is enough that children are likely to access it. Given this definition of a child, a large number of online services (not just those targeting children) will be affected by the code. Providers who do not consider their services to be in scope will either need robust age-gating measures (see below) or evidence (such as site design and user surveys) demonstrating that the service is not being used by under-18s.

 

Age verification

The code states that simply asking someone to self-declare their age is not sufficient; verification of age is required. The code promotes the use of third-party age verification services. However, caution is needed when relying on such services, as the code also requires service providers to conduct appropriate due diligence on the third party's age verification process.
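
A minimal sketch of what this could look like in practice is set out below: a server-side call to a hypothetical third-party verification service, with a fail-closed fallback so that unverified users are treated as potentially being children. The endpoint, payload and response fields are illustrative assumptions, not a real API.

```typescript
// Minimal sketch of server-side age assurance via a hypothetical third-party
// verification provider. The endpoint, payload and response fields are
// illustrative assumptions, not a real API.
interface AgeCheckResult {
  verified: boolean;      // provider was able to corroborate the claimed age
  estimatedAge?: number;  // optional age estimate returned by the provider
}

async function checkUserAge(claimedAge: number, userToken: string): Promise<AgeCheckResult> {
  // Self-declaration alone is not enough under the draft code, so the claimed
  // age is passed to an external verification service for corroboration.
  const response = await fetch("https://age-verifier.example.com/v1/verify", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ claimedAge, token: userToken }),
  });

  if (!response.ok) {
    // Fail closed: if verification cannot be completed, treat the user as
    // unverified (and therefore potentially a child).
    return { verified: false };
  }
  return (await response.json()) as AgeCheckResult;
}
```

Whatever mechanism is chosen, the due diligence requirement means providers should also be able to document how the verification service works and why it is considered reliable.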

 

Territorial scope

In line with the DPA 2018, the code applies to all service providers that are:

  • based in the UK;

  • based outside the UK but with an establishment in the UK (such as an office or branch); or

  • based outside the EEA and offering their services to, or monitoring, users based in the UK.

Remuneration

Most online services, such as ad-supported websites, applications and social media platforms, fall within the scope of the code, as they are regarded as services normally provided for remuneration, even where users do not pay directly.

 

What are the principles?

The code sets out 16 interdependent principles that service providers must consider. The key themes are summarised below.

 

Privacy by default

Building on the GDPR principle of privacy by design, the code requires that privacy settings are set to 'high' by default. Additionally, existing privacy settings must be reset to high within a transition period that has yet to be specified.

 

The use of nudging techniques to encourage children to make poor privacy decisions (such as making the consent button big and colourful while hiding the 'do not consent' button) is not permitted. The code goes one step further and encourages pro-privacy nudges that guide children to make informed decisions and/or consult their parents when presented with difficult privacy choices.

 

Profiling

Profiling must be deactivated by default and only used if appropriate safeguards are in place (such safeguards could be identified as part of the DPIA).

 

If profiling is used to tailor content, the service provider is responsible for ensuring that any recommendations made are appropriate for children.

 

Geolocation

The collection of geolocation information should be set to ‘off’ by default and the granularity of such location data should be carefully considered. Unless necessary, the collection of precise geolocation data should be avoided.

 

For services that enable the sharing of location data with other users, it must always be clear to users when location sharing is switched on, and such sharing must be switched off after each session by default.

 

Sharing data

Personal data should not be disclosed to third parties unless there is a compelling reason to do so (such as safeguarding a child's welfare). In this regard, the sharing of children's data should also be switched off by default.
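
Taken together, the default-setting requirements above (privacy settings set to high, profiling off, geolocation off, location sharing reset after each session, and third-party sharing off) could be expressed as defaults in a provider's settings model. The sketch below is a minimal illustration; the property names and structure are assumptions for the purposes of the example, not anything prescribed by the code.

```typescript
// Illustrative defaults reflecting the draft code's "high/off by default"
// requirements. The property names are hypothetical, not taken from the code.
interface ChildPrivacySettings {
  privacyLevel: "high" | "standard";
  profilingEnabled: boolean;         // behavioural profiling / content tailoring
  geolocationEnabled: boolean;       // collection of location data
  locationSharingEnabled: boolean;   // visibility of location to other users
  thirdPartySharingEnabled: boolean; // disclosure of data to third parties
}

// Defaults applied to any account likely to belong to an under-18 user.
const childDefaults: ChildPrivacySettings = {
  privacyLevel: "high",
  profilingEnabled: false,
  geolocationEnabled: false,
  locationSharingEnabled: false,
  thirdPartySharingEnabled: false,
};

// The code also expects location sharing to revert to "off" at the end of
// each session, which could be enforced when a session closes.
function endSession(settings: ChildPrivacySettings): ChildPrivacySettings {
  return { ...settings, locationSharingEnabled: false };
}
```

Modelling the defaults this way means any departure from them becomes an explicit choice by the user (or their parent), rather than something switched on silently.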

 

Data minimisation

In line with the GDPR, the code reinforces the need for service providers to minimise the data collected as part of their services. In particular, the code promotes avoiding the collection of real-world identifiers (such as name, email address or phone number) and encourages the use of avatars and pseudonyms instead.

 

Additionally, the code states that children's data should only be collected while the child is actively and knowingly using the service.

 

Consider children's interests

Service providers must consider and support children's rights while balancing these against the role parents play in supervising their children. Personal data should not be used in a way that is detrimental to children's physical or mental health and wellbeing. As such, service providers should follow relevant recommendations from the Government and industry bodies (such as CAP).

 

The code forbids the use of strategies designed to extend a user's engagement with the service. Additionally, service providers should allow and encourage children to take breaks by introducing features, such as pause buttons, that allow breaks to be taken without losing any progress made.

 

For services with parental controls, children must be made aware of when parental controls are in place and what they cover (especially where those controls allow monitoring). This is of particular importance as children mature, so as to ensure that their fundamental right to privacy is preserved.

 

Transparency

Reiterating one of the key principles of the GDPR, the code requires that clear information notices are provided to both children and parents. It also encourages the use of just-in-time notices, combined with prompts for children to speak with their parents when presented with difficult privacy choices. The code also promotes child-friendly ways of delivering information notices, such as cartoons, videos, diagrams and graphics.

 

In relation to connected toys and devices, the passive collection of personal data should be avoided and such devices must include clear indicators (such as lights or audio cues) to notify users when data is being collected.

 

Governance and accountability

Service providers should build internal accountability programmes and implement policies that support and demonstrate compliance with the code and other relevant legislation. Additionally, staff (particularly those involved in the design of the service) should receive training to ensure that they are aware of their responsibilities.

 

To the extent that users are provided with online tools that facilitate the exercise of their data-related rights (e.g. access, rectification), such tools must be easy to find, clear to use, age appropriate and, where possible, specific to the rights they support. Furthermore, the code encourages the use of mechanisms or tools to track the progress of, and communications relating to, a user's rights requests.

 

Data Protection Impact Assessments

The code makes it clear that providing online services likely to be accessed by children requires a DPIA (as a reminder, Art 35 GDPR provides that national supervisory authorities can publish lists of activities that require a DPIA). For more information on DPIAs, see our previous article Data Protection Impact Assessments: Final Guidance Issued.

 

Also in the news

The CNIL has recently released details of its 2019 enforcement programme. Rather than focusing on specific processing activities, the 2019 programme will investigate major cross-sector themes, such as the allocation of responsibilities between data controllers and processors, and the processing of children's data (photos, biometric data and CCTV in schools, and parental consent for children under 15).

 

Further details on the CNIL 2019 enforcement strategy and 2018 activity report can be found here.
