Snap My AI – ICO decision has lessons for everyone undertaking DPIAs

On 19th June 2024, ICO published its decision on Snap’s My AI [ICO Decision on Snap Inc. of 21 May 2024 (accessible here)]. This followed a Preliminary Enforcement Notice (PEN) issued by ICO on 6th October 2023. The PEN contained provisional conclusions that Snap had not met Art.35 UK GDPR (carrying out a DPIA) or Art.36 UK GDPR (consultation with a supervisory authority in the event a DPIA shows unmitigated high risks). The final decision concluded that Snap met the requirements of Art.35 – and that Snap had not breached Art.36. The decision contains detailed commentary on ICO’s expectations as to the level of detail to be included in DPIAs in general and observations on particular areas of concern when engaging with gen-AI and children. As almost all organisations carry out DPIAs – and as many are doing so in relation to deployment of AI – the decision has wide relevance. The underlying UK law on these provisions is still near-identical to that in the EU and so these insights are also likely to be of relevance beyond the UK.

Facts

My AI is a chatbot that integrates OpenAI’s GPT. My AI collects user-submitted biographical data; the user’s age group (defaulting to the lower of the declared or inferred age); coarse location data; metadata; interaction data; keywords (extracted from queries to serve ads); and memories of interactions.

Snap trialled a pre-release to Snapchat+ users on 27th February 2023, with a full release to all users on 19th April 2023. Snap first engaged with ICO on My AI in March/April 2023, with the first substantive interaction taking place after the full release of My AI. ICO asked Snap to explain how My AI corresponded to key topics included in a blog post by ICO’s Executive Director of Regulatory Risk and also required Snap to share its DPIA.

ICO issued its PEN to Snap on 6th October 2023. Snap submitted representations to ICO on 3rd November 2023, with further submissions on 11th December 2023, followed by (somewhat unusually) an oral hearing on 14th December 2023. Snap also submitted a revised (5th) DPIA on 22nd November 2023. Most of the decision analyses this 5th DPIA.

Overall conclusions

ICO concluded that, unlike the earlier DPIAs shared with it, the 5th DPIA met the requirements of Art.35. In particular, the 5th DPIA:

  • Contained a significantly more detailed breakdown of the processing operations
  • Considered the extent to which gen-AI differed from previous technology used by Snap, relevant to the assessment of whether the processing is necessary and proportionate – including an assessment of the impact of the technology on the volume of special category data processed
  • Contained a more detailed risk assessment – including the risks posed to 13–17-year-olds
  • Clearly identified mitigations, with explanations as to how and to what extent these would be effective.

The original DPIAs shared with ICO contained a statement that the processing posed an unmitigated high risk to children (leading to the provisional finding that Snap had breached Art.36 by not consulting with ICO). However, ICO accepted Snap’s evidence that this was an error in the DPIA and that Snap had not considered this to be the case.

The decision does not attempt to consider how My AI complies with other aspects of data protection legislation.

Detailed requirements for DPIAs

The EDPB guidance on “Criteria for an Acceptable DPIA” runs to a succinct single page [EDPB Guidelines on DPIA (wp248rev.01), Annex 2 – Criteria for an acceptable DPIA (accessible here)], but the corresponding ICO guidance is more extensive [ICO, DPIA Guidance, How do we do a DPIA? | ICO]. The decision assesses Snap’s DPIA against this guidance, calling out areas where ICO originally felt Snap fell short, but where it concludes that the revised DPIA meets ICO’s guidelines. The areas highlighted by ICO are points with which – in our experience – many organisations struggle.

Systematic description of processing:

  • Categories of data: The DPIA must systematically describe what categories of personal data are used for each purpose of processing. The revised DPIA contained a step-by-step breakdown of this, listing the categories of personal data processed.
  • Access rights: ICO called out that explaining who has access to what personal data is an essential element of the description of processing, and so has to be adequately addressed. In the revised DPIA, Snap explained what personal data was shared with each of OpenAI and Microsoft (its ad partner) and how personal data was shared internally (with an explanation of how it determined and managed access controls).
  • Retention periods: again, ICO labelled these essential, noting that a controller cannot assess risks in a DPIA without adequate consideration of retention periods. The revised DPIA contained retention periods for each category of data and the retention periods set out in contracts with processors.
  • Volume of data processed and geographic scope: references to unspecified numbers of users or data subjects would not be sufficient; the revised DPIA contained detailed statistics on the numbers of average daily and monthly users in the UK, the EU and globally, and noted that personal data of non-users would also be processed (e.g. in content submitted). The DPIA also considered the numbers of 13–17-year-olds whose personal data would be processed.
  • Context: ICO expected the DPIA to reflect on wider public concerns over the use of gen-AI and pointed to news articles and other resources illustrating the types of concerns that should be considered. Interestingly, ICO highlighted the need to consider whether the use of gen-AI (in contrast to traditional search and chatbot tools) would increase the amount of special category data processed – by encouraging users to share more of it – although Snap concluded in the DPIA that users were in fact more likely to be cautious about sharing sensitive data with gen-AI.

Necessity and proportionality of the processing operations in relation to the purposes:

  • What is the impact of this specific technology: ICO considered that Snap had not sufficiently assessed the impact of the shift from traditional search functionality and online query technology to gen-AI. As noted above, ICO was particularly interested in whether this would alter the nature of the personal data processed.
  • Lawfulness: Snap referenced consent, contractual necessity and legitimate interests. On special category data, it referenced explicit consent, the substantial public interest condition (with a basis in domestic law) and the freedom of expression derogations. ICO accepted that the points had been addressed but did not undertake a substantive assessment of lawful basis.

Risk assessment:

  • Longer is better: Snap’s DPIAs followed the traditional format of noting the likelihood that a risk would occur and the severity of the risk when determining the overall risk score. For example, in relation to the risk that responses provided would be biased, inappropriate or potentially misleading, the earlier DPIAs noted likelihood as probable, severity as significant and the overall risk score as high. ICO disliked this cursory, high-level analysis. The revised DPIA contained more detailed risk assessments, setting out the reasoning behind the scores.
  • Include evidence of alternative measures considered: the revised DPIA contained details of mitigating measures which were considered but not adopted; ICO concluded that this was important to show effective assessment of risks and how they were mitigated.
  • What actual risks did ICO highlight? ICO considered that the earlier DPIAs should have covered: the risks of targeting 13–17-year-olds for advertising (addressed in the revised DPIA on the basis that Snap only serves contextual, not targeted, ads via My AI); the processing of special category data on a large scale; and the risk that users – especially 13–17-year-olds – would be less likely to understand the manner and purpose of processing involving gen-AI and therefore may not make fully informed decisions about using the technology.
  • What additional risks did Snap highlight? Risks from responses being biased etc. (see above); increased risks from a security breach if more special category data is processed; unauthorised use of content; the risk of 13–17-year-olds becoming isolated or lonely as a result of excessive reliance on gen-AI for emotional support; the risk of users thinking gen-AI was being used when it wasn’t (as My AI cannot be removed from the top of the chat feed); risks to safety from using even coarse location data; and the risk that third parties who had not chosen to have their personal data processed would have it processed anyway.

Risk mitigation:

  • Ensure the mitigating step is relevant to the risk identified: the earlier DPIAs contained some mitigating steps that were inaccurate or not relevant to the risks identified. This sounds obvious, but can be hard to avoid if you use templates or libraries of suggested steps when completing DPIAs.
  • Child users can increase the risks – you need child-focused measures: the DPIAs identified that risks would be compounded when 13–17-year-olds used My AI, but the original DPIAs did not explain mitigating measures specific to this risk. The revised DPIA addressed this. (For interest, the mitigating measures included age-appropriate content filters; just-in-time notices tailored for this age group; allowing parents to control and surveil their teens’ use of My AI; and mitigations against My AI being used for homework and to write essays.)
  • Don’t cross-refer to external documents without explaining their content or effect: the earlier DPIAs contained references to other policies which were stated to eliminate certain risks completely. The revised DPIA explained the specific steps and measures that would be effective in doing so.
  • Lastly, in the course of rolling out My AI, Snap had removed a just-in-time notice warning users that the technology was novel and that they should not provide confidential or sensitive information. The revised DPIA noted that this had been reinstated.

Thoughts on DPIAs

Some of the points highlighted by ICO seem obvious. But it is difficult to avoid errors, unexplained cross-references and one-word entries if DPIAs are completed at scale across an organisation.

Some organisations use DPIAs as their way of assessing all new processing initiatives – or, if not all, then also initiatives whose level of risk falls below the threshold at which a DPIA is required. One lesson would be to differentiate clearly between such “entry level” privacy impact assessments and full DPIAs, so as to avoid adopting a single process which struggles to meet the more demanding DPIA requirements because it is being deployed on a wider basis.

The incorrect, unmitigated high residual risk designation in the earlier DPIAs is also something that could be avoided – e.g. by using an automated process to ensure that any DPIA recording such a designation is checked, mitigating the risk of documentation error.
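
By way of illustration only, the sketch below shows one way such an automated check could work. It assumes a hypothetical DPIA register exported as a CSV, with illustrative field names (“residual_risk”, “consulted_ico”, “title”); nothing in the ICO decision prescribes this approach or these fields.

    # Illustrative sketch only: flag DPIA register entries that record a high
    # residual risk without a corresponding ICO consultation, so they can be
    # double-checked for documentation errors or escalated under Art.36.
    # The register format and field names are assumptions, not a real schema.
    import csv

    def flag_high_residual_risk(register_path: str) -> list[dict]:
        """Return register rows claiming a high residual risk but no ICO consultation."""
        flagged = []
        with open(register_path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                residual = (row.get("residual_risk") or "").strip().lower()
                consulted = (row.get("consulted_ico") or "").strip().lower() in {"yes", "y", "true"}
                if residual == "high" and not consulted:
                    flagged.append(row)
        return flagged

    if __name__ == "__main__":
        for entry in flag_high_residual_risk("dpia_register.csv"):
            print(f"Check DPIA '{entry.get('title', 'unknown')}': high residual risk recorded "
                  "without ICO consultation - confirm this is not a documentation error.")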

Lastly, ICO’s preference for DPIAs to spell out the logic and analysis behind risk assessments; to spell out steps from other policy documents rather than cross-referring without explanation; and to contain analysis of public areas of concern (e.g. as evidenced by media coverage) feels like an area where both the quality of compliance and the cost of compliance could be significantly improved… by use of LLMs.
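
To make that concrete, here is a purely illustrative sketch (our own suggestion, not anything drawn from the ICO decision) of how an LLM might be used to draft the fuller reasoning behind a one-line risk entry. The model name, prompt wording and use of the openai Python client are assumptions, and any output would of course need human review before it went anywhere near a filed DPIA.

    # Illustrative sketch only: ask an LLM to expand a terse DPIA risk entry into
    # a draft of the reasoning behind the likelihood/severity scores, for human review.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    risk_entry = (
        "Risk: responses may be biased, inappropriate or misleading. "
        "Likelihood: probable. Severity: significant. Overall risk: high."
    )

    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model choice
        messages=[
            {
                "role": "system",
                "content": (
                    "You are drafting the risk assessment section of a UK GDPR DPIA. "
                    "Explain why the stated likelihood and severity were chosen, and "
                    "list mitigations considered, including rejected alternatives."
                ),
            },
            {"role": "user", "content": risk_entry},
        ],
    )

    print(response.choices[0].message.content)  # a draft for a human reviewer, not a final DPIA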

Wider enforcement lessons

The decision also offers a couple of useful insights for others engaging with ICO in relation to investigations and enforcement.

Firstly, the decision makes clear that, as ICO has now concluded that Snap meets the requirements for DPIAs, there are no grounds for ICO to issue an enforcement notice in this regard [36]. If an organisation meets the requirements of data protection law, there is nothing further it can be compelled to do, so an enforcement notice would be pointless and cannot be issued. Of course, the position for retrospective sanctions (such as fines or reprimands) is different.

Secondly, the decision outlines how ICO gathered information in this matter. ICO initially requested information from Snap on an informal basis. Snap provided certain information in this way, but declined to provide a requested DPIA. ICO then issued an information notice requiring production of all DPIAs for My AI; Snap provided these, but with redactions of business-sensitive material (including on grounds of security), before eventually providing full versions. ICO often engages with companies informally, and it is often more helpful for organisations to engage on this basis; once an information notice is served, there is an obligation to provide the requested information, with very little scope to withhold it.
