CJEU confirms preparatory acts can be automated individual decisions: the SCHUFA cases

Written By

Dr. Nils Lölfing

Counsel
Germany

I am a counsel in our Technology & Communications Sector Group. I provide pragmatic and solution-driven advice to our clients on all issues around data and information technology law, with a strong focus on and experience with AI and machine learning projects.

Ruth Boardman

Partner
UK

I am based in London and co-head Bird & Bird's International Privacy and Data Protection Group. I enjoy providing practical advice and solutions to complex legal issues.

On 7 December 2023, the Court of Justice of the European Union issued its judgment in the SCHUFA case (case C-634/21). The CJEU ruled that a credit reference agency engages in automated individual decision-making when it creates credit repayment probability scores as a result of automated processing and lenders rely heavily on these scores (to establish, implement, or terminate contracts). This means the obligation to comply with Article 22 of the EU General Data Protection Regulation ("GDPR") falls on the credit reference agency rather than just on the lender.

Article 22 only allows automated individual decision-making in limited situations, e.g. contractual necessity, specific authorisations under European Union or member state law, or explicit consent. Where automated individual decision-making is allowed, protections must be in place, including ensuring that individuals can obtain human intervention, express their views, and challenge decisions made about them.

Regulators and other stakeholders have stated the decision will have ramifications for all automated decision-making services and for providers offering predictive artificial intelligence tools. In our view, this is an overstatement: it overlooks the fact that the decision is specific to situations where a provider's input is used in a way that could significantly affect individuals and where that input is heavily weighted. That said, the decision could have implications for organisations other than credit reference agencies if they provide critical decision-making support.

On the same day, the CJEU also issued a decision in joined cases C-26/22 and C-64/22, this time relating to the retention of insolvency data by credit reference agencies in line with a data protection authority ("DPA")-approved code of conduct. In this case, the CJEU considered the extent to which credit reference agencies could retain insolvency data once it had ceased to be available in a public register. The CJEU also commented on how the right to object to legitimate interests-based processing interacts with the right to erasure, and held that data subjects have a right to a full judicial review of decisions by DPAs. Courts are not restricted to a more limited, process-oriented review of whether a DPA has received a complaint, investigated it, and communicated the outcome to the complainant.

CASE 1: WHEN DOES A PREPARATORY ACT ITSELF BECOME AN AUTOMATED INDIVIDUAL DECISION?

The facts

SCHUFA provides credit information about individuals to lenders using a probability-based scoring system. An individual, OQ, was denied credit after SCHUFA supplied credit information about her. OQ made an access and erasure request to SCHUFA and, while it provided data, it refused to provide details on how it determined the score, citing trade secrets. OQ lodged a complaint with the competent DPA, the Hessian Commissioner for Data Protection and Freedom of Information, which rejected her claim. OQ appealed the decision, and the Administrative Court handling the appeal referred the case to the CJEU for a ruling on whether the SCHUFA scoring system constituted an automated individual decision under Article 22(1) GDPR.

Preparatory acts can be decisions under Article 22(1)

Article 22 of the GDPR provides that automated individual decision-making can only be used in certain situations. Outside these situations, such decisions are prohibited. Where automated individual decision-making is allowed, safeguards must be in place. The CJEU rejected SCHUFA's argument that it was only engaged in preparatory acts and that any decisions were taken by the lender. Instead, the CJEU held the company itself was engaging in automated individual decision-making.

The CJEU highlighted three conditions that must be met for Article 22 to apply: a decision must be made; it must be based solely on automated processing, including profiling; and it must produce legal effects concerning the individual or otherwise produce an effect that is equivalent or similarly significant in its impact on the individual. According to the CJEU, all of these conditions were met in this case.

The CJEU noted that the concept of a decision is broad and includes acts that may affect the individual in various ways, including the calculation of a credit score. It also noted the calculation of the creditworthiness score would have significant effects. The court stated this was clear from the question posed by the referring court, which said lenders would rely heavily on the score. On the referring court's factual findings, when a consumer submits a loan application to a bank, a low probability value results in the bank rejecting the loan request in nearly all instances.

The implication is that if a credit reference agency or other similar provider issues a score that is not relied on heavily by those taking the end decision – for example, because lenders attach significant weight to other factors – then the issuing of the score would not be covered by Article 22. In the authors' experience, credit reference agency contracts typically insist that lenders should not base a decision solely on the score and should consider other factors before reaching a decision. The CJEU's decision, therefore, rests on a questionable premise. This will cause considerable uncertainty and make the judgment difficult to apply. This is already apparent; for example, the Hamburg DPA and other commentators have stated that the decision can be applied more generally to AI-based decision-making systems. But this does not consider whether these "decisions" are just one (non-determinative) factor in the end decision taken by the party dealing with the data subject.

Risk of gaps in protection

The court underlined that if SCHUFA was not subject to Article 22, it would not be obliged to provide the individual with meaningful information about the logic involved in the decision. The CJEU noted the lender would likely not have this information and would be unable to provide it to the individual. This would leave a gap whereby the individual was not effectively protected. This risk explains the broad approach taken by the CJEU; there may be scope to differentiate situations that would not pose this risk.

Impact of Article 22 applying

The CJEU summarised that where Article 22 is applicable, it prohibits decisions from being taken on an automated basis unless an exception applies. The exceptions allow automated individual decision-making solely where:

  • The decision is necessary to enter into or perform a contract between the data subject and a controller.
  • It is authorised by EU or member state law (meeting certain requirements).
  • The data subject has given explicit consent.

In this case, the referring court had concluded that the only applicable derogation would be EU or member state law, so the CJEU did not consider this further. This aspect of the decision is also likely to generate questions: it is clear why consent would be problematic (as an individual who did not give consent would likely experience detriment, which would mean that consent was not freely given and so not valid). However, arguments could be made that the decision is necessary for the lender and data subject to conclude a contract.

The CJEU noted that – in line with Recital 71 – suitable measures to safeguard the rights and freedoms of individuals must always be taken. These measures include the right to human intervention, the right to express one's point of view, and the right to challenge the decision (although this is only mentioned in Article 22, where decisions are necessary for a contract or taken with the data subject's consent). The safeguards should also include use of appropriate mathematical or statistical methods, implementing technical and organisational measures to minimise the risk of error, and taking account of risks, including of discrimination.

Impact on German law

Section 31 of the German Federal Data Protection Act sets out rules for the use of credit scores (but not the creation of scores). The CJEU noted that it was for the referring court to determine whether these rules could amount to a member state law authorising the use of automated individual decision-making, i.e., an exemption from the prohibition under Article 22. When considering this, the CJEU noted that any processing of personal data to take an automated individual decision must also have a lawful basis under Article 6 of the GDPR and that it was not open to member states to introduce new conditions under Article 6. Where the lawful basis for processing is legitimate interests, the controller wishing to rely on this must conduct a balancing test to determine whether the interests or rights and freedoms of individuals outweigh the interests of the controller. The CJEU commented that member state law could not definitively prescribe the outcome of this balancing test.

CASE 2: RIGHTS TO OBJECT, CODES OF CONDUCT, RETENTION OF INSOLVENCY DATA

The facts

UF and AB were subject to insolvency proceedings in Germany. They obtained an early discharge of their debts. Under German law, the public register discontinued publication of information about the proceedings after six months. SCHUFA retained the data for three years, in line with a code of conduct for credit reference agencies approved by the competent DPA. UF and AB asked SCHUFA to delete the data; when it refused, citing the code of conduct, they complained to the DPA for Land Hessen – which, in turn, refused to order SCHUFA to delete the data. UF and AB brought proceedings against the DPA in court, which referred a number of questions to the CJEU.

For how long can SCHUFA retain insolvency data? What’s the role of a code of conduct?

German law provides that information is to be made public in the insolvency register for a period of six months. The CJEU noted that German law has determined that – after this period – the interests of individuals should take precedence over the interests of the public to have access to information about insolvencies. As German law had already determined that the interests of data subjects should prevail after six months, the interests of the private sector in having access to the information could not justify a longer retention period.

The proceedings also raised the question of whether it was appropriate for SCHUFA to retain a copy of the insolvency information during the initial six-month period (on the basis the information could be accessed from the insolvency register during this period). The CJEU noted that this was a matter for the referring court to decide on.

SCHUFA's retention of insolvency records was in line with a code of conduct, which had been approved by the competent DPA. However, the CJEU noted that a code of conduct could not have the effect of making lawful something which is actually unlawful.

Right to object to processing and erasure

The CJEU confirmed that the right to erasure under Article 17 applies where personal data has been unlawfully processed. It would, therefore, apply to the retention of data beyond the six months during which the data was available in the public insolvency register.

The CJEU restated the individual's right to object to processing under Article 21, where the lawful basis for processing is Article 6(1)(e) or (f), noting that after a data subject has objected, the controller must cease processing the personal data unless it can demonstrate compelling legitimate grounds to continue processing which override the data subject's interests. If the controller fails to provide this proof, the data subject can also ask for the data to be erased under Article 17.

Article 21 provides that a data subject has the right to object "on grounds relating to his or her particular situation." Interestingly, the CJEU does not reference this phrase; instead it seemingly rewrites the GDPR to describe a general right that applies in all situations.

Individuals have a right to a full judicial review of the decision of a DPA

Lastly, the DPA for Land Hessen had argued that UF and AB did not have the right to a full judicial review of the authority's decision. Instead, the authority argued the GDPR grants individuals a more limited right to confirm that the DPA has handled the complaint, investigated it, and communicated the outcome to the complainant. The CJEU rejected this, concluding that data subjects have the right to a full judicial review of a decision by a DPA.
