CJEU RULING IN THE MATTER OF “SCHUFA” NOT ONLY AFFECTS CREDIT SCORING (C-634/21)

25.04.2024

In December 2023, the Court of Justice of the European Union (CJEU) decided yet another case whose effects will reach well beyond its core facts. The ruling will likely affect not only credit scoring agencies but every sector and every controller that uses some form of probability analysis to predict, and ultimately influence, data subjects’ decisions or to otherwise legally affect them.

1. CASE FACTS

SCHUFA Holding AG (“SCHUFA”) is a private credit agency based in Wiesbaden, Germany. Its core business is to assess the creditworthiness of natural persons and legal entities on the basis of millions of data points and to provide its contractual partners with a prognosis of a person’s probable future payment behavior in the form of a ‘Score’. The establishment of Scores (‘scoring’) rests on the assumption that, by assigning a person to a group of other persons with comparable characteristics who have behaved in a certain way, similar behavior can be predicted.
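
To illustrate this group-based logic, the following is a minimal, purely hypothetical sketch in Python. SCHUFA refused to disclose its actual calculation method in the proceedings, so the characteristics, groups, repayment rates, and names below are invented for illustration only.

```python
# Hypothetical sketch of group-based scoring; SCHUFA's real method is not
# public, so all characteristics, groups, and rates here are invented.

from dataclasses import dataclass

@dataclass(frozen=True)
class Person:
    age_band: str     # e.g. "30-39"
    region: str       # e.g. "Hessen"
    income_band: str  # e.g. "medium"

# Invented historical repayment rates per group of comparable persons.
GROUP_REPAYMENT_RATES = {
    ("30-39", "Hessen", "medium"): 0.92,
    ("18-29", "Hessen", "low"): 0.71,
}

def score(person: Person, fallback: float = 0.80) -> float:
    """Probability value ('Score'): the repayment rate observed in the
    group of persons with comparable characteristics."""
    group = (person.age_band, person.region, person.income_band)
    return GROUP_REPAYMENT_RATES.get(group, fallback)

print(score(Person("30-39", "Hessen", "medium")))  # 0.92
```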

The case originated with a natural person (“data subject”) being refused a loan by a bank, presumably due to a negative credit score provided by SCHUFA. The data subject therefore made a request to SCHUFA under Art 15 and Art 17 GDPR to obtain information on the personal data stored about him and to have some of his personal data, which was allegedly incorrect, erased. In response, SCHUFA informed the data subject of his score and outlined, in broad terms, the methods for calculating scores, but refused to disclose how the calculation was made, invoking ‘trade secrecy’. Lastly, SCHUFA stated that it limited itself to sending information to its contractual partners and that it was those contractual partners who made the actual contractual decisions. The data subject then filed a complaint with the competent supervisory authority, the Hessian Commissioner for Data Protection and Freedom of Information (Hessischer Beauftragter für Datenschutz und Informationsfreiheit), which likewise did not respond as the data subject had hoped, so he appealed to the Administrative Court of Wiesbaden.

The Administrative Court of Wiesbaden found that the question of whether the automated establishment of a probability value concerning a data subject’s ability to obtain a loan in the future already constitutes a decision based solely on automated processing, including profiling, within the meaning of Art 22 GDPR was one of fundamental legal importance. It therefore issued a request for a preliminary ruling to the CJEU under Art 267 of the Treaty on the Functioning of the European Union (“TFEU”) to clarify how Art 22 GDPR is to be interpreted in this case.

2. HELD

According to the decision of the CJEU, the applicability of Art 22 GDPR is subject to three cumulative requirements, namely:

  1. there must be a ‘decision’;
  2. the decision must be based solely on automated processing, including profiling; and
  3. it must produce legal effects concerning the data subject or similarly significantly affect him or her.
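
Expressed schematically, the test is conjunctive: Art 22 GDPR applies only if all three requirements are met at the same time. The following minimal sketch is purely illustrative; the function and parameter names are invented, not terms taken from the ruling.

```python
# Purely illustrative sketch of the cumulative (conjunctive) Art 22 GDPR
# test; all names are invented for illustration.

def art_22_gdpr_applies(is_decision: bool,
                        based_solely_on_automated_processing: bool,
                        legal_or_similarly_significant_effects: bool) -> bool:
    """Art 22 GDPR applies only if all three requirements hold."""
    return (is_decision
            and based_solely_on_automated_processing
            and legal_or_similarly_significant_effects)

# Per the CJEU's findings, SCHUFA's scoring meets all three requirements.
print(art_22_gdpr_applies(True, True, True))  # True
```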

The term ‘decision’ is not defined by the GDPR. Nevertheless, the CJEU states that it is apparent from the wording of Art 22 GDPR and Recital 71 GDPR that ‘decision’ refers not only to acts which produce legal effects concerning the person at issue but also to acts which similarly significantly affect that person. The broad notion of a decision within the meaning of Art 22 GDPR therefore includes the result of calculating a person’s creditworthiness in the form of a probability value concerning that person’s ability to meet payment commitments in the future.

According to the CJEU, profiling within the meaning of Art 4 no. 4 GDPR is a sub-case of Automated Decision-Making (“ADM”) pursuant to Art 22 GDPR. SCHUFA’s activity clearly falls under profiling within the meaning of Art 4 no. 4 GDPR, so the second requirement of Art 22 GDPR is met as well.

As regards, thirdly, the condition that the decision must produce ‘legal effects’ concerning the person at issue or affect him or her ‘similarly significantly’, the CJEU held that it was apparent from the very wording of the first question referred by the Administrative Court of Wiesbaden that the action of the third party to whom SCHUFA transmits a Score draws strongly on that value. Thus, according to the findings of the referring court, where a consumer submits a loan application to a bank, an insufficient Score leads to the refusal of the loan in almost all cases. The establishment of a Score by a credit agency must therefore be qualified as a decision producing legal effects concerning the data subject or similarly significantly affecting him or her.

However, the CJEU did not leave it at that: it also ruled that automated decision-making is always associated with profiling and vice versa. Otherwise, according to the CJEU, there would be a risk of circumventing Art 22 GDPR and, consequently, a legal loophole, because the use of a third-party score to make one’s own decisions would then be excluded from the scope of Art 22 GDPR.

What the CJEU has not conclusively ruled on is the question of when a decision ‘similarly significantly’ affects a data subject. But since the ruling is framed very broadly and banks’ loan decisions depend heavily on scores provided by credit agencies, this question will likely come up sooner or later.

3. CONSEQUENCES OF THE RULING

The ruling of the CJEU in the SCHUFA case affects not only credit scoring but also other areas, such as certain marketing and e-commerce practices as well as artificial intelligence (“AI”).

Art 22 GDPR establishes a mandatory ban on such processing of personal data as soon as the three requirements mentioned above are met, subject to an exhaustive list of exceptions in its paragraph 2. All of these exceptions require appropriate measures, laid down either by law or by the controller, to safeguard the rights and freedoms as well as the legitimate interests of data subjects. This means that every practice that analyzes consumer behavior (profiling and therefore ADM) and is connected to a decision producing legal effects concerning the data subject or similarly significantly affecting him or her is, in principle, not allowed. As it remains open when the threshold of significance within the meaning of Art 22 GDPR is met, this could also mean that customer segmentation falls within the scope of Art 22 GDPR.

Customer segmentation, for example, tries to predict the purchasing behavior of different groups of customers based on the analysis of personal aspects, in order to adapt marketing measures accordingly. This results in a kind of probability value (Score) indicating how likely it is that a person in a certain customer group will buy a certain product or service, which ultimately may (significantly?) affect that person’s purchasing decisions.
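
For illustration, here is a hedged, minimal sketch of how such segmentation could turn observed group behavior into a propensity Score. All data, segment rules, and names are invented; real systems typically cluster over far more attributes.

```python
# Invented sketch of customer segmentation: the purchase rate of a
# customer's segment doubles as that person's propensity "Score".

from collections import defaultdict

# (age, monthly_visits, bought_product) — invented sample data.
CUSTOMERS = [
    (25, 12, True), (31, 3, False), (27, 10, True),
    (45, 2, False), (52, 1, False), (29, 8, True),
]

def segment(age: int, visits: int) -> str:
    """Assign a customer to a coarse segment based on personal aspects."""
    if age < 35 and visits >= 8:
        return "young_frequent"
    if age < 35:
        return "young_occasional"
    return "older"

totals, buyers = defaultdict(int), defaultdict(int)
for age, visits, bought in CUSTOMERS:
    seg = segment(age, visits)
    totals[seg] += 1
    buyers[seg] += bought  # bool counts as 0/1

scores = {seg: buyers[seg] / totals[seg] for seg in totals}
print(scores)  # {'young_frequent': 1.0, 'young_occasional': 0.0, 'older': 0.0}
```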

This could apply in a similar way to affinity analysis. Here, a large amount of data is analyzed for correlations between different factors. It is used in the diagnosis of illnesses, but also in shopping basket analysis, which ultimately aims to influence buying and usage behavior. This procedure, too, works with probability values, just like customer segmentation and credit scoring, and may significantly affect a person’s buying behavior if marketing measures are adapted solely on the basis of such a Score.
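
Again purely for illustration, a minimal sketch of a shopping-basket affinity analysis, computing support and confidence for item pairs (the building blocks of association rules). The baskets are invented; real analyses run over large transaction datasets.

```python
# Invented sketch of shopping-basket affinity analysis: support and
# confidence of item pairs as simple probability values.

from collections import Counter
from itertools import combinations

BASKETS = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "cereal"},
    {"bread", "milk"},
]

pair_counts = Counter()
item_counts = Counter()
for basket in BASKETS:
    item_counts.update(basket)
    pair_counts.update(combinations(sorted(basket), 2))

n = len(BASKETS)
for (a, b), count in pair_counts.items():
    support = count / n                  # P(a and b bought together)
    confidence = count / item_counts[a]  # P(b | a), a probability value
    print(f"{a} -> {b}: support={support:.2f}, confidence={confidence:.2f}")
```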

Finally, it is worth mentioning artificial intelligence in this context. Since AI always works with probabilities, it is often used to analyze personal aspects and behavior, i.e., profiling within the meaning of Art 4 no. 4 GDPR. This kind of AI therefore has to comply with Art 22 GDPR if its results lead to an automated decision producing legal effects concerning the data subject or similarly significantly affecting him or her. Moreover, according to the latest draft of the AI Act, an AI system shall always be considered high-risk if it performs profiling of natural persons, and it is then subject to strict transparency requirements. Against the background of the SCHUFA ruling and the above, this will likely become a major open question in the implementation of the AI Act in the coming years.


Article provided by INPLP member: Stephan Winklbauer (aringer herbst winklbauer rechtsanwälte, Austria)

