In December 2023, the Court of Justice of the European Union (CJEU) delivered its first judgment on automated decision-making under the GDPR. The case concerns SCHUFA, a German private credit information agency that provides creditworthiness assessments. Based on data analysis and statistical tools, it calculates the probability that a given person will behave in a certain way in the future (e.g., whether he or she will be able to repay a debt). This result (the score) is then transferred, for example, to banks, which, based on such scoring, decide whether or not to grant a loan. The ruling confirms a broad interpretation of what constitutes a decision made in an automated manner and sets out several requirements that should be fulfilled to guarantee the rights of data subjects in situations where this type of solution is used. Below I present some of the most important conclusions that emerge from reading the judgment.
Dr. Joanna Mazur, DELab UW analyst
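To make the scoring mechanism described above more concrete, the sketch below shows, in Python, how a statistical model can turn personal data into a single probability value (the "score") that is then passed on to a bank. It is a purely illustrative assumption: SCHUFA's actual model is not public, and the features, weights, and threshold used here are invented for the sake of the example.

```python
# Illustrative sketch only: a logistic model that turns invented personal data
# into a probability of repayment. Nothing here reflects SCHUFA's real method.
import math

def credit_score(features: dict[str, float], weights: dict[str, float], bias: float) -> float:
    """Return a probability value between 0 and 1 via a logistic model."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical applicant data and hypothetical model parameters.
applicant = {"income_thousands": 4.2, "existing_debts": 1.0, "late_payments": 0.0}
weights = {"income_thousands": 0.6, "existing_debts": -0.8, "late_payments": -1.5}

score = credit_score(applicant, weights, bias=-0.5)
print(f"probability of repayment: {score:.2f}")  # this value would be transmitted, e.g., to a bank
```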
The case before the CJEU concerned a situation in which a third party refused to grant a loan to OQ. OQ challenged the decision, arguing that it had been made using incorrect personal data, and demanded their removal. In the proceedings that arose from this situation, the national court decided to refer questions to the CJEU regarding how the provisions of the General Data Protection Regulation (GDPR) relating to automated decision-making should be understood:
Is Article 22(1) of the [GDPR] to be interpreted as meaning that the automated establishment of a probability value concerning the ability of a data subject to service a loan in the future already constitutes a decision based solely on automated processing, including profiling, which produces legal effects concerning the data subject or similarly significantly affects him or her, where that value, determined using personal data of the data subject, is transmitted by the controller to a third-party controller and the latter draws strongly on that value for its decision on the establishment, implementation or termination of a contractual relationship with the data subject? (para. 27)
The CJEU’s response contains several interesting elements. The first group concerns the assessment of whether we are dealing with automated decision-making at all, and the second concerns the broader context of the GDPR provisions on this issue. Below I discuss them in these two groups.
Scoring as automated decision-making
The CJEU analyzed the extent to which the situation should be subject to the rules on automated decision-making under Art. 22 GDPR. For this purpose, it examined the three conditions that must be jointly met for Art. 22 to be applicable (para. 43). The first one, namely whether we are dealing with a decision at all, was confirmed by the CJEU. The Court pointed to the need for a broad understanding of the word decision: The concept of ‘decision’ within the meaning of Article 22(1) of the GDPR is thus […] capable of including several acts which may affect the data subject in many ways since that concept is broad enough to encompass the result of calculating a person’s creditworthiness in the form of a probability value concerning that person’s ability to meet payment commitments in the future (para. 46).
The second requirement is that such a decision be based solely on automated processing, including profiling. The CJEU resolved this largely by recognizing that the situation at issue constitutes an example of profiling (para. 47). Additionally, the Court referred to the wording of the question referred for a preliminary ruling, treating the way it was formulated as confirmation of the automated nature of the data processing under assessment. Thus, the CJEU avoided considering the extent to which the final decisions on whether or not to grant a loan are based ‘solely’ on automated processing and focused on the very fact that the scoring itself is automated.
The third condition, which is the need for the decision to produce legal effects or similarly significantly affect a given person, was also found to be met by the CJEU: It follows that, in circumstances such as those at issue in the main proceedings, in which the probability value established by a credit information agency and communicated to a bank plays a determining role in the granting of credit, the establishment of that value must be qualified in itself as a decision producing vis-à-vis a data subject ‘legal effects concerning him or her or similarly significantly [affecting] him or her’ within the meaning of Article 22(1) of the GDPR (para. 50).
The CJEU also emphasized that interpreting these provisions requires treating the relevant regulatory framework as a whole and ensuring that the rights of the data subject are protected where scoring is not performed by the institution that issues the final decision (para. 61). Hence, the very act of scoring should be treated as a ‘decision’ within the meaning of the GDPR and be subject to the provisions on automated decision-making.
This approach is also extremely important in the context of the Polish legal order, in which banking law provides for a right to an explanation of the assessment of the applicant’s creditworthiness. Still, it is formulated with reference to the institutions granting credit or loans: Banks and other institutions statutorily authorized to grant loans shall, at the request of a natural person, legal person, or organizational unit without legal personality (provided it has legal capacity) applying for a loan, provide a written explanation of their assessment of the applicant’s creditworthiness (Art. 70a). The judgment in the SCHUFA case indicates that even if such an assessment were carried out by another company, the data subject should still be able to invoke the provisions of the GDPR regarding automated decision-making.
Requirements for automated decision-making in the GDPR
Moreover, the ruling contains three additional elements worth paying attention to. First, the judgment confirms that Art. 22 of the GDPR should be read as a fundamental prohibition of the use of automated decision-making, to which the GDPR provides three exceptions (para. 52: That provision lays down a prohibition in principle, the infringement of which does not need to be invoked individually by such a person.).
Therefore, in order to use automated decision-making, it must be possible to invoke one of the exceptions foreseen in the GDPR. However, even in these situations, the GDPR requires suitable measures to safeguard the rights and freedoms and legitimate interests of the persons whose data are processed. The CJEU also refers to recital 71, which contains a catalog of examples of such measures: In the light of recital 71 of the GDPR, such measures must include, in particular, the obligation for the controller to use appropriate mathematical or statistical procedures, implement technical and organisational measures appropriate to ensure that the risk of errors is minimised and inaccuracies are corrected, and secure personal data in a manner that takes account of the potential risks involved for the interests and rights of the data subject and prevent, among other things, discriminatory effects on that person. Those measures include, moreover, at least the right for the data subject to obtain human intervention on the part of the controller, to express his or her point of view, and to challenge the decision taken in his or her regard. (para. 66).
The Court first lists the measures mentioned in the recital (such measures must include, in particular…) and then, when referring to the measures mentioned in Art. 22 itself, says that those measures include, moreover. Hence, the second interesting suggestion resulting from the judgment is that, due to the way this paragraph is worded, even the measures listed only in the recital of the GDPR, and not in the text of Art. 22, should be implemented by an entity that uses automated decision-making.
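As a loose illustration of what the safeguards listed in recital 71 could look like in practice, the sketch below checks whether automated approvals differ markedly between two groups, one very simplified way of looking for the "discriminatory effects" the recital mentions. The data, group labels, and tolerance threshold are hypothetical and are not taken from the judgment or from any real procedure.

```python
# Illustrative sketch of a simple check for discriminatory effects:
# compare approval rates of an automated procedure between two hypothetical groups.
from collections import defaultdict

def approval_rates(decisions: list[tuple[str, bool]]) -> dict[str, float]:
    """decisions: (group label, approved?) pairs -> approval rate per group."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += ok
    return {group: approved[group] / totals[group] for group in totals}

# Invented decision log.
decisions = [("group_a", True), ("group_a", True), ("group_a", False),
             ("group_b", True), ("group_b", False), ("group_b", False)]
rates = approval_rates(decisions)
print(rates)

# A large gap between groups would be a signal to review the model before
# relying on it; the 0.2 tolerance below is a hypothetical choice.
if max(rates.values()) - min(rates.values()) > 0.2:
    print("warning: approval rates differ markedly between groups")
```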
The third element of the judgment that seems important is that the CJEU emphasizes the need to meet the conditions set out in Arts. 5 and 6 of the GDPR (the provisions on the principles of processing and the legal basis for processing) where Member States introduce in their national laws the possibility of automated decision-making (an exception to the general prohibition, available where the law of a Member State provides for it). The CJEU states: Thus, if the law of a Member State authorises, under Article 22(2)(b) of the GDPR, the adoption of a decision solely based on automated processing, that processing must comply not only with the conditions set out in the latter provision and in Article 22(4) of that regulation, but also with the requirements set out in Articles 5 and 6 of that regulation (para. 68). The CJEU also emphasized that the legal basis provided for by the law of a Member State cannot disregard the Court’s case law on how given issues should be understood: for example, it cannot assume in advance that the interests of the entity processing the data outweigh the interests of the data subject (paras. 69-70) and, on the basis of such an assumption, provide in advance for the possibility of using automated decision-making.
Conclusion
In the SCHUFA case, the CJEU quite clearly sided with a broad interpretation of the rights of data subjects related to automated decision-making: not only in terms of the interpretation of the requirements determining what kinds of decisions are covered by Art. 22 of the GDPR, but also concerning the systemic importance of the prohibition foreseen in this provision.
More information:
Post about the case on noyb.eu.
Post about the case on verfassungsblog.de.
Case on the CJEU’s website.
The translation to English was performed using automated translation and revised by the author.