How to Explain Automated Decisions: Recent CJEU Decision and CPPA Rulemaking Offer Insight into ADM Explainability
Authors
Myriah V. Jaworski, Chirag H. Patel, Ali Bloom
Automated decision-making, or ADM, is used in a wide range of contexts that affect individuals — from processing insurance claims and credit scoring to ranking job candidates and offering personalized pricing or targeted advertisements.
Businesses subject to privacy laws, whether the European Union’s General Data Protection Regulation (GDPR) or United States privacy rules like the California Consumer Privacy Act, as amended by the California Privacy Rights Act (CCPA/CPRA), may be required to disclose to individuals the extent to which ADM tools are used to make decisions about them. In certain instances, individuals must be given information about exactly how those decisions were made.
But what does ADM explainability mean in practice? A recent judgment of the Court of Justice of the European Union (CJEU) and recent rulemaking from the California Privacy Protection Agency (CPPA) offer some insight into the ADM disclosure requirement(s).
I. CJEU’s Case C-203/22 Dun & Bradstreet Austria Decision
The GDPR regulates ADM in three express ways:
- Article 22 gives individuals the right not to be subject to a decision based solely on ADM (including profiling) that produces legal or similarly significant effects on the individual.
- Articles 13(2)(f) and 14(2)(g) require the controller to provide individuals with “information about the existence of automated decision-making, including profiling … [and] … meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject” (i.e., in the controller’s privacy notices).
- Article 15(1)(h) entitles individuals to obtain from the controller, in the context of an access request, information about the existence of ADM (including profiling) and “meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject”.
Case C-203/22 Dun & Bradstreet Austria concerned a response to an individual’s Article 15 GDPR access request. Specifically, the CJEU was asked: (1) what “meaningful information about the logic involved” in ADM was required to be disclosed, and (2) whether the controller is required to provide the requester with information that could constitute the organization’s trade secrets (i.e., relevant algorithms, decision logic, or copyright-protected software).
The CJEU ruled as follows:
- If a decision is (i) based solely on automated processing and (ii) significantly affects the individual, then the individual has the right to obtain an explanation of the decision.
- This right entitles the individual to an “explanation of the procedure and principles actually applied in order to use, by automated means, the [individual’s personal data] with a view to obtaining a specific result.”
- This requirement cannot be satisfied either by “the mere communication of a complex mathematical formula”, such as an algorithm, or “by the detailed description of all the steps in automated decision-making, since none of those would constitute a sufficiently concise and intelligible explanation”.
- The information must be provided in a concise, transparent, intelligible and easily accessible form – that is to say, in a way that the individual can understand. Moreover, the complexity of the operations carried out in the context of ADM does not relieve the controller of its duty to provide an explanation about the processing.
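To make that standard concrete, consider how a controller might translate a scoring model’s output into the kind of concise, plain-language explanation the court describes. The Python sketch below is purely illustrative: the factor names, weights, and wording are assumptions of ours, not anything prescribed by the judgment.

```python
# Hypothetical sketch: rendering the main drivers of an automated decision in
# plain language, in the spirit of the CJEU's "concise and intelligible"
# standard. Factor names, weights, and wording are illustrative assumptions.

def explain_decision(outcome: str, factors: dict[str, float], top_n: int = 3) -> str:
    """Summarize the factors that most influenced an automated decision."""
    # Rank factors by the magnitude of their contribution to the outcome.
    ranked = sorted(factors.items(), key=lambda kv: abs(kv[1]), reverse=True)[:top_n]
    lines = [f"Outcome of the automated assessment: {outcome}.",
             "The factors that most influenced this result were:"]
    for name, weight in ranked:
        direction = "weighed against" if weight < 0 else "weighed in favor of"
        lines.append(f"- {name} ({direction} a positive outcome)")
    return "\n".join(lines)

# Example: a hypothetical creditworthiness assessment.
print(explain_decision(
    "credit application declined",
    {"payment history": -0.42, "length of credit history": -0.18, "income": 0.10},
))
```

Note that this communicates the procedure and principles actually applied, not the underlying formula: disclosing the model weights themselves is exactly what the court said does not satisfy the requirement.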
The CJEU noted, however, that the right to personal data is not absolute and must be balanced against other fundamental rights, including the rights of third parties and the controller’s trade secrets and intellectual property.
- Where the information to be provided to an individual under Article 15(1)(h) of the GDPR is likely to result in an infringement of the rights and freedoms of others, including infringement of the controller’s trade secrets, the controller must disclose that information to the competent supervisory authority or court, which must balance the competing rights and interests with a view to determining the extent of the individual’s right of access.
- Where that is the case, the controller must nevertheless explain to the individual the logic behind the ADM — unless such logic is, itself, a trade secret.
II. California Privacy Rights Act ADMT Regulations
In California, the CPPA has issued a series of proposed regulations that govern ADM tools.
While the Agency’s authority to issue these regulations is being challenged by the California Legislature as beyond its statutory authority, those regulations are pending final approval by the Office of Administrative Law (OAL), which has up to 30 business days to review and approve the CPPA’s final rulemaking package. Once the OAL grants approval, the effective date of the proposed regulations will be determined in accordance with California Government Code § 11343.4(b)(3).
With respect to ADM, the proposed regulations impose the following requirements.
Under the new regulations, California consumers have the right to be notified about the use of automated decision-making technology (ADMT). Businesses must provide a “Pre-use Notice” before processing a consumer’s personal information using ADMT for a “significant decision,” for “extensive profiling,” or to train ADMT capable of (a) making a significant decision, (b) establishing individual identity, (c) performing physical or biological identification or profiling, or (d) generating a “deepfake.”
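As a rough illustration of how a business might operationalize this trigger analysis, the Python sketch below enumerates the notice-triggering categories. The names and structure are hypothetical; the regulation’s own text, not this paraphrase, controls.

```python
# Hypothetical sketch of a Pre-use Notice trigger check under the proposed
# regulations. The category names paraphrase the rule text; the enum and
# function are illustrative assumptions, not regulatory language.
from enum import Enum, auto

class ADMTUse(Enum):
    SIGNIFICANT_DECISION = auto()        # using ADMT for a "significant decision"
    EXTENSIVE_PROFILING = auto()         # using ADMT for "extensive profiling"
    TRAIN_SIGNIFICANT_DECISION = auto()  # training ADMT capable of a significant decision
    TRAIN_IDENTITY = auto()              # training ADMT to establish individual identity
    TRAIN_BIO_IDENTIFICATION = auto()    # physical or biological identification or profiling
    TRAIN_DEEPFAKE = auto()              # training ADMT capable of generating a "deepfake"
    OTHER = auto()

def requires_pre_use_notice(use: ADMTUse) -> bool:
    """Return True if the contemplated ADMT use triggers a Pre-use Notice."""
    return use is not ADMTUse.OTHER

assert requires_pre_use_notice(ADMTUse.EXTENSIVE_PROFILING)
```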
Consumers also have the right to opt out of ADMT. If a consumer submits an opt-out request before a business begins processing their data, the business must not proceed with that processing. If the consumer submits the opt-out request after processing has begun and no applicable exception to the opt-out exists, the business must stop processing the consumer’s data as soon as possible, but no later than 15 business days from receiving the request (in line with the opt-out of sale/sharing timeline).
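For illustration only, the 15-business-day deadline might be computed as in the sketch below, which assumes Monday-through-Friday business days and ignores holidays that a production implementation would need to handle.

```python
# Minimal sketch of the opt-out response deadline: no later than 15 business
# days from receipt of the request. Assumes Monday-through-Friday business
# days and ignores holidays, which a real implementation would account for.
from datetime import date, timedelta

def opt_out_deadline(received: date, business_days: int = 15) -> date:
    day, remaining = received, business_days
    while remaining > 0:
        day += timedelta(days=1)
        if day.weekday() < 5:  # Monday=0 .. Friday=4
            remaining -= 1
    return day

print(opt_out_deadline(date(2025, 8, 1)))  # -> 2025-08-22
```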
Additionally, under the proposed regulations, California consumers have the right to access information about how a business uses ADMT on their data. Organizations must give consumers an easy way to request this information.
When responding to access requests, organizations must provide details like the reason for using ADMT, the output of the ADMT regarding the consumer, and a description of how the business used the output to make a decision.
Access request responses should also include information on how the consumer can exercise their CCPA rights, such as filing complaints or requesting the deletion of their data.
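One way to picture the required contents of such a response is as a structured record. The sketch below is a hypothetical payload of our own design; the field names are assumptions, though the substance tracks the proposed rules described above.

```python
# Hypothetical sketch of an ADMT access-request response payload. Field names
# are our own; the substance (purpose, output, how the output was used, and
# rights information) tracks the proposed rules described above.
from dataclasses import dataclass, field

@dataclass
class ADMTAccessResponse:
    purpose: str              # the business's reason for using ADMT
    output: str               # the ADMT's output with respect to the consumer
    how_output_was_used: str  # how the output factored into the decision
    rights_info: list[str] = field(default_factory=lambda: [
        "How to file a complaint with the business or a regulator",
        "How to request deletion of your personal information",
    ])

response = ADMTAccessResponse(
    purpose="Screening rental applications for ability to pay",
    output="Applicant risk score: 62/100",
    how_output_was_used="Scores below 70 were routed to manual review before any denial",
)
```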
Notification of Adverse Significant Decisions
If a business uses ADMT to make a significant decision that negatively affects a consumer—for example, by leading to job termination—the business must send a special notice to the consumer about their access rights regarding this decision.
The notice must include the following elements (see the sketch after this list):
- An explanation that the business used ADMT to make an adverse decision.
- Notification that the business cannot retaliate against the consumer for exercising their CCPA rights.
- A description of how the consumer can access additional information about how ADMT was used.
- Information on how to appeal the decision, if applicable.
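As a rough sketch, those four elements might be assembled into consumer-facing text as follows; the wording, parameter names, and URLs are illustrative assumptions, not regulatory language.

```python
# Minimal sketch assembling the four required elements of an adverse-decision
# notice into consumer-facing text. Wording, parameters, and URLs are
# illustrative assumptions, not regulatory language.
def adverse_decision_notice(decision: str, access_url: str, appeal_url: str | None = None) -> str:
    parts = [
        f"Automated decision-making technology was used in a decision that adversely affected you: {decision}.",
        "We will not retaliate against you for exercising your CCPA rights.",
        f"To learn how ADMT was used in this decision, visit: {access_url}",
    ]
    if appeal_url:  # include appeal instructions where an appeal is available
        parts.append(f"To appeal this decision, visit: {appeal_url}")
    return "\n".join(parts)

print(adverse_decision_notice("termination of employment", "https://example.com/admt-access"))
```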
III. Colorado
The Colorado AI Act requires developers and deployers of “high-risk AI systems” to exercise reasonable care to protect consumers from known or foreseeable risks of “algorithmic discrimination.” This responsibility extends across eight specified contexts in which these systems are used, including education, employment, lending, housing, insurance, health care, legal services, and essential government services. Set to take effect on February 1, 2026, the Act introduces transparency and reporting requirements for both deployers and developers.
For instance, deployers of high-risk AI systems will be required to conduct annual impact assessments. Additionally, both deployers and developers must disclose any known or foreseeable risks of algorithmic discrimination related to the intended use of these systems.
IV. Takeaways on ADM Explainability
As businesses increasingly incorporate ADM systems, ensuring transparency in their processes is critical from both a legal and an ethical standpoint. The recent CJEU ruling and the CPPA’s proposed regulations offer important insights into what explainability means in practice. While organizations are not required under Article 15 of the GDPR to disclose trade secrets, they must balance their legal obligations with the protection of their intellectual property. This balance ensures that individuals are not deprived of their right to understand how their personal data is being used in significant decisions. However, the CJEU makes clear that an organization that chooses not to provide such information to the requesting individual must submit it to a supervisory authority or court for review.
As outlined above, under the GDPR, businesses must offer clear and concise explanations to individuals about the existence of automated decision-making, including the logic behind it, its significance, and the potential consequences of such decisions. This is particularly critical for decisions that have a significant impact on individuals, such as profiling or other automated processing that affects legal rights. Importantly, businesses are encouraged to provide this information in a manner that is accessible and understandable to the general public. Although a generalized privacy notice might not meet all the specifics of an access request, businesses can use these notices as a foundation for compliance, ensuring they communicate transparently from the outset.
From a practical perspective, businesses can significantly reduce the likelihood of disputes or complaints by being proactive in providing transparent information before a decision is made. In addition to drafting clear privacy notices, adopting creative methods to ensure that individuals actually engage with this information (like using pop-ups or just-in-time notices) can help fulfill legal obligations while building trust with consumers.
However, challenges will inevitably arise, especially when individuals feel negatively impacted by ADM decisions. When this happens, businesses must be prepared to respond to access requests under GDPR and, in the case of California, comply with the CPPA’s provisions on opt-out rights and transparency. While robust transparency can reduce the volume of such requests, businesses should be ready to handle them efficiently, ensuring that individuals know how to challenge or appeal adverse decisions.
Ultimately, transparency in ADM is about more than compliance; it’s about building trust with individuals whose data is being used in significant decisions. By focusing on clarity, accessibility, and proactive communication, businesses can navigate the complexities of ADM explainability and foster more positive relationships with consumers.
This publication is intended for general informational purposes only and does not constitute legal advice or a solicitation to provide legal services. The information in this publication is not intended to create, and receipt of it does not constitute, a lawyer-client relationship. Readers should not act upon this information without seeking professional legal counsel. The views and opinions expressed herein represent those of the individual author only and are not necessarily the views of Clark Hill PLC. Although we attempt to ensure that postings on our website are complete, accurate, and up to date, we assume no responsibility for their completeness, accuracy, or timeliness.