In a recent ruling, the Court of Justice of the European Union (CJEU) addressed the right of access to information in cases of automated decision-making and profiling under the GDPR. The case, CK v Magistrat der Stadt Wien and Dun & Bradstreet Austria GmbH (Case C-203/22), examined the extent to which individuals are entitled to receive meaningful information about the logic behind automated decision-making processes, as well as the balance between transparency and the protection of trade secrets.
Facts
The case originated from a dispute in Austria, where CK, an individual, was denied a mobile phone contract due to an unfavorable automated credit assessment conducted by Dun & Bradstreet Austria GmbH (D&B). CK requested access to meaningful information about the logic involved in the profiling process under Article 15(1)(h) GDPR, which requires data controllers to provide individuals with such details. However, D&B refused to disclose the full methodology, citing trade secrets and business confidentiality.
The Austrian Data Protection Authority ruled in favor of CK, ordering D&B to disclose more details about the profiling logic. D&B challenged this decision before the Austrian courts, which ultimately referred several questions to the CJEU concerning the interpretation of GDPR provisions and the interaction with trade secret protection under Directive (EU) 2016/943.
Questions
The referring court sought clarification on the following key issues:
- Scope of “meaningful information” under Article 15(1)(h) GDPR:
  - Whether this includes an exhaustive explanation of the procedure and principles applied in the automated decision-making process, including the factors used to generate a credit score.
  - Whether this includes disclosure of the personal data used, the mathematical formula, the specific value attributed to each of the factors concerned and the influence of each factor.
- Interplay between Article 15(1)(h) GDPR and Article 22 GDPR:
  - Whether the right of access to meaningful information is linked to the right to express one’s view and contest an automated decision under Article 22(3) GDPR.
- Verification of the accuracy of provided information:
  - Whether the GDPR guarantees the right to verify the accuracy of the information provided in response to a data subject’s request.
  - Whether third-party data used in the profiling process must be disclosed in anonymized form to ensure accuracy.
- Balancing data subject rights and trade secret protection:
  - Whether trade secrets can limit the right of access under Article 15(1)(h) GDPR.
  - Whether trade-secret-protected information may be disclosed only to a court or authority for verification, rather than to the data subject.
- Compatibility of Austrian law with the GDPR:
  - Whether national provisions that categorically restrict access to information on the basis of trade secret claims comply with the GDPR.
Findings of the CJEU
Question 1
Extent of “meaningful information” under GDPR
The Court ruled that Article 15(1)(h) GDPR requires controllers to provide information explaining the “procedure and principles actually applied” in the automated decision-making process. This does not require full disclosure of the algorithm or the mathematical formula, but it does require enough relevant detail for the data subject to understand the logic of the profiling. The information must be provided in a “concise, transparent, intelligible, and easily accessible form.”
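To make the distinction concrete, the gap between “the formula itself” and “the procedure and principles actually applied” can be sketched in code. This is purely illustrative: the Court did not endorse any particular format, and the factor names, weights, and values below are invented for the example.

```python
# Hypothetical linear scoring model. In practice the weight set would be
# the trade secret the controller wants to protect; everything here is
# an invented illustration, not D&B's actual methodology.
WEIGHTS = {
    "payment_history": 0.45,
    "outstanding_debt": -0.30,
    "account_age_years": 0.15,
    "recent_inquiries": -0.10,
}

def score(applicant: dict) -> float:
    """Compute the hypothetical credit score as a weighted sum of factors."""
    return sum(WEIGHTS[factor] * value for factor, value in applicant.items())

def explain(applicant: dict) -> list[tuple[str, float]]:
    """Rank the factors by how strongly each influenced THIS applicant's score.

    This discloses which factors mattered and in which direction for one
    decision, without publishing the full weight table for all inputs.
    """
    contributions = {f: WEIGHTS[f] * v for f, v in applicant.items()}
    return sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)

applicant = {
    "payment_history": 0.9,
    "outstanding_debt": 0.6,
    "account_age_years": 2.0,
    "recent_inquiries": 3.0,
}
print(round(score(applicant), 3))          # the score itself
for factor, contribution in explain(applicant):
    print(factor, round(contribution, 3))  # per-factor influence, strongest first
```

On one reading of the judgment, something like the `explain` output (which factors were used and how each pushed the result) is the kind of "meaningful information" the Court has in mind, while the `WEIGHTS` table is the kind of detail that may remain protected, subject to the case-by-case balancing discussed below.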
Question 2
Link to Article 22 GDPR
The right to meaningful information is directly linked to the rights under Article 22(3) GDPR, which allows individuals to challenge an automated decision. Without adequate information, a data subject would be unable to effectively contest the decision or express their viewpoint.
Question 3
Right to verify accuracy
The Court held that data subjects must be able to verify whether the automated decision-making process used accurate data. While the GDPR does not mandate disclosure of third-party data, controllers must ensure transparency regarding their own processing. The CJEU emphasized that without an effective way to verify the accuracy of data, the right of access under the GDPR would be meaningless. In cases where the results of profiling significantly differ from the personal data provided by the data subject, companies must explain how these discrepancies arise.

The ruling also clarified that where third-party data plays a crucial role in profiling, supervisory authorities or courts may be involved to ensure fairness and transparency without violating the rights of those third parties. The aim is to balance data protection rights while maintaining the integrity of business operations. Additionally, the Court stressed that simply providing generic explanations or citing trade secrets is insufficient: organizations must demonstrate how the profiling process led to the decision and ensure that any errors can be identified and rectified.
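The accuracy check the ruling implies can be sketched as a simple comparison: the controller discloses which records it actually fed into the profiling, the data subject compares them against the data they supplied, and any mismatch is a discrepancy the controller must explain. The field names and values below are invented for illustration; the Court prescribed no such mechanism.

```python
def find_discrepancies(provided: dict, used: dict) -> dict:
    """Return fields where the data used in profiling differs from the
    data the subject supplied, or was used without being supplied at all."""
    diffs = {}
    for field, used_value in used.items():
        if provided.get(field) != used_value:
            diffs[field] = {"provided": provided.get(field), "used": used_value}
    return diffs

# Hypothetical example: the subject's own records vs. the profiling input.
subject_data = {"address": "Vienna", "open_accounts": 2}
profiling_input = {"address": "Graz", "open_accounts": 2, "defaults": 1}

print(find_discrepancies(subject_data, profiling_input))
```

Here the mismatched address and the `defaults` entry the subject never supplied are exactly the kind of discrepancies that, under the ruling, the controller would have to account for rather than hide behind a generic explanation.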
Question 4
Trade secret limitations and court intervention
The Court recognized that trade secrets are a legitimate interest but must be balanced against data subject rights. A blanket refusal to provide information based on trade secrecy is not permitted under GDPR. If necessary, trade secret-protected information can be disclosed to a supervisory authority or court, which must then determine whether the balance of interests justifies non-disclosure to the data subject. The CJEU clarified that this does not mean companies must reveal commercially sensitive algorithms but that they should provide a sufficient explanation for individuals to understand the decision-making process and contest unfair outcomes.
Question 5
Austrian law and GDPR compatibility
The Court ruled that the Austrian provision broadly restricting access based on trade secret claims is incompatible with GDPR. Each case must be assessed individually rather than imposing a general exclusion. The CJEU reiterated that national laws cannot undermine the fundamental rights enshrined in GDPR and that an absolute exclusion of access rights in the name of trade secrecy is not permissible.
Conclusion
In my opinion, the CJEU completely missed an opportunity to set a clear, enforceable standard for what “meaningful information” actually means in the context of scoring companies. Instead of adopting the expert-backed criteria suggested by the referring court, like explicitly disclosing key factors, weights, and data points used in scoring, the Court stuck to the broad GDPR principle of transparency, which leaves a lot of room for interpretation. To me, this is a problem because it shifts the burden onto DPAs and courts, who now have to spend time and resources figuring out what level of explanation is enough. Meanwhile, companies will likely take advantage of this vagueness to provide the bare minimum, delaying compliance and avoiding real transparency.

I also think the CJEU’s decision to rely purely on legal principles without bringing in technical expertise shows a disconnect from how things actually work in practice. Scoring algorithms and automated decision-making aren’t just abstract legal concepts: they’re highly technical systems that require a real understanding of data science and machine learning. Without expert input, we end up with rulings that sound reasonable on paper but don’t hold up in the real world.

Moving forward, courts handling cases involving algorithmic decision-making should make technical expertise mandatory, much like how competition law relies on economic analysis. Without this, we’ll continue to see decisions that seem legally sound but fail to offer real protection in practice.