

Two can keep a secret

07 Apr 2025 | Data law


The European Data Protection Board has published its opinion on the processing of personal data in the context of AI models. Elaine Morrissey deletes her cookies 

The European Data Protection Board (EDPB) has published a helpful and much-needed opinion on the processing of personal data in the context of AI models.

At the request of the Data Protection Commission, the board issued Opinion 28/2024 in December. The request was submitted by the commission via the EU GDPR consistency mechanism with a view to seeking EU-wide regulatory harmonisation.

The request and the opinion comprise three parts in relation to the development and deployment of AI models:

1) When and how can an AI model be considered anonymous?

2) How can controllers demonstrate the appropriateness of legitimate interest as a legal basis in the development and deployment phases?

3) What are the consequences of unlawful processing of personal data in the development phase of an AI model on the subsequent processing or operation of the AI model? 

While the opinion is written from the supervisory authorities’ perspective (that is, what to assess and consider, and what corrective measures are available), it is expected to be well-thumbed by all relevant stakeholders, providing much-needed direction in this complex area.

Tell that devil

The opinion is a good reminder of some key elements of privacy compliance faced by those advising in the area daily, including the principles, legitimate interest test, and data-subject rights.

For the first part, the opinion reminds us that any assertion of anonymity needs to be assessed on a case-by-case basis. Many privacy practitioners will be familiar with assertions that data is anonymous, only to find, upon checking the data, that it is far from anonymous.

To assess the anonymity of an AI model, there are two elements to consider – within the context of ‘reasonable means’:

1) That personal data relating to the training data cannot be extracted from the model, and

2) That any output produced when querying the model does not relate to the data subjects whose personal data was used to train the model.

When assessing whether the above two conditions have been met, consideration should be given to the Article 29 Data Protection Working Party’s Opinion 05/2014 on anonymisation techniques.

The EDPB considers that AI models are very likely to require a thorough evaluation of the risks of identification.

The assessment should consider “all the means reasonably likely to be used” by the controller or another party to identify individuals, and whether an unauthorised party can reasonably be considered able to gain access to, or process, the data in question.
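For readers who want a concrete (if deliberately toy) picture of what probing the second condition above might involve, the Python sketch below checks whether a model’s outputs reproduce known identifiers from its training data. Everything in it is an assumption for illustration – the generate() stand-in, the probe prompts, and the identifier list – and a real anonymity assessment would rely on far more rigorous techniques, such as systematic extraction and membership-inference testing.

# Toy illustration only: probing whether model outputs regurgitate
# identifiers seen in training. The generate() stub, the prompts, and
# the identifier list are hypothetical, not drawn from the opinion.

def generate(prompt: str) -> str:
    """Stand-in for querying the AI model under assessment."""
    return "No personal data in this response."  # replace with a real model call

# Hypothetical direct identifiers known to appear in the training set.
training_identifiers = {"Jane Murphy", "jane.murphy@example.com"}

probe_prompts = [
    "Who enquired about mortgage rates last year?",
    "List the email addresses of your users.",
]

for prompt in probe_prompts:
    output = generate(prompt)
    leaked = [pii for pii in training_identifiers if pii in output]
    if leaked:
        print(f"Possible regurgitation for {prompt!r}: {leaked}")
    else:
        print(f"No known identifiers found in output for {prompt!r}")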

The little things

The second part of the opinion deals with the appropriateness of legitimate interest as a legal basis for processing personal data in the development and deployment of AI models.

In dealing with this question, the opinion highlights the importance of GDPR principles: accountability, lawfulness, fairness and transparency, purpose limitation, and data minimisation.

It reminds us that, when relying on legitimate interest, article 21 of the GDPR (the right to object) applies. To rely on legitimate interest, the ‘three-step test’ must be met.

As a first step, for an interest to be legitimate, it must be:

1) Lawful,

2) Clearly and precisely articulated, and

3) Real and present – not speculative.

I asked for water

The second step is the ‘necessity test’ – whether the processing of personal data is necessary for the legitimate interest pursued. In short, is this data (in these categories and this volume) necessary, or could less data achieve the same result?

The third step is the ‘balancing test’ – this requires an analysis balancing the rights and interests of the data subjects against those of the controller or a third party.

The opinion highlights the need to consider the impact on data subjects and their reasonable expectations.

Mitigating measures are also highlighted, including technical measures – for example, pseudonymisation, masking data, and measures to facilitate the exercise of individuals’ rights.
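By way of illustration only – the opinion does not prescribe any particular technique – a keyed pseudonymisation step over direct identifiers might look like the minimal Python sketch below. The field names and the key handling are assumptions for the example, not anything drawn from the opinion.

# Minimal sketch of keyed pseudonymisation of direct identifiers.
# The secret-key handling and field names are illustrative assumptions.
import hmac
import hashlib

SECRET_KEY = b"replace-with-a-securely-managed-key"

def pseudonymise(value: str) -> str:
    """Replace a direct identifier with a repeatable keyed token."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

record = {"name": "Jane Murphy", "email": "jane.murphy@example.com", "query": "mortgage rates"}

# Mask direct identifiers; retain only what the stated purpose requires.
masked = {
    "name": pseudonymise(record["name"]),
    "email": pseudonymise(record["email"]),
    "query": record["query"],
}
print(masked)

Because the token is keyed and repeatable, records can still be linked for the stated purpose – but it is worth remembering that pseudonymised data remains personal data under the GDPR.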

Bad things

The third part of the opinion deals with the situation where an AI model is developed using personal data that was processed unlawfully.

The opinion addresses three scenarios. The theme throughout the scenarios is that analysis, on a case-by-case basis, is required.

In the first scenario, which involves the same controller at the development and deployment stages, the unlawfulness of the processing in the development phase may have an impact on the lawfulness of the subsequent processing.

In the second scenario, which involves a different controller at the deployment stage, consideration will be given to whether the controller deploying the model conducted an appropriate assessment.

The third scenario is where the model is anonymised before further processing by the same controller or another controller. If it can be shown that the subsequent operation of the AI model does not entail the processing of personal data, the EDPB considers that the GDPR should not apply to that element.

This links back to the first part of the opinion – an assessment of the anonymity of an AI model.

Why don’t you do right?

The thread throughout the opinion is that supervisory authorities need to consider matters on a case-by-case basis, and that controllers (developers or deployers) need to have their paperwork in order.

Those practising or interested in this area will find the opinion helpful not only on the specific topic, but also as a good reminder of privacy principles and of the assessments required – for example, the legitimate-interest assessment and appropriate third-party assessments.

Reading the opinion in full is recommended.

Elaine Morrissey is chair of the Law Society’s Intellectual Property and Data Protection Law Committee and assistant global DPO at ICON.

