On February 18, 2026, the European Data Protection Board (“EDPB”) published its Report on Stakeholder Event on Anonymisation and Pseudonymisation of 12 December 2025 (the “Report”). The Report summarises feedback from a remote stakeholder event convened to inform the EDPB’s ongoing work on Guidelines 01/2025 on Pseudonymisation (version for public consultation available here) and forthcoming guidance on anonymisation. The event gathered input from 115 participants spanning industry, NGOs, academia, law firms, and public sector bodies.

The objective of the Report is to capture stakeholder insights on how the General Data Protection Regulation (“GDPR”) applies to anonymisation and pseudonymisation, particularly following the Court of Justice of the European Union’s (“CJEU”) judgment in EDPS v SRB (C‑413/23 P). (See our previous blog post here.)

Although the Report does not provide definitive guidance, it identifies areas where stakeholders believe further clarification is needed. Some interesting discussion points included:

1. How to conduct contextual, actor‑specific identifiability assessments.

According to the Report, participants disagreed on how identifiability should be assessed from the perspective of different actors (e.g., controllers, processors, joint controllers, different organisational units in intra-group/company data sharing) and asked for clearer methodological direction. Some reportedly argued for assessing identifiability solely from the processor’s perspective, suggesting data could be considered “anonymous” if the controller applies effective pseudonymisation and confirms that the data cannot be re-identified by the processor. Others maintained that processors process data on behalf of the controller, and therefore the controller’s perspective must always apply. Many participants also asked for clarity on key issues such as whether data processing agreements are required if the receiving party has no means reasonably likely to be used to identify the data subjects, and whether contractual obligations prohibiting re-identification may suffice where controllers have limited visibility into processors’ identification methods.

Participants also requested guidance on how to approach complex or high‑risk data types in identifiability assessments and, in particular, called for sector‑specific guidance for research and health data (including clinical trials) and online advertising. In the online advertising context, the Report noted that debate persisted over whether online identifiers constitute “personal data”—with some asking the EDPB to reconsider the “singling out” criterion and others insisting that identifiers enabling targeted action on individuals are inherently personal data.

2. Specific GDPR provisions causing difficulties.

According to the Report, participants noted that questions around identifiability intersect with a range of GDPR provisions, including Articles 6 (legal bases for processing), 28 (controller–processor obligations), 32–34 (security of processing and breach notification), Chapter III (data subject rights), Chapter V (rules on international transfers), and Article 11 (processing where identification is not required). Some participants suggested revisiting the EDPB’s position on Article 11 as expressed in its pseudonymisation guidelines (available here), which they view as overly restrictive. Article 11(2) relieves controllers of certain data subject rights obligations where the controller can demonstrate that it is not in a position to identify the data subject. According to the Report, participants also debated whether a separate legal basis under Article 6 is needed for the transmission of pseudonymised data or the act of anonymisation itself.

3. How onward transmissions may affect identifiability.

Stakeholders raised questions about how actual or foreseeable onward disclosures can change identification risks and the implications for initial controllers. According to the Report, many participants agreed that “potential transmissions” should refer to actual or foreseeable transmissions, not purely theoretical ones. Stakeholders agreed on a need for greater legal certainty but had diverging views on how to achieve it – e.g., through a toolbox, a clear list of criteria for identifiability assessment, a list of factors and measures, or a principles-based assessment. The Report noted that the burden of proof issue attracted significant debate: some noted that transmitting parties cannot be expected to know the full capabilities of every entity in the receiving chain, while others emphasised the controller’s accountability obligations under the GDPR.

4. Assessing “means reasonably likely to be used” (“MRLTBU”) to identify individuals.

Stakeholders sought guidance on how to evaluate MRLTBU, identifying three complementary categories of measures: legal (including contractual), organisational, and technical. The Report notes that several participants drew attention to helpful guidance contained in Recital 26 GDPR. Some participants also stressed that the assessment should factor in, for example, auxiliary datasets, feasibility, cost, and technological capability. Views diverged on whether purely hypothetical threats—including illegal acts such as hacking—should be included in the assessment. According to the Report, stakeholders also asked for more detailed guidance on the use of technical measures, such as concrete examples of privacy-enhancing technologies (“PETs”) that can be relied upon. Participants also requested guidance on appropriate measures for key management (e.g., for encryption or tokenisation) and resistance to quantum computing.
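To make the category of “technical measures” concrete, the sketch below illustrates one commonly cited approach: keyed tokenisation, in which direct identifiers are replaced with HMAC tokens derived from a secret held only by the controller, so a recipient without the key cannot recompute or reverse the mapping. This is a simplified illustration assembled by us, not a technique endorsed by the Report; all names and data in the example are hypothetical.

```python
import hmac
import hashlib

# Hypothetical controller-held secret. In practice this would live in a
# key management system, kept separately from the pseudonymised dataset
# (cf. the "additional information" kept separately under Article 4(5) GDPR).
SECRET_KEY = b"controller-held-secret"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a deterministic HMAC-SHA256 token."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical record: the direct identifier (email) is replaced with a token
# before sharing; the recipient sees only the token and the payload.
record = {"email": "alice@example.com", "diagnosis": "J45"}
shared = {"token": pseudonymise(record["email"]), "diagnosis": record["diagnosis"]}
```

Because the tokens are deterministic, the controller can still link records over time, while a recipient without the key cannot feasibly invert the HMAC; the identifiability assessment for that recipient then turns on the other factors discussed above (auxiliary datasets, cost, capability).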

5. When pseudonymised data should be considered personal data for a given recipient.

Stakeholders highlighted difficulties in identifying all potential recipients and understanding their re‑identification capabilities, particularly in multi‑layered environments such as research consortia or online advertising. According to the Report, several participants emphasised that a controller’s lack of knowledge about recipient capabilities should not serve as a basis for overlooking risk (i.e., data should be presumed personal for the recipient unless anonymity is proven and documented). Participants also discussed complexities related to the data itself, noting that certain types of data—such as genetic or highly technical datasets—may carry different identifiability risks depending on the auxiliary information available to each recipient. In addition, participants expressed varying views on which technical and organisational measures a pseudonymising controller can implement to reduce re-identification risk. Stakeholders noted that contractual obligations are useful but may be insufficient where parties have different technical capabilities or levels of influence. The importance of robust audit frameworks, including third-party audits and regular re-evaluation every 2–3 years, was also emphasised.

Next Steps

The Report outlines the complexities of anonymisation and pseudonymisation under the GDPR and relevant case law. Although the Report does not provide definitive guidance, it identifies the issues that are likely to be addressed in forthcoming EDPB guidance. Both sets of guidelines—on anonymisation and pseudonymisation—are listed in the EDPB’s Work Programme 2026-2027, adopted February 11, 2026. (Note, however, that both guidelines were listed in the EDPB’s Work Programmes in previous years as well.) During the event, a representative of the Italian Garante also invited stakeholders to come forward with technical examples of what effective pseudonymisation looks like, so there may be additional guidance on that front in the future.

Notably, the definition of “personal data” under the GDPR continues to be debated as part of the European Commission’s digital omnibus proposal—so the discussion will continue (and amplify) in the coming months.

Sam Jungyun Choi

Recognized by Law.com International as a Rising Star (2023), Sam Jungyun Choi is an associate in the technology regulatory group in Brussels. She advises leading multinationals on European and UK data protection law and new regulations and policy relating to innovative technologies, such as AI, digital health, and autonomous vehicles.

Sam is an expert on the EU General Data Protection Regulation (GDPR) and the UK Data Protection Act, having advised on these laws since they started to apply. In recent years, her work has evolved to include advising companies on new data and digital laws in the EU, including the AI Act, Data Act and the Digital Services Act.

Sam’s practice includes advising leading companies in the technology, life sciences and gaming sectors on regulatory, compliance and policy issues relating to privacy and data protection, digital services and AI. She advises clients on the design of new products and services, preparing privacy documentation, and developing data and AI governance programs. She also advises clients on matters relating to children’s privacy and policy initiatives relating to online safety.

Jadzia Pierce

Jadzia Pierce advises clients developing and deploying technology on a range of regulatory matters, including the intersection of AI governance and data protection. Jadzia draws on her experience in senior in-house leadership roles and extensive, hands-on engagement with regulators worldwide. Prior to rejoining Covington in 2026, Jadzia served as Global Data Protection Officer at Microsoft, where she oversaw and advised on the company’s GDPR/UK GDPR program and acted as a primary point of contact for supervisory authorities on matters including AI, children’s data, advertising, and data subject rights.

Jadzia previously was Director of Microsoft’s Global Privacy Policy function and served as Associate General Counsel for Cybersecurity at McKinsey & Company. She began her career at Covington, advising Fortune 100 companies on privacy, cybersecurity, incident preparedness and response, investigations, and data-driven transactions.

At Covington, Jadzia helps clients operationalize defensible, scalable approaches to AI-enabled products and services, aligning privacy and security obligations with rapidly evolving regulatory frameworks across jurisdictions—with a particular focus on anticipating enforcement trends and navigating inter-regulator dynamics.

Kristof Van Quathem

Kristof Van Quathem advises clients on information technology matters and policy, with a focus on data protection, cybercrime and various EU data-related initiatives, such as the Data Act, the AI Act and EHDS.

Kristof has been specializing in this area for over twenty years and developed particular experience in the life science and information technology sectors. He counsels clients on government affairs strategies concerning EU lawmaking and their compliance with applicable regulatory frameworks, and has represented clients in non-contentious and contentious matters before data protection authorities, national courts and the Court of Justice of the EU.

Kristof is admitted to practice in Belgium.