Personalized advertising and pricing are increasingly common online practices, prompting discussions about fairness and consumer rights in the EU.  This post examines how these practices are regulated under EU consumer protection law, and what we anticipate from the forthcoming Digital Fairness Act (DFA).  We also consider how data protection rules—such as the GDPR—interact with consumer protection laws.

This is the third post in our series on the DFA—a draft EU law currently being prepared by the European Commission and expected to be published in mid-2026.  Previous posts covered influencer marketing and AI chatbots in consumer interactions.

EU Reports Highlight Risks of Personalized Ads and Pricing

The European Commission’s Fitness Check report, which lays the groundwork for the upcoming DFA, found that personalized advertising and pricing—where ads or prices are tailored to individuals based on their personal data—can be problematic, especially when companies lack transparency or use data to exploit consumer vulnerabilities.

The EU’s Legal Framework for Personalized Advertising and Pricing

The EU regulates personalized advertising and pricing through a range of regulations designed to ensure transparency, fairness, and respect for individual rights.  Consumer protection laws, in particular, play a central role in preventing unfair, misleading, or manipulative practices.  Broader data protection, privacy, and digital market regulations—such as the GDPR, e-Privacy Directive, Digital Services Act (DSA), and Digital Markets Act (DMA)—further restrict personalization in digital services.

Here is an overview of the main legal instruments:

  • Consumer Rights Directive (CRD).  The CRD requires traders to inform consumers if a price has been personalized based on automated decision-making.  This obligation aims to enhance transparency and support informed consumer choices.  However, businesses are not required to disclose the specific parameters or criteria used.  In other words, consumers must be told that pricing is personalized, but the CRD does not grant them access to details on how or why the price was personalized.
  • Unfair Commercial Practices Directive (UCPD).  The UCPD is the EU’s primary tool for addressing unfair, misleading, or aggressive commercial practices.  It prohibits traders from exploiting consumer vulnerabilities or distorting consumer decision-making and applies to both general and targeted marketing.  Traders must provide clear and meaningful information about personalized pricing and advertising to ensure transparency and prevent deception.  The UCPD covers manipulative or misleading personalization techniques—such as profiling or “dark patterns”—that impair consumers’ informed choices or unfairly exploit vulnerabilities.  However, it does not explicitly define “dark patterns,” which has resulted in legal uncertainty about which practices are prohibited.
  • Digital Services Act.  The DSA regulates personalized advertising and recommender systems on intermediary services (e.g., online platforms, like social media and e-commerce sites), prohibiting targeted ads based on profiling of minors or on sensitive personal data.  Platforms must clearly identify personalized ads, disclose advertiser identities and targeting parameters, and explain key factors behind their recommender systems, while allowing users to adjust settings.  The DSA also bans manipulative “dark patterns” that impair users’ informed choices.  Very Large Online Platforms / Search Engines must maintain advertising repositories and offer at least one non-profiling recommender system option.
  • Digital Markets Act.  The DMA imposes obligations on designated “gatekeepers”—large online platforms providing core platform services—to limit their use of personal data for targeted advertising without obtaining user consent.  The DMA prohibits unfair practices such as self-preferencing (favoring the gatekeeper’s own products or services over competitors’) and combining personal data across services without explicit user consent.
  • e-Privacy Directive.  This directive strictly regulates privacy in electronic communications, primarily by requiring explicit, informed user or subscriber consent for cookies and other tracking technologies used in personalization (among other things), while exempting those strictly necessary for website functionality.
  • General Data Protection Regulation.  The GDPR regulates the processing of personal data for targeted advertising, requiring a valid legal basis under Article 6—typically consent, though legitimate interests may apply depending on the context and safeguards.  The GDPR also grants data subjects the right to withdraw consent at any time, and the right to object at any time to the processing of their personal data for direct marketing purposes, including profiling to the extent it relates to such marketing.  Moreover, the GDPR mandates informing data subjects about automated decision-making, including profiling, with meaningful, clear, and intelligible explanations of the logic involved and the significance and consequences of such processing, but only where such decisions produce legal effects or similarly significant effects on the individual.  In some cases, this also imposes a requirement to obtain consent (see our previous blog post).

In addition to the above, EU anti-discrimination laws, grounded in the EU Charter of Fundamental Rights and directives such as the Racial Equality Directive and the Gender Goods and Services Directive, prohibit using sensitive personal characteristics—like race, gender, ethnicity, religion, disability, or sexual orientation—in pricing decisions.

Potential Regulatory Approaches of the Digital Fairness Act

The recent EU Fitness Check suggests that there remain lingering concerns about personalized advertising and pricing.  While we cannot predict how the DFA will ultimately address these concerns, we anticipate it will complement the existing regulatory framework by introducing stronger rules to prevent manipulative and unfair commercial practices in digital markets, including those leveraging AI and behavioral profiling.  Based on the Fitness Check report, the DFA could, for example:

  • give consumers the ability to opt out of personalized ads or offers, in addition to any requirements to obtain consent in certain contexts under existing laws such as the GDPR and the ePrivacy Directive;
  • ban ads targeting minors or using sensitive or vulnerability-related data, in a manner similar to the DSA, but extended to a broader range of online services;
  • add to the UCPD blacklist those commercial practices that use psychographic profiling or exploit personal vulnerabilities, including users suffering from emotional distress, exhaustion, grief, sorrow, physical pain, or under the influence of medication; and
  • restrict price personalization—especially when aimed at children or vulnerable consumers, or when based on behavioral predictions (e.g., willingness to pay)—except where such pricing is fully transparent, limited to strictly necessary data, or based on verifiable demographic characteristics.

These proposals reflect concerns that existing rules, including the GDPR and DSA, do not fully cover the breadth of consumer risks associated with personalized advertising and pricing practices.

*             *             *

Covington & Burling regularly advises companies on all aspects of EU consumer law and has deep expertise at the intersection of consumer protection law with other key EU regulatory frameworks, including the GDPR and the DSA.  We closely monitor developments in new consumer legislation, such as the forthcoming DFA, and provide comprehensive guidance to help clients navigate the complex regulatory landscape.  We look forward to assisting you with your EU consumer compliance needs.

Kristof Van Quathem

Kristof Van Quathem advises clients on information technology matters and policy, with a focus on data protection, cybercrime and various EU data-related initiatives, such as the Data Act, the AI Act and EHDS.

Kristof has been specializing in this area for over twenty years and developed particular experience in the life science and information technology sectors. He counsels clients on government affairs strategies concerning EU lawmaking and their compliance with applicable regulatory frameworks, and has represented clients in non-contentious and contentious matters before data protection authorities, national courts and the Court of the Justice of the EU.

Kristof is admitted to practice in Belgium.

Dan Cooper

Daniel Cooper is co-chair of Covington’s Data Privacy and Cyber Security Practice, and advises clients on information technology regulatory and policy issues, particularly data protection, consumer protection, AI, and data security matters. He has over 20 years of experience in the field, representing clients in regulatory proceedings before privacy authorities in Europe and counseling them on their global compliance and government affairs strategies. Dan regularly lectures on the topic, and was instrumental in drafting the privacy standards applied in professional sport.

According to Chambers UK, his “level of expertise is second to none, but it’s also equally paired with a keen understanding of our business and direction.” It was noted that “he is very good at calibrating and helping to gauge risk.”

Dan is qualified to practice law in the United States, the United Kingdom, Ireland and Belgium. He has also been appointed to the advisory and expert boards of privacy NGOs and agencies, such as the IAPP’s European Advisory Board, Privacy International and the European security agency, ENISA.

Anna Sophia Oberschelp de Meneses

I assist companies in navigating EU laws on technology, with a focus on data protection, cybersecurity, and consumer protection. My goal is to make complex regulations, such as the GDPR, AI Act, Unfair Commercial Practices Directive, and Digital Services Act, more accessible and relevant to everyday business operations.

Regarding data protection and privacy, I guide businesses on GDPR, ePrivacy Directive, and EU marketing laws, covering topics like international data transfers and privacy-focused marketing. Regarding cybersecurity, I help with risk assessments, incident response planning, and staying informed about regulations such as NIS2 and the Cyber Resilience Act. Regarding consumer protection, I assist companies in ensuring their terms are enforceable, their online platforms clearly provide required information, and their practices comply with rules against banned commercial activities.

Fluent in several languages and experienced in international contexts, I am committed to integrating compliance smoothly into business operations, enabling companies to succeed in the dynamic digital environment.

Ryoko Matsumoto

Ryoko Matsumoto is a global visiting lawyer who attended Kyoto University, Kyoto University Law School, and Stanford Law School.