Personalized advertising and pricing are increasingly common online practices, prompting discussions about fairness and consumer rights in the EU. This post examines how these practices are regulated under EU consumer protection law and what we anticipate from the forthcoming Digital Fairness Act (DFA). We also consider how data protection rules, such as the GDPR, interact with consumer protection laws.
This is the third post in our series on the DFA—a draft EU law currently being prepared by the European Commission and expected to be published in mid-2026. Previous posts covered influencer marketing and AI chatbots in consumer interactions.
EU Reports Highlight Risks of Personalized Ads and Pricing
The European Commission’s Fitness Check report, which lays the groundwork for the upcoming DFA, found that personalized advertising and pricing—where ads or prices are tailored to individuals based on their personal data—can be problematic, especially when companies lack transparency or use data to exploit consumer vulnerabilities.
The EU’s Legal Framework for Personalized Advertising and Pricing
The EU regulates personalized advertising and pricing through a range of legal instruments designed to ensure transparency, fairness, and respect for individual rights. Consumer protection laws, in particular, play a central role in preventing unfair, misleading, or manipulative practices. Broader data protection, privacy, and digital market rules—such as the GDPR, the e-Privacy Directive, the Digital Services Act (DSA), and the Digital Markets Act (DMA)—further restrict personalization in digital services.
Here is an overview of the main legal instruments:
- Consumer Rights Directive (CRD). The CRD requires traders to inform consumers if a price has been personalized based on automated decision-making. This obligation aims to enhance transparency and support informed consumer choices. However, businesses are not required to disclose the specific parameters or criteria used. In other words, consumers must be told that pricing is personalized, but the CRD does not grant them access to details on how or why the price was personalized.
- Unfair Commercial Practices Directive (UCPD). The UCPD is the EU’s primary tool for addressing unfair, misleading, or aggressive commercial practices. It prohibits traders from exploiting consumer vulnerabilities or distorting consumer decision-making, and it applies to both general and targeted marketing. Traders must provide clear and meaningful information about personalized pricing and advertising to ensure transparency and prevent deception. The UCPD covers manipulative or misleading personalization techniques—such as profiling or “dark patterns”—that impair consumers’ informed choices or unfairly exploit vulnerabilities. However, it does not explicitly define “dark patterns,” which has created legal uncertainty as to which practices are prohibited.
- Digital Services Act. The DSA regulates personalized advertising and recommender systems on intermediary services (e.g., online platforms such as social media and e-commerce sites). It prohibits ads targeted at minors based on profiling, as well as ads based on profiling that uses sensitive personal data. Platforms must clearly identify personalized ads, disclose advertiser identities and targeting parameters, and explain the main parameters of their recommender systems, while allowing users to adjust settings. The DSA also bans manipulative “dark patterns” that impair users’ informed choices. Very large online platforms and search engines must maintain advertising repositories and offer at least one recommender system option that is not based on profiling.
- Digital Markets Act. The DMA imposes obligations on designated “gatekeepers”—large online platforms providing core platform services—that restrict their use of personal data for targeted advertising unless users have given consent. The DMA also prohibits unfair practices such as self-preferencing (favoring the gatekeeper’s own products or services over those of competitors) and combining personal data across services without explicit user consent.
- e-Privacy Directive. This directive strictly regulates privacy in electronic communications, primarily by requiring explicit, informed user or subscriber consent before cookies and other tracking technologies (including those used for personalization) are placed or accessed, while exempting those strictly necessary for website functionality.
- General Data Protection Regulation. The GDPR regulates the processing of personal data for targeted advertising, requiring a valid legal basis under Article 6—typically consent, though legitimate interests may apply depending on the context and safeguards. The GDPR also grants data subjects the right to withdraw consent at any time, and the right to object at any time to the processing of their personal data for direct marketing purposes, including profiling to the extent it relates to such marketing. Moreover, the GDPR mandates informing data subjects about automated decision-making, including profiling, with meaningful, clear, and intelligible explanations of the logic involved and the significance and consequences of such processing, but only where such decisions produce legal effects or similarly significant effects on the individual. In some cases, this also imposes a requirement to obtain consent (see our previous blog post).
In addition to the above, EU anti-discrimination laws, grounded in the EU Charter of Fundamental Rights and directives such as the Racial Equality Directive and the Gender Goods and Services Directive, prohibit using sensitive personal characteristics—like race, gender, ethnicity, religion, disability, or sexual orientation—in pricing decisions.
Potential Regulatory Approaches of the Digital Fairness Act
The recent EU Fitness Check suggests that concerns about personalized advertising and pricing persist. While we cannot predict how the DFA will ultimately address these concerns, we expect it to complement the existing regulatory framework by introducing stronger rules to prevent manipulative and unfair commercial practices in digital markets, including those leveraging AI and behavioral profiling. Based on the Fitness Check report, the DFA could, for example:
- give consumers the ability to opt out of personalized ads or offers, in addition to any requirements to obtain consent in certain contexts under existing laws such as the GDPR and the e-Privacy Directive;
- ban ads targeting minors or using sensitive or vulnerability-related data, in a manner similar to the DSA, but extended to a broader range of online services;
- add to the UCPD blacklist commercial practices that use psychographic profiling or exploit personal vulnerabilities, such as those of users experiencing emotional distress, exhaustion, grief, sorrow, or physical pain, or who are under the influence of medication; and
- restrict price personalization, especially when it is aimed at children or vulnerable consumers or based on behavioral predictions (e.g., willingness to pay), except where such pricing is fully transparent, limited to strictly necessary data, or based on verifiable demographic characteristics.
These proposals reflect concerns that existing rules, including the GDPR and DSA, do not fully cover the breadth of consumer risks associated with personalized advertising and pricing practices.
* * *
Covington & Burling regularly advises companies on all aspects of EU consumer law and has deep expertise at the intersection of consumer protection law with other key EU regulatory frameworks, including the GDPR and the DSA. We closely monitor developments in new consumer legislation, such as the forthcoming DFA, and provide comprehensive guidance to help clients navigate the complex regulatory landscape. We look forward to assisting you with your EU consumer compliance needs.