Introduction
In recent years, the landscape of online advertising has been undergoing one of its most significant transformations. Driven by growing public concern over privacy, misuse of personal data, and the opaque mechanisms of ad targeting, governments around the world have introduced or strengthened regulations to protect the rights of individuals. Among the most prominent of these is the European Union’s Digital Services Act (DSA), working alongside established frameworks like the General Data Protection Regulation (GDPR). These laws are reshaping what is permissible in ad targeting—and the ripple effects are being felt globally.
What the New Laws Require
The DSA builds on GDPR and other EU data protection norms by imposing more specific obligations on platforms and advertisers. Some of its key provisions include:
- Prohibition of targeting based on sensitive personal data: The DSA forbids using “special categories” of personal data for profiling in ad targeting, including data on race, political opinions, religious beliefs, sexual orientation, and health.
- Restrictions on targeting minors: Targeting minors based on profiling is disallowed; platforms must give children and adolescents extra protection from online advertising.
- Transparency obligations: Platforms must provide clear information about why a user sees a given ad, who paid for it, and how targeting works, and must maintain ad repositories (for example, what campaigns are running and how they are targeted).
- User control: Users must get more tools and clearer choices over personalization and targeting, and consent must be meaningful and informed.
Immediate and Direct Impacts
These regulatory changes are having direct consequences for platforms, advertisers, and users:
- Changes in targeting capabilities: Advertising platforms must remove certain targeting options. For example, LinkedIn discontinued the ability for advertisers to target users in the European Economic Area (EEA) by LinkedIn Group membership, because group membership can reveal sensitive personal data.
- Reduced use of sensitive categories: Where advertisers could previously leverage interests, behaviors, or group memberships that implicitly or explicitly tied to sensitive attributes, those options are now restricted or outlawed, forcing many campaign strategies to be rethought.
- Greater compliance costs and operational adjustments: Platforms must audit their ad tools, alter user interfaces (e.g. consent flows), build or maintain ad repositories, enhance transparency features, and risk sanctions if noncompliant.
Broader and Global Effects
Although these laws are European, the impact goes well beyond EU borders. In many cases, companies operating globally find it simpler to apply EU‑type restrictions across all markets rather than maintain different rules per jurisdiction. Some consequences include:
- Standardization of privacy‑aware ad practices: What began as compliance in the EU is becoming a reference or benchmark in other regions. Advertisers outside Europe are watching, legally and reputationally, how their peers adjust, and may adopt similar practices to avoid cross‑border risk or backlash.
- Shifts in data collection and usage: With stricter limits on profiling based on sensitive data, there is greater emphasis on non‑sensitive data, first‑party data (data directly collected from users), contextual advertising (ads aligned with content context rather than user profile), and probabilistic rather than deterministic targeting.
- Innovation in privacy‑preserving technologies: To stay effective, ad tech is investing in tools that minimize or avoid personal data exposure—techniques such as anonymization or pseudonymization, privacy‑preserving measurement, differential privacy, on‑device targeting, or federated learning (see the sketch after this list).
- Revised cross‑border compliance and legal risk: Companies must navigate a patchwork of laws—GDPR, DSA, and others (e.g. in California, India, Brazil). Violations can lead to substantial fines, reputational losses, and broader regulatory scrutiny. For example, under the DSA some of the biggest platforms are classified as Very Large Online Platforms (VLOPs), which brings more stringent obligations.
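To make the differential‑privacy technique mentioned above concrete, here is a minimal TypeScript sketch of the Laplace mechanism applied to an aggregate ad‑measurement count. The function names and numbers are illustrative, not taken from any real library.

```ts
// Draw one sample from a Laplace(0, scale) distribution via the inverse CDF.
function sampleLaplace(scale: number): number {
  const u = Math.random() - 0.5; // uniform on (-0.5, 0.5)
  return -scale * Math.sign(u) * Math.log(1 - 2 * Math.abs(u));
}

// Report a count with epsilon-differential privacy. A counting query has
// sensitivity 1 (adding or removing one user changes it by at most 1),
// so Laplace noise with scale 1/epsilon suffices.
function privateCount(trueCount: number, epsilon: number): number {
  return trueCount + sampleLaplace(1 / epsilon);
}

// Example: report roughly how many users saw an ad without exposing any
// individual's presence in the data (illustrative numbers).
console.log(privateCount(1042, 0.5));
```

The trade‑off is explicit in the parameter: a smaller epsilon means stronger privacy but noisier reported counts.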
Trials and Trade‑offs
While many view the new rules as necessary to protect individual rights and societal values, there are also trade‑offs and challenges:
- Reduced precision vs. effectiveness: Restricting profiling and removing sensitive data may reduce how precisely ads can target specific audiences, potentially increasing costs for advertisers or reducing ad effectiveness.
- Increased complexity and compliance burdens: Especially for smaller players, integrating the legal and operational changes (consent screens, transparency, audits) can be expensive and technically complex.
- Potential for uneven enforcement: Laws on the books do not always translate to enforcement. Differences in regulatory capacity, priorities, and legal interpretations can lead to uneven outcomes.
- Risk of regulatory arbitrage: Some companies might try to base operations in jurisdictions with less stringent laws or shift targeting across borders, risking legal conflicts or ethical issues.
Outlook
As data privacy laws like the DSA come fully into force, the global ad‑tech ecosystem is likely to continue evolving rapidly. Advertisers and platforms that adapt proactively—prioritizing user trust, privacy, and compliance—may gain a competitive advantage. Meanwhile, regulators in other jurisdictions are likely to introduce similar or variant protections, pushing toward a more privacy‑centric model of ad targeting worldwide.
In sum, the rise of robust privacy regulation represents a paradigm shift for global ad targeting: one that aligns legal obligations, ethical norms, and user expectations. Those who can navigate these shifts well stand to lead in the next era of digital advertising.
Definitions & Key Concepts
What is Data Privacy?
Data privacy, also known as information privacy, refers to the proper handling, processing, storage, and use of personal information. At its core, data privacy centers on individuals’ rights to control their personal data—information that can identify them directly or indirectly—such as names, addresses, email addresses, phone numbers, IP addresses, and even behavioral data.
The principle behind data privacy is simple: individuals should have control over who collects their data, how it is used, and whom it is shared with. With the exponential growth of digital technologies, vast amounts of data are collected through websites, apps, social media, and connected devices, often without explicit consent. This makes data privacy a fundamental concern in today’s digital landscape.
Data privacy laws and regulations aim to safeguard individuals from misuse or unauthorized access to their data. Some of the most influential frameworks include the General Data Protection Regulation (GDPR) in the European Union and the California Consumer Privacy Act (CCPA) in the United States. These regulations establish rules for data collection, user consent, data storage, data access, and the right to be forgotten.
Data privacy is not just a legal concern but also an ethical one. Companies are expected to handle user data transparently and securely. Poor data privacy practices can lead to data breaches, identity theft, financial fraud, and loss of trust in digital services.
As data becomes increasingly valuable, protecting privacy is more complex yet more important than ever. Data privacy intersects with many other areas like cybersecurity, data governance, and digital rights, reflecting its central role in the responsible use of digital technologies.
What is Ad Targeting / Behavioral Advertising / Profiling?
Ad targeting, behavioral advertising, and profiling refer to data-driven practices used by advertisers and platforms to serve personalized ads to users based on their behaviors, interests, and characteristics.
Ad targeting is the broader process of identifying and reaching specific user segments with tailored advertisements. Instead of showing the same ad to everyone, companies use data to reach audiences who are most likely to be interested in a product or service. This is done using demographics (age, gender, location), interests, browsing behavior, or even purchase history.
Behavioral advertising is a more specific form of ad targeting that relies on tracking users’ online activities over time. By analyzing which websites someone visits, what they search for, and how they interact with content, companies can build a detailed profile of that user’s interests. These insights help advertisers deliver more relevant ads, which can increase click-through rates and conversions.
Profiling involves collecting and analyzing personal data to assess or predict aspects of an individual’s behavior, preferences, or lifestyle. This is often automated and can include creating “personas” or assigning users to segments, such as “young professional tech enthusiasts” or “budget-conscious parents.” Profiling plays a major role not only in advertising but also in areas like credit scoring, insurance, and risk management.
While these practices can improve user experience by making ads more relevant, they also raise privacy concerns. Users are often unaware of the extent of data collection and how their behavior is being monitored. Some platforms collect data even when users are not actively interacting with them, such as through third-party cookies or embedded scripts across websites.
Critics argue that these techniques can lead to data exploitation, manipulation, and loss of autonomy, especially when sensitive attributes like race, religion, or health status are inferred. There’s also concern about “filter bubbles,” where users are only shown information that reinforces their existing views, affecting broader societal issues like democracy and public discourse.
Regulations such as the GDPR require that users give informed consent before data is collected for profiling or behavioral advertising. However, implementation varies widely, and many companies still operate in legally or ethically gray areas.
Relevant Technical Terms
To fully understand data privacy and ad targeting, it’s important to grasp several key technical terms:
Tracking
Tracking refers to the methods used to monitor and record users’ activities across websites, apps, and devices. It can occur through various means—cookies, device fingerprinting, or pixels—and is the backbone of behavioral advertising. Trackers collect data on page visits, clicks, dwell time, scrolling behavior, and more, often invisibly.
Tracking can be first-party (by the website you’re visiting) or third-party (by external entities embedded in the site). Third-party tracking is more controversial due to its cross-site surveillance nature.
Cookies
Cookies are small text files stored on a user’s device by a website. They are used for a range of purposes—such as keeping users logged in, remembering preferences, and tracking behavior.
There are two main types:
- Session cookies: Temporary and deleted after the session ends.
- Persistent cookies: Remain on the device and can be used for tracking over time.
Third-party cookies (set by a different domain than the one visited) are commonly used for advertising and tracking. Due to privacy concerns, browsers like Safari and Firefox have restricted their use, and Google has repeatedly delayed and scaled back its plans to phase them out of Chrome.
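As an illustration of the session/persistent distinction, here is a minimal browser‑side TypeScript sketch; the cookie names and values are hypothetical.

```ts
// Session cookie: no Expires/Max-Age attribute, so the browser
// discards it when the browsing session ends.
document.cookie = "session_pref=dark_mode; Path=/; SameSite=Lax";

// Persistent cookie: Max-Age keeps it on the device (here ~30 days),
// which is what makes tracking over time possible.
document.cookie =
  "visitor_id=abc123; Max-Age=" + 60 * 60 * 24 * 30 + "; Path=/; SameSite=Lax";

// Note: a third-party cookie cannot be set this way by the visited page;
// it is set by an embedded resource (e.g. an ad iframe) from another domain,
// which is exactly the behavior modern browsers increasingly restrict.
```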
Fingerprinting
Fingerprinting is a technique that identifies users based on unique characteristics of their device or browser. This includes screen resolution, operating system, installed fonts, time zone, and more. Unlike cookies, which can be deleted, fingerprinting is harder to detect or block.
It allows trackers to recognize and follow users across the web without their consent. Due to its covert nature, fingerprinting is often seen as invasive and contrary to privacy-by-design principles.
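The following TypeScript sketch shows, under simplifying assumptions, the kind of browser signals a fingerprinting script might combine; real scripts gather far more (canvas rendering, audio stack, installed fonts), and the function name is illustrative.

```ts
// Collect a few stable, widely available browser attributes.
function collectFingerprintSignals(): Record<string, string> {
  return {
    userAgent: navigator.userAgent,
    language: navigator.language,
    screen: `${screen.width}x${screen.height}x${screen.colorDepth}`,
    timezone: Intl.DateTimeFormat().resolvedOptions().timeZone,
    cores: String(navigator.hardwareConcurrency),
  };
}

// Serializing (and typically hashing) the combined signals yields a
// quasi-identifier that, unlike a cookie, survives deletion of site data --
// which is why fingerprinting is treated as tracking that still needs
// a lawful basis or consent.
const fingerprint = JSON.stringify(collectFingerprintSignals());
```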
Consent
In the context of data privacy, consent refers to a user’s freely given, informed, and unambiguous agreement to allow the collection and processing of their personal data.
Under regulations like GDPR, consent must be:
- Explicit (no pre-ticked boxes)
- Granular (users can choose specific types of data processing)
- Revocable (users can withdraw consent at any time)
Many websites now use cookie consent banners, but critics argue that these often employ dark patterns—design choices that trick users into agreeing without full understanding.
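A minimal sketch of how these requirements might translate into a consent data structure, assuming a simplified, hypothetical consent‑management setup in TypeScript:

```ts
type Purpose = "necessary" | "analytics" | "personalization" | "advertising";

interface ConsentRecord {
  purposes: Record<Purpose, boolean>; // granular: one choice per purpose
  timestamp: string;                  // when the choice was made (for audit)
}

// GDPR-style defaults: nothing pre-ticked except strictly necessary.
const consent: ConsentRecord = {
  purposes: { necessary: true, analytics: false, personalization: false, advertising: false },
  timestamp: new Date().toISOString(),
};

// Revocable: withdrawing consent must be as easy as giving it.
function withdraw(record: ConsentRecord, purpose: Purpose): void {
  record.purposes[purpose] = false;
}

// Gate tracking code on the recorded choice, never on a default.
if (consent.purposes.advertising) {
  // loadAdTracker(); // hypothetical function, runs only after opt-in
}
```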
Personal Data
Personal data (or personally identifiable information – PII) is any information that can identify an individual. This includes:
- Direct identifiers: Name, email, ID number
- Indirect identifiers: IP address, location data, behavior patterns
Under GDPR, even pseudonymized data (data that has been separated from direct identifiers but can still be re-linked) is considered personal data.
Personal data is at the core of most privacy debates because it is used extensively in profiling, behavioral advertising, and digital surveillance. Protecting it is essential to maintaining user autonomy, safety, and trust.
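To illustrate why pseudonymized data can still be re‑linked, and so remains personal data under GDPR, here is a minimal Node.js/TypeScript sketch using a keyed hash; the key and email address are hypothetical.

```ts
import { createHmac } from "node:crypto";

// Replace the direct identifier (an email) with a keyed hash. Whoever holds
// the key (or a lookup table) can re-link the pseudonym to the person,
// which is why GDPR still treats such data as personal data.
function pseudonymize(email: string, secretKey: string): string {
  return createHmac("sha256", secretKey).update(email.toLowerCase()).digest("hex");
}

// The same input and key always yield the same pseudonym, so records can
// still be joined across datasets; only true anonymization breaks that link.
const pseudonym = pseudonymize("alice@example.com", "server-side-secret");
```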
History of Data Privacy Regulation
1. Early Privacy Laws (Pre-Internet & Early Internet Era)
The concept of data privacy predates the internet, emerging alongside the development of bureaucratic states, mass communication technologies, and early computing systems. The foundation of modern privacy laws is deeply rooted in concerns about surveillance, personal autonomy, and the misuse of personal information by both governments and corporations.
One of the earliest legal articulations of a right to privacy came in 1890, when American lawyers Samuel Warren and Louis Brandeis published their seminal article, “The Right to Privacy”, in the Harvard Law Review. They argued for the individual’s “right to be let alone,” primarily in response to invasive journalism and new photographic technology.
As computing advanced in the mid-20th century, so too did concerns about centralized databases and automation. In the 1970s, several countries began enacting laws to regulate how personal data was collected and used, particularly in government databases.
Key early developments include:
- United States (1974): The Privacy Act of 1974 was passed in response to growing fears over the use of government computer databases. It established rules for federal agencies regarding the collection, maintenance, and dissemination of personal data, and granted individuals the right to access and correct information held about them.
- Germany (1970): The German state of Hesse introduced the first data protection law in the world. It regulated how public authorities handled personal data, setting a precedent for future legislation in Europe.
- OECD Guidelines (1980): The Organization for Economic Cooperation and Development published guidelines for the protection of privacy and transborder data flows. These principles—such as purpose limitation, data quality, and user participation—would later influence international privacy frameworks.
As the internet began to develop in the late 1980s and early 1990s, there was little in the way of comprehensive regulation. Early online services, such as AOL and CompuServe, collected user data with few constraints. Privacy policies were either non-existent or extremely vague. However, the seeds of regulation had been planted, and the rapid growth of the internet would soon force governments to take stronger action.
2. Rise of Digital Advertising & Early Regulatory Responses
The rise of the internet in the 1990s transformed how information was accessed, shared, and monetized. With the explosion of websites, search engines, and online commerce came a new economic model: digital advertising, fueled by the collection and analysis of personal data.
Companies like DoubleClick (founded in 1996) pioneered the use of cookies to track users across websites, allowing advertisers to target individuals with increasing precision. This marked the beginning of behavioral advertising, where data about users’ online habits was used to predict and influence their purchasing decisions. These practices raised early concerns about user consent, transparency, and profiling.
The regulatory response during this period was cautious, often lagging behind technological developments. Still, some significant initiatives emerged:
- Children’s Online Privacy Protection Act (COPPA) – 1998 (USA): One of the first U.S. laws to address online data collection. It prohibits websites from collecting personal data from children under 13 without verifiable parental consent.
- EU Data Protection Directive (Directive 95/46/EC) – 1995: A landmark in European privacy regulation, the Directive established basic principles for data protection across EU member states. It introduced the concept of “personal data” and emphasized the need for lawful processing, purpose limitation, and data subject rights. While it required transposition into national laws, it set the groundwork for a more harmonized European approach.
- Privacy Policies: By the late 1990s and early 2000s, companies began to publish privacy policies as a means of self-regulation. However, these documents were often opaque, overly legalistic, and failed to offer meaningful choice to users.
- FTC Interventions (USA): The U.S. Federal Trade Commission began investigating companies for deceptive data practices. Although the U.S. lacked a comprehensive privacy law, the FTC used its authority to regulate “unfair and deceptive practices” to hold companies accountable.
Despite these efforts, the growth of ad tech, social media, and data brokers continued largely unchecked. Vast amounts of data were being collected, shared, and sold without user knowledge or control. These trends would ultimately spark a regulatory backlash in the following decade, culminating in some of the most significant privacy laws ever enacted.
3. Key Turning Points (e.g., EU Data Protection Directive, GDPR)
As the internet matured and the data economy expanded, public concern over privacy intensified. High-profile data breaches, the increasing sophistication of tracking technologies, and revelations of mass government surveillance (e.g., Edward Snowden’s 2013 disclosures) created growing demand for stronger protections. These pressures led to several pivotal moments in the history of data privacy regulation.
EU General Data Protection Regulation (GDPR) – 2016 (Enforced in 2018)
The GDPR is the most influential and comprehensive privacy regulation to date. It replaced the EU Data Protection Directive (1995) with a directly applicable regulation that harmonized data protection laws across all EU member states.
Key features of GDPR include:
- Expanded definition of personal data: Includes IP addresses, location data, and behavioral information.
- Stronger consent requirements: Must be specific, informed, freely given, and revocable.
- Data subject rights: Including the right to access, right to erasure (right to be forgotten), and right to data portability.
- Accountability: Organizations must demonstrate compliance through documentation and risk assessments.
- Data Protection Officers (DPOs): Required for certain organizations.
- Severe penalties: Fines of up to 4% of global annual revenue or €20 million, whichever is higher.
GDPR also introduced the concept of “privacy by design and by default”, requiring that data protection be embedded into technologies from the outset.
Other Key Turning Points
- California Consumer Privacy Act (CCPA) – 2018: The first broad consumer privacy law in the United States. It gives California residents the right to know what data is collected about them, to request deletion, and to opt out of data sales. It was later expanded by the California Privacy Rights Act (CPRA) in 2020.
- Schrems I and II Cases: Legal challenges by privacy activist Max Schrems led the European Court of Justice to invalidate two major EU-U.S. data transfer agreements—Safe Harbor (2015) and Privacy Shield (2020)—due to inadequate protection from U.S. surveillance practices. These rulings reshaped how international data flows are handled.
- Brazil’s LGPD (Lei Geral de Proteção de Dados) – 2020: Inspired by GDPR, Brazil’s privacy law extended data protection rights to millions of users in Latin America.
- Apple’s App Tracking Transparency (2021): A major shift in platform-level privacy, requiring iOS apps to obtain explicit user permission before tracking. This move disrupted the digital advertising industry and elevated public awareness about tracking.
Evolution Toward New Laws: From GDPR to DSA and Others
1. GDPR’s Role & Limitations in Ad Targeting Regulation
The General Data Protection Regulation (GDPR), enforced in 2018, marked a watershed moment in global data privacy. It redefined how organizations across the European Union—and globally—handle personal data, including that used in ad targeting and behavioral advertising.
GDPR set strict rules on the collection, processing, and sharing of personal data, and established key principles such as lawfulness, transparency, purpose limitation, and data minimization. It gave individuals strong rights over their data, including the right to access, rectify, erase, and object to data processing.
In the context of ad targeting, GDPR made explicit consent a requirement when using personal data for profiling or personalized ads. Consent must be freely given, specific, informed, and unambiguous. Pre-ticked boxes, vague privacy policies, or bundling consent with other terms are not permitted under GDPR.
Additionally, GDPR requires companies to document their legal basis for data processing and conduct Data Protection Impact Assessments (DPIAs) for high-risk activities—like large-scale behavioral profiling.
GDPR’s Impact on Ad Tech
The regulation exposed the opaque nature of the programmatic advertising ecosystem. Platforms like Google and Facebook, along with thousands of third-party ad tech firms, had been relying on real-time bidding (RTB) and broad data sharing across networks. GDPR forced companies to reassess their practices, leading to updated privacy notices, consent banners, and compliance strategies.
Several complaints and legal challenges followed, especially from privacy advocacy groups like NOYB (None of Your Business). For example, the IAB Europe’s Transparency and Consent Framework (TCF)—a key industry standard for managing ad consent—was found by regulators to violate GDPR principles, particularly around transparency and user control.
Limitations of GDPR in Ad Targeting
Despite its strong framework, GDPR has faced several limitations in effectively regulating ad targeting:
- Enforcement delays: Many cases take years to resolve. Large tech firms often challenge rulings, exploiting legal ambiguities and differences between EU member states’ enforcement bodies.
- Resource gaps: Data Protection Authorities (DPAs) are often underfunded and overwhelmed, limiting their ability to investigate and enforce rules, especially against large multinationals.
- Interpretation gaps: What qualifies as “freely given consent” varies in interpretation, leading to inconsistent application across websites and apps.
- Dark patterns: Despite GDPR’s requirements, many interfaces still use manipulative designs to steer users toward giving consent, undermining genuine user choice.
Ultimately, while GDPR laid the legal foundation for regulating behavioral advertising, it became clear that sector-specific and platform-specific rules were needed to address the unique challenges of today’s digital advertising economy. This realization helped fuel the development of more targeted laws like the Digital Services Act (DSA) and Digital Markets Act (DMA).
2. Development of More Recent Laws and Proposals (e.g., DSA, DMA, UK’s Online Safety Act, US State Privacy Laws)
In the years following GDPR, regulators recognized that data protection alone is not enough to address the complex and growing influence of online platforms—particularly in areas like advertising, platform accountability, and user safety. This led to the development of new, complementary legislation across Europe, the UK, and the U.S.
Digital Services Act (DSA) – EU (Adopted 2022, Fully Enforced 2024)
The Digital Services Act is a major EU regulation aimed at increasing transparency and accountability in how online platforms operate—especially Very Large Online Platforms (VLOPs) like Meta, Google, and TikTok.
Key provisions related to ad targeting:
- Ban on targeted advertising to minors and on the use of sensitive data (e.g., religion, ethnicity, sexual orientation) for ad targeting.
- Mandatory disclosure of the logic behind ad delivery systems, including the criteria used for targeting.
- Platforms must provide users with ad repositories (public databases of ads shown), enhancing auditability.
- Users must be able to opt out of recommendation systems based on profiling.
While GDPR focuses on data protection, the DSA shifts the focus to platform responsibility and systemic risks, including the social and political impacts of algorithmic targeting.
Digital Markets Act (DMA) – EU (Effective 2023)
The DMA targets the market dominance of tech giants, known as “gatekeepers,” by imposing antitrust-style obligations.
Relevant DMA provisions include:
- Requirement for data portability across platforms.
- Restrictions on combining personal data across services (e.g., Facebook and Instagram) without explicit user consent.
- Limits on self‑preferencing—gatekeepers cannot prioritize their own ads or products over competitors.
Together, the DSA and DMA form a dual regulatory framework: DSA governs societal risks (e.g., misinformation, opaque targeting), while DMA governs economic fairness and competition.
UK’s Online Safety Act (2023)
The UK took a slightly different approach with its Online Safety Act, focusing on harmful content, platform accountability, and child safety.
While not a data protection law per se, it indirectly affects advertising and personalization:
- Platforms must assess and mitigate risks from harmful content, including manipulative advertising and exploitative targeting of minors.
- Stronger duty of care obligations on platforms hosting user‑generated content.
- Enforcement is overseen by Ofcom, with powers to fine noncompliant platforms and even block access.
The UK also continues to maintain its version of GDPR (UK GDPR), with ongoing discussions about reforming it to be more “pro-innovation.”
US State Privacy Laws
The U.S. has no federal privacy law equivalent to the GDPR. Instead, a patchwork of state-level laws has emerged in recent years:
- California Consumer Privacy Act (CCPA) / California Privacy Rights Act (CPRA): Gives Californians the right to know, access, delete, and opt out of the sale or sharing of personal data. CPRA created a dedicated enforcement agency, the California Privacy Protection Agency (CPPA).
- Virginia’s VCDPA, Colorado’s CPA, Connecticut’s CTDPA, and Utah’s UCPA: These laws offer similar rights, including data access, correction, deletion, and the right to opt out of targeted advertising.
- Most laws include definitions of “targeted advertising” and require privacy notices and data processing agreements.
While not as far-reaching as the GDPR or DSA, these laws represent a growing movement in the U.S. toward consumer data rights and algorithmic accountability.
3. Comparative Evolution Outside the EU (Asia, Latin America)
While the European Union remains a global leader in data protection, other regions are rapidly catching up—shaped by their own political, economic, and cultural contexts.
Asia-Pacific Region
Asia presents a diverse privacy landscape, with countries adopting both comprehensive and sectoral approaches. Several nations have passed or are updating data privacy laws inspired by GDPR.
- Japan: The Act on the Protection of Personal Information (APPI) was revised in 2020 to align more closely with GDPR principles. Japan has also achieved “adequacy status” with the EU, allowing smoother data transfers.
- South Korea: Known for having one of the strictest privacy laws in Asia, the Personal Information Protection Act (PIPA) includes provisions for consent, cross‑border data transfer restrictions, and data minimization.
- India: After years of debate, India passed the Digital Personal Data Protection Act (2023). It introduces individual rights over personal data, requires consent for processing, and establishes a Data Protection Board for enforcement. However, critics argue the law gives broad exemptions to the government.
- China: In 2021, China enacted the Personal Information Protection Law (PIPL)—often described as “China’s GDPR.” It imposes obligations on companies for user consent, data minimization, and cross‑border data transfer. Combined with China’s Data Security Law, the framework reflects a national security‑oriented approach.
Despite these advancements, enforcement and regulatory independence vary widely across Asia. Governments often balance privacy with competing priorities like economic growth, national security, and state control.
Latin America
Latin American countries have also made strides in privacy regulation, with several nations adopting GDPR-style frameworks.
- Brazil: The Lei Geral de Proteção de Dados (LGPD), enforced in 2020, is modeled closely after GDPR. It applies to both public and private entities and establishes rights such as access, correction, and deletion. Brazil’s ANPD (National Data Protection Authority) oversees enforcement.
- Mexico: Has a federal law on personal data protection and a dedicated regulator (INAI), although its enforcement capacity is limited.
- Argentina: Was one of the first countries in the region to adopt comprehensive data protection laws. A reform process is underway to modernize the law to align more closely with GDPR.
Throughout Latin America, GDPR has served as a reference model, although the local implementation of rights, enforcement, and regulatory independence varies.
Key Features of Major New Data Privacy Laws
1. EU Digital Services Act (DSA) – scope, obligations, enforcement
The Digital Services Act (DSA) is an EU regulation adopted to address risks arising from online services, especially large platforms, including those related to advertisement, profiling, misinformation, etc. It complements GDPR and other laws focused primarily on data protection; DSA focuses more on platform obligations, transparency, content moderation, and especially the way ads are delivered. Below are its key features as they relate to ad targeting, user protection, obligations on platforms, plus how enforcement works.
Scope
- The DSA applies to online intermediary services and platforms offering digital services in the EU. Among these, special obligations fall on Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), defined as those with more than 45 million average monthly active users in the EU. (Better Internet for Kids)
- The law covers ads, recommender systems, targeted advertising, profiling, transparency, etc., whenever these are part of the service provided by an online platform. (OUP Academic)
Obligations and Key Provisions
- Ban on targeted advertising to minors based on profiling: Article 28 of the DSA prohibits targeting ads to minors using profiling. If the platform is reasonably certain that the user is a minor, profiling-based advertising is disallowed. (Better Internet for Kids)
- Ban on targeted ads using sensitive data: Platforms cannot use personal data categories such as race, religion, sexual orientation, political beliefs etc. for profiling-based ad targeting. (Dentons)
- Transparency obligations:
- Ads must be clearly labelled; users should know when something is a commercial communication. (Digital Strategy)
- Platforms must provide information about why a particular user is being shown a particular ad, i.e., the main targeting criteria used. (Mondaq)
- Platforms should give users control over ad targeting, including options to opt out of personalized ads or recommender systems based on profiling. (LawNow)
- Repositories of advertisements: VLOPs and VLOSEs are required to maintain public repositories of all ads shown on their platforms, containing data such as the advertiser, who paid, the targeting parameters, and audience reach. These must be searchable and retained while an ad runs and for at least one year after it was last shown (a sketch of such a repository entry follows this list). (Mondaq)
- Risk assessment & mitigation: Very large platforms must assess systemic risks stemming from their services, including advertising systems, profiling, recommender algorithms. Once risks are identified, platforms must adopt “reasonable, proportionate, and effective” mitigation measures. This can include altering design of systems, limiting certain features, etc. (Mondaq)
- Protection of minors: Additional specific requirements for platforms accessible to minors under Article 28. Measures include setting accounts to private by default for minors, age verification, and modifying recommender systems to reduce harms. (Better Internet for Kids)
- Prohibition of dark patterns: UI/UX designs that mislead, coerce, or unfairly nudge users (e.g. obscured opt‑outs, pre‑ticked boxes) are prohibited. Platforms must ensure clarity and fairness in how choices are presented. (Dechert)
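As a rough illustration of the ad‑repository obligation summarized above, the following TypeScript interface sketches the kind of fields such an entry exposes; the field names are illustrative, not taken from any platform’s actual schema.

```ts
// Hypothetical shape of one entry in a DSA-style public ad repository.
interface AdRepositoryEntry {
  adContent: string;             // the creative, or a reference to it
  advertiser: string;            // on whose behalf the ad was presented
  payer: string;                 // who paid for the ad, if different
  shownFrom: string;             // ISO dates for the period the ad ran
  shownUntil: string;
  targetingParameters: string[]; // main criteria used to select recipients
  aggregateReach: number;        // total recipients, aggregate only (no user-level data)
}

// Entries must stay publicly searchable for at least a year after the ad
// was last shown, so researchers and regulators can audit targeting.
```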
Enforcement
- The DSA gives enforcement responsibilities both to the European Commission (for very large providers under EU‑level competences) and to national Digital Services Coordinators (for other types of platforms). (Digital Strategy)
- Penalties for non‑compliance can be significant: for VLOPs/VLOSEs, fines can reach up to 6% of global annual turnover for serious or systemic violations.
- The DSA also requires periodic reporting, independent audits, and transparency reports. Platforms must cooperate with oversight authorities, and national authorities can order remedial actions. (Seyfarth Shaw)
- The DSA introduces obligations for user complaint and redress mechanisms, including trusted flaggers, for content moderation and advertising issues. (Seyfarth Shaw)
Interplay with GDPR
- DSA doesn’t replace GDPR; rather, it complements and builds upon GDPR’s framework, particularly when GDPR’s general privacy requirements leave gaps in transparency, platform accountability, or specific regulation of ads, targeting, profiling. (Future of Privacy Forum)
- For example, GDPR already has rules on profiling, automated decision‑making, consent, special categories of data. DSA adds sector‑specific rules for platforms regarding minors, ad transparency, and the structure of recommender systems. (OUP Academic)
2. GDPR re‑visited (or its relevant articles) in the context of ad targeting
GDPR (General Data Protection Regulation), enforced from May 2018, remains foundational among data privacy laws; many newer laws either mirror it, build upon it, or explicitly reference it. Here are specific GDPR provisions relevant to ad targeting, profiling, automated decision‑making, sensitive data, etc., including strengths and some gaps.
Relevant GDPR Articles & Provisions
- Article 4(4) — Definition of profiling: It defines profiling as “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person … in particular to analyse or predict aspects concerning … preferences, interests, location, behaviour, etc.” This is central to understanding what counts as profiling in ad targeting. (Mondaq)
- Article 6 — Lawfulness of processing: For any processing, there must be a lawful basis (consent, legitimate interests, contractual necessity, etc.). Profiling and behavioral ad targeting must satisfy one of these. For many ad targeting practices, legitimate interest or consent are most relevant. (IAPP)
- Article 5(1) — Data processing principles: purpose limitation, data minimization, transparency, fairness. Ad targeting is often challenged under data minimization (only collecting what is necessary), and purpose limitation (using data only for declared purpose). (Mondaq)
- Article 22 — Automated individual decision‑making, including profiling: Gives data subjects a right not to be subject to decisions based solely on automated processing (including profiling) which produce legal or similarly significant effects, unless specific conditions are met (explicit consent, necessity by contract or law, etc.). This can apply if ads are used in a way that significantly affects the consumer (e.g. credit, job, insurance). But a standard ad targeting practice usually doesn’t produce “legal or similarly significant” effects. (gdprcommentary.eu)
- Article 9 — Special categories of personal data: Use of sensitive data (e.g. sexual orientation, religious belief, race, health) for profiling is especially restricted and must be avoided unless strict conditions are met; GDPR demands explicit consent and additional safeguards. (Mondaq)
- Articles 7 & 8 — Consent requirements: Consent must be freely given, specific, informed, and unambiguous, and users have the right to withdraw it. For profiling, consent is usually required unless the processing is clearly covered by legitimate interest or law. For minors, Article 8 sets an age threshold for a child’s own consent (16 by default, which member states may lower to as low as 13). (Mondaq)
- Right to information / transparency — Articles 12‑14: Data controllers must provide information to data subjects about how their data is being used, profiling, the logic behind automated decisions, etc. This includes giving meaningful information about profiling’s significance and expected consequences. (Mondaq)
Strengths & Gaps of GDPR in Context of Ad Targeting
Strengths:
- Clear legal definitions (e.g. profiling, special categories) that can be applied to ad targeting.
- Strong principles (purpose, fairness, transparency) that provide guardrails.
- Rights for individuals: to access, to object (including to profiling), to erasure, etc.
- Penalty regime and supervisory authorities able to investigate, impose sanctions.
Gaps / Limitations:
- GDPR does not always clearly define what counts as a “legal or similarly significant effect” for Article 22, so many ad targeting practices escape that threshold.
- Enforcement is patchy; DPAs differ across member states in resources, willingness, interpretation.
- Some practices (tracking via third parties, cookies, fingerprinting) can skirt GDPR via legal bases like legitimate interest, or because users “consent” unknowingly.
- Consent fatigue, dark patterns, opaque consent mechanisms reduce the real effectiveness of legally required consent or opt‑out.
In sum, GDPR gives the toolkit, but some newer laws (e.g. DSA) build in additional, sector‑specific rules to close gaps—especially regarding minors, ad transparency, and platform responsibilities.
3. Other jurisdictions: key features that differ (e.g., California CPRA, Brazil LGPD, India etc.)
Here are key features from some non‑EU laws, how they differ from GDPR/DSA, especially relevant to ad targeting, profiling, sensitive information, enforcement, etc.
California Privacy Rights Act (CPRA)
- Sensitive Personal Information (SPI): CPRA adds a category of “sensitive personal information” which includes data like geolocation, racial or ethnic origin, religious beliefs, sexual orientation, biometrics, health, finances, etc. Under CPRA, consumers have the right to limit the use and disclosure of SPI. (TrustArc)
- Opt‑out regime: Unlike GDPR’s consent model (opt‑in for many data uses), CPRA relies heavily on opt‑out rights. For example, consumers can opt out of “sale or sharing” of personal information, and opt out of use of SPI for purposes beyond performing core services. (Orrick)
- Link requirements: Businesses must provide conspicuous links on their website homepages, such as “Do Not Sell or Share My Personal Information” and “Limit the Use of My Sensitive Personal Information.” These allow users to exercise opt‑out rights. (TrustArc)
- Protections for minors: Under CPRA, selling or sharing the personal information of consumers under 16 requires affirmative (opt‑in) authorization: consumers aged 13 to 15 may opt in themselves, while children under 13 need a parent’s opt‑in. (Orrick)
- Enforcement & penalties: Establishes the California Privacy Protection Agency (CPPA) with rulemaking and enforcement authority. There are fines for violations, increased oversight, and requirements for audits and risk assessments for high‑risk processing.
- Limitation on use of sensitive information: Even if collected, usage/disclosure of SPI must be “necessary to perform the services or provide goods reasonably expected.” This curtails broad ad targeting using sensitive data. (TrustArc)
Brazil: LGPD (Lei Geral de Proteção de Dados)
- Scope / Territorial reach: The LGPD applies to any individual or entity processing personal data in Brazil, regardless of location, if data comes from individuals in Brazil. (Mondaq)
- Legal bases: Similar to GDPR, the LGPD lists multiple bases for lawful data processing, including consent, compliance with a legal obligation, and legitimate interest. It also recognizes bases GDPR lacks, such as credit protection and certain research purposes. (GDPR Local)
- DPO requirement and accountability: Organizations are expected to appoint a Data Protection Officer; have transparency; demonstrate accountability; maintain records of processing; respond to data subject rights. (GDPR Local)
- Enforcement and sanctions: The supervisory authority (ANPD) has power to issue fines of up to 2% of the company’s revenue in Brazil, up to a cap of 50 million reais per infraction. Also powers to issue warnings, block data flows, suspend processing, ban activity, etc. (Mondaq)
- Differences:
- LGPD lacks some of GDPR’s more detailed thresholds; sometimes requirements are less prescriptive (e.g. DPO requirement broadly defined). (Morrison Foerster)
- Breach notification: LGPD requires controllers to inform authorities and individuals within a “reasonable time” but does not specify a fixed deadline like GDPR’s 72‑hour rule. (Morrison Foerster)
- Private right of action is less clearly defined compared to GDPR’s rights; individuals may have recourse via complaints or existing consumer litigation mechanisms. (Compliance Hub Wiki)
- Ad targeting & profiling implications: Because the LGPD imposes lawful‑basis, consent, and transparency requirements, it shapes how profiling and behavioral advertising must be handled, and sensitive personal data must be treated with particular care. Although the LGPD contains no DSA‑style platform regulation, using personal or special‑category data for targeting must still meet GDPR‑like constraints.
India: Digital Personal Data Protection (DPDP) Act, 2023
- Scope: The DPDP Act applies to personal data in digital form (“digital personal data”). It covers processing by data fiduciaries when the data concerns a data principal in India, or when services are offered to such persons. (Wikipedia)
- Consent & lawfulness: Processing must be anchored on consent or other “legitimate uses” spelled out in the law. Consent must be free, specific, informed, unambiguous, and revocable. (Wikipedia)
- Rights of data principals: Similar to other regimes: access, correction, erasure, withdrawal of consent, etc. For minors, there are specific protections. (blog.finology.in)
- Regulator: A Data Protection Board is to enforce compliance (though there’s some concern about regulatory independence). (King Stubb & Kasiva)
- Cross‑border transfers: Permitted by default, but the government may restrict transfers to specific countries by notification. (blog.finology.in)
- Differences / Limitations relative to GDPR:
- The DPDP Act excludes certain GDPR rights, such as the right to restrict processing, data portability (in some versions), etc. (King Stubb & Kasiva)
- The DPDP gives broad exemptions to government in matters of sovereignty, public order, law enforcement. (King Stubb & Kasiva)
- The Act is less prescriptive about platform obligations such as advertising transparency and recommender‑system controls, which are elaborated in the EU’s DSA.
Comparisons & Implications
The table below compares key features of these regimes as they relate to ad targeting:

| Feature | GDPR / EU (incl. DSA) | California CPRA | LGPD (Brazil) | India (DPDP) |
|---|---|---|---|---|
| Consent vs. opt‑out | Strong emphasis on opt‑in consent (for profiling, sensitive data), plus legitimate interest in some cases | Mostly opt‑out for sales/sharing; limits on sensitive info; opt‑in for minors | Consent and other legal bases, closest to GDPR | Consent and limited “legitimate uses”; some GDPR rights missing |
| Sensitive personal data | Highly restricted; explicit consent and extra safeguards for special categories | Defined SPI; right to limit use/disclosure | Similar restrictions; special categories protected; transparency required | Less detailed on special categories, but protections for minors |
| Minors / children’s protection | DSA bans profiling‑based targeted ads to minors; GDPR adds special rules, age verification, privacy by default | Parental or child opt‑in required to sell/share minors’ data below age thresholds | Protections for minors, but fewer DSA‑like platform obligations | Specific minor protections, including parental consent |
| Transparency & user control | High under GDPR + DSA: ad repositories, targeting criteria, opt‑outs, recommender‑system choices | Right to opt out; “Do Not Sell/Share” links; limits on sensitive info | Rights to know, access, correct, and delete; duty to inform | Rights to access, correction, erasure; less detail on ad‑targeting transparency |
| Enforcement & penalties | High fines (up to 4% of global turnover under GDPR, 6% under DSA); DPA oversight; cross‑border cooperation | CPPA fines; state enforcement; private actions in some cases | ANPD; lower fine caps but increasing activity; multiple sanction types | Data Protection Board penalties; uncertainty about enforcement capacity |
Impact on Global Ad Targeting Practices
1. Changes in Data Collection Methods (cookies, trackers, consent)
Regulatory changes (GDPR, DSA, CCPA/CPRA, LGPD, etc.) plus platform‑level changes (e.g. browser policies, mobile OS changes, cookie deprecation) have driven major shifts in how data is collected for ad targeting. Key changes include:
Cookie & Tracker Use
- Third‑party cookies have been heavily restricted. Browsers like Safari and Firefox have long restricted or blocked them, and Google’s long‑running (though repeatedly delayed and scaled back) plan to deprecate third‑party cookies in Chrome has accelerated the industry’s transition away from them.
- Cookie syncing — where different advertising/tracking domains synchronize identifiers via cookies so they can share tracking info — has been reduced. Studies suggest GDPR led to a roughly 40% reduction in the number of such connections among third parties.
- Even when third‑party cookies are present, their use is more tightly controlled. Sites often require explicit user consent via cookie banners or consent‑management platforms (CMPs) before setting cookies used for tracking, targeting, or profiling.
Consent Requirements
- Consent must be informed, explicit, freely given, revocable, and granular (users often must choose between different classes of cookies or tracking). This imposes technical, UX, and legal demands: CMP integrations, layered disclosures, and toggles for categories (ads, personalization, analytics) are now common.
- Consent fatigue, user distrust, and users simply rejecting non‑essential cookies have lowered acceptance rates. As a result, less user data is available for tracking and targeting.
Alternative / Privacy‑preserving Tracking Approaches
- Increased reliance on first‑party data (data collected directly by the site or platform the user interacts with) instead of third‑party trackers. First‑party interactions are more likely to meet consent obligations, less likely to be blocked by browsers, and less likely to draw regulatory scrutiny.
- Contextual advertising: targeting based on the content of the page rather than user behavior across multiple sites. This has become more popular as a compliant alternative where user tracking is limited (see the sketch after this list).
- Use of privacy‑enhancing technologies (PETs), such as cohort‑based approaches, on‑device processing, and differential privacy. These techniques aim to allow some degree of targeting and measurement while preserving anonymity or reducing identifiability. While regulatory frameworks are still catching up, some companies are already experimenting with them.
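To illustrate the contextual‑advertising alternative described in this list, here is a minimal TypeScript sketch that scores ads against page content instead of a user profile; the data and scoring rule are deliberately simplistic and hypothetical.

```ts
interface Ad { id: string; keywords: string[] }

// Count how many of the ad's keywords appear in the page's own text.
// No cookies, identifiers, or cross-site history are involved.
function scoreAd(ad: Ad, pageText: string): number {
  const text = pageText.toLowerCase();
  return ad.keywords.filter((kw) => text.includes(kw.toLowerCase())).length;
}

function pickContextualAd(ads: Ad[], pageText: string): Ad | undefined {
  return [...ads].sort((a, b) => scoreAd(b, pageText) - scoreAd(a, pageText))[0];
}

const ads: Ad[] = [
  { id: "hiking-boots", keywords: ["trail", "hiking", "outdoor"] },
  { id: "office-chair", keywords: ["desk", "ergonomic", "office"] },
];

// An article about mountain trails gets the hiking ad -- no tracking needed.
pickContextualAd(ads, "Ten scenic mountain trail hikes for the weekend");
```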
UX & Consent Management Tools
- Sites have invested in better consent banners, CMPs, privacy dashboards, and more transparent notices about what data is collected, how it is used, and with whom it is shared. This has changed design and user flows, and sometimes created legal exposure (some banners and consent flows have been judged non‑compliant).
- More explicit “decline” options and better tools to manage preferences and withdraw consent mean that tracking and targeting after a user declines must be minimal or none.
Overall, these changes reduce the availability of detailed cross‑site, cross‑platform behavioral data that advertisers used to rely on heavily. The result is a shift in both what data is collected and how acceptable methods of collection are structured.
2. Effects on Ad Personalization and Targeting Precision
With limitations in data collection, regulators’ restrictions, and changing user consent behavior, ad targeting and personalization face several impacts:
- Reduced granularity in targeting: Because many users do not consent to tracking, some demographic or behavioral signals become unavailable. Advertisers often cannot micro‑target (e.g. by highly specific interests inferred from browsing history) with the same confidence, so targeting becomes coarser, relying on broader segments rather than fine‑grained behavior or lookalike modeling.
- Lower performance metrics: Click‑through rates (CTRs), conversion rates, and attribution accuracy tend to drop when personalized targeting is restricted. Empirical studies suggest measurable losses; one study found a roughly 5.7% drop in revenue per click at a large publisher (with stronger effects in sectors like travel and financial services) after GDPR consent limits took effect.
- Shift toward alternative signals: Advertisers increasingly use first‑party data, signals from logged‑in user behavior, offline data, email lists, and CRM data, while contextual signals (page content, location, metadata) become relatively more important. These tend to produce less precise targeting, since they lack detailed behavioral profiles, but in many cases offer safer, legally compliant alternatives.
- More modeling and probabilistic approaches: Since much direct tracking is curtailed, advertisers and platforms turn to conversion modeling, aggregated or anonymized measures, and probabilistic matching. These are less exact but can approximate outcomes; Google’s “Consent Mode” is one attempt to reconcile consent restrictions with modeled measurement of user behavior and ad performance.
- Risk of over‑reliance on walled gardens: Large platforms with extensive first‑party data (e.g. Google, Meta, Amazon) become even more dominant, since they can deliver personalized ads internally without depending on third‑party data. Advertisers may shift budget toward platforms that can still offer targeting despite regulatory constraints, concentrating ad spend (a dynamic explored in the next subsection).
3. Impacts on Ad Network Business Models and Intermediaries
Ad networks, ad tech intermediaries (data brokers, DSPs, SSPs, etc.), and publishers are all affected by the shifting environment. Key impacts include:
- Reduced access to third‑party data: Many ad networks and DSPs/SSPs that relied on aggregating third‑party tracking data find their feed of signals diminished. Less data means less ability to segment, retarget, and match audiences across sites, undermining parts of the programmatic advertising model.
- Changes in the value chain and margin pressure: As accuracy drops, advertisers may value precision less and shift to simpler targeting or alternative channels. This can reduce what they are willing to pay, squeezing the margins of intermediaries whose business models depend on granularity. Ad prices may fall for impressions where precise targeting is not possible, and publishers may see lower yields on inventory without detailed targeting.
- Cost of compliance: Intermediaries must invest in consent management, data protection, audits, and verification that their data providers operate lawfully. This adds operational overhead and technological complexity. Some of the cost may be passed on, but often not fully, and smaller players are disproportionately affected.
- Shifts in alliances and business strategies: Ad networks may partner more closely with first‑party data holders (publishers, platforms) or provide tools for advertisers to collect zero‑ and first‑party data. Some ad tech firms are moving toward privacy‑by‑design, building PETs, and offering cohort‑based, aggregated‑audience, or contextual targeting products. The push to reduce dependence on third‑party trackers is reshaping which products are viable.
- Rise of alternative revenue models: For publishers especially, if yields from precise targeting drop significantly, alternative monetization (subscriptions, paywalls, contextual ads, memberships, sponsored content) becomes more attractive. Some publishers trim the parts of their tech stack or the intermediaries that impose compliance risk or cost, or build more direct advertiser relationships.
- Innovation pressure: Because precision and targeting are challenged, there is strong incentive to innovate: new measurement models, better consent flows, privacy‑preserving measurement, clean rooms, on‑device processing, and more. Companies that can offer compliant tools that retain sufficient value may gain a competitive advantage.
4. Cross‑Border Data Transfers and International Compliance Burdens
Regulations affect how data moves across borders, and create burdens for companies operating in many jurisdictions.
- Legal requirements for international data transfers: GDPR (and similar laws such as Brazil's LGPD and China's PIPL) restricts transfers of personal data outside the jurisdiction unless the destination country has "adequacy" status or appropriate safeguards are in place (standard contractual clauses, binding corporate rules, etc.). Ensuring compliant transfers requires legal, contractual, and often technical work.
- Fragmentation of requirements: Different privacy laws define key concepts differently (e.g., what counts as sensitive data, what constitutes a lawful basis, what consent means). A company operating in many markets must either design systems and processes that satisfy the "tightest," most demanding regulation or maintain country-specific variants (see the sketch after this list). Either way, compliance cost, legal risk, and operational complexity increase.
- Consent localization and language/UI differences: For cross-border or multi-language platforms, consent banners, privacy notices, and data processing agreements often must be translated and localized, and must reflect local legal obligations (e.g., different definitions of minors, different age thresholds, local sensitive categories).
- Data residency / localization pressures: Some jurisdictions require data about their citizens to be stored locally or subjected to local audits. For example, parts of China's PIPL impose tight controls on cross-border transfers, possible government oversight, and in some cases in-region storage.
- Regulatory oversight risk: Companies can face enforcement in multiple jurisdictions at once. Noncompliance in one place (e.g., failing to secure proper consent) can open liability under several laws, increasing both legal risk and the need for coordinated governance.
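As a concrete illustration of the "design to the tightest rule" approach mentioned above, the sketch below folds several applicable regimes into a single most-restrictive policy. The jurisdictions, thresholds, and field names are illustrative assumptions, not a real rule set and not legal advice:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PrivacyPolicy:
    consent_mode: str       # "opt_in" is stricter than "opt_out"
    min_profiling_age: int  # below this age, no profiling at all
    allows_sensitive: bool  # may special categories ever be used?

# Simplified, hypothetical rule set; real thresholds vary by law.
POLICIES = {
    "EU":    PrivacyPolicy("opt_in", 18, False),   # GDPR/DSA-style
    "BR":    PrivacyPolicy("opt_in", 18, False),   # LGPD-style
    "US-CA": PrivacyPolicy("opt_out", 16, False),  # CCPA/CPRA-style
}

def strictest(policies: list[PrivacyPolicy]) -> PrivacyPolicy:
    """Fold multiple applicable policies into the most restrictive one."""
    return PrivacyPolicy(
        consent_mode=("opt_in"
                      if any(p.consent_mode == "opt_in" for p in policies)
                      else "opt_out"),
        min_profiling_age=max(p.min_profiling_age for p in policies),
        allows_sensitive=all(p.allows_sensitive for p in policies),
    )

# A campaign reaching both EU and Californian users gets the tightest mix.
print(strictest([POLICIES["EU"], POLICIES["US-CA"]]))
```

The alternative, country-specific variants, replaces `strictest` with a per-jurisdiction lookup; the trade-off is more permissible reach per market against more code paths to audit.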
5. Implications for Publishers, Advertisers, Platforms
Finally, how these changes affect the main actors in the ad ecosystem:
- Publishers: Ad revenue falls when precision targeting is diminished, forcing greater reliance on non-tracking ad products (contextual ads, less granular targeting), which often command lower CPMs. Publishers may need to redesign their ad inventory, rethink ad layouts, and invest in consent management, data collection pipelines, and direct sales rather than programmatic. Some publishers benefit if they are seen as privacy-friendly, gaining user trust.
- Advertisers / brands: Must adjust expectations: lower precision, potentially higher cost per conversion, and more waste where targeting is less precise. Budgets may shift toward platforms that can deliver good performance under privacy constraints. Brands need to invest more in first-party relationships (e.g., loyalty programs, direct user registration), creative contextual campaigns, and measurement methods that work with less data and more modeling (a minimal example follows this list).
- Platforms (especially large ones / walled gardens): These are relatively advantaged as privacy rules tighten, because they hold large troves of first-party data, control their own environments, and can build compliant targeting and measurement systems. Since advertising still drives much of their revenue, platforms such as Meta and Google are highly motivated to lead on compliant options. They also face strong pressure to make ad targeting transparent, give users control, publish ad transparency reports, and maintain ad repositories, and they carry the heaviest regulatory obligations and scrutiny (DSA, DMA, etc.), so they must invest in legal, compliance, and engineering.
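To illustrate "less data and more modeling": the sketch below estimates ad effectiveness from weekly aggregates alone, with no user-level tracking at all. The figures are invented, and real media-mix models add seasonality, adstock, and many more controls:

```python
import numpy as np

# Weekly aggregates only; no individual user is tracked.
spend = np.array([10.0, 12.0, 9.0, 15.0, 14.0, 11.0])           # ad spend, k EUR
conversions = np.array([150.0, 165.0, 140.0, 190.0, 185.0, 160.0])

# Fit conversions ~ intercept + beta * spend by ordinary least squares.
X = np.column_stack([np.ones_like(spend), spend])
(intercept, beta), *_ = np.linalg.lstsq(X, conversions, rcond=None)

print(f"Baseline conversions per week: {intercept:.0f}")
print(f"Incremental conversions per extra 1k EUR of spend: {beta:.1f}")
```

The point is architectural rather than statistical: measurement like this never needs a user identifier, so it survives consent churn and cookie loss unchanged.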
Case Studies / Empirical Examples
Example 1: EU Companies Complying with DSA and How Their Targeting Changed
The introduction of the European Union’s Digital Services Act (DSA), alongside the General Data Protection Regulation (GDPR), has led to significant transformations in how EU-based companies conduct ad targeting. The DSA, designed to complement GDPR, places new transparency and accountability obligations on very large online platforms and intermediaries. Companies operating in the EU have had to revise their data collection, user consent, and targeting mechanisms to remain compliant.
Background
The DSA targets harmful or illegal content, but its broad scope includes obligations around transparency of advertising, including targeted ads. Platforms must disclose why a user is targeted, the categories used, the data sources, and provide tools for users to opt out of targeting. This adds a layer of complexity beyond GDPR’s consent and data processing rules.
Case Study: A Major EU Online Marketplace
A prominent European online marketplace that relied heavily on programmatic advertising targeting across the EU markets offers an illustrative example. Before DSA enforcement, the platform used extensive third-party data partners to build granular user profiles and deploy hyper-targeted ads, utilizing both behavioral and interest-based targeting on its site and through ad exchanges.
Changes in Targeting Approach
- Consent-Driven Data Collection: To comply with GDPR and meet the DSA's transparency requirements, the company implemented an advanced Consent Management Platform (CMP) that segmented consent by purpose and data category (a sketch of this pattern follows the list). Users could selectively opt out of behavioral targeting while still consenting to essential service cookies. This granular consent reduced the share of users accepting ad-targeting cookies from about 80% pre-DSA to roughly 50% post-DSA.
- Reduction in Third-Party Data Use: Due to tightened obligations and user pushback, the company significantly reduced its reliance on third-party tracking cookies, which diminished data sharing with external ad tech vendors. Instead, it expanded the collection and use of first-party data, including logged-in user behavior on its own platform and data from direct transactions.
- Shift to Contextual and Cohort-Based Targeting: With third-party data curtailed, the company developed more sophisticated contextual ad capabilities. For example, ads were matched to the category of products a user was browsing rather than to their historical behavior on unrelated sites. It also piloted privacy-compliant cohort-based targeting, grouping users by aggregated interests rather than individually identifiable profiles.
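The following is a minimal, hypothetical sketch of the purpose-scoped consent gating a CMP like the one described above might drive; the purpose names and fallback order are assumptions for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Consent stored per purpose, as a granular CMP would record it."""
    purposes: dict[str, bool] = field(default_factory=dict)

    def allows(self, purpose: str) -> bool:
        # Anything not explicitly granted defaults to "no consent".
        return self.purposes.get(purpose, False)

def select_targeting(consent: ConsentRecord) -> str:
    """Degrade gracefully to less invasive targeting as consent narrows."""
    if consent.allows("behavioral_ads"):
        return "behavioral"   # individual profile-based targeting
    if consent.allows("cohort_ads"):
        return "cohort"       # aggregated, non-individual targeting
    return "contextual"       # driven by page content only

user = ConsentRecord(purposes={"essential": True, "cohort_ads": True})
assert select_targeting(user) == "cohort"
```

The design choice worth noting is the default: an unknown or unset purpose is treated as refusal, which is what "freely given" consent effectively requires.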
Impact on Targeting Precision and Business Outcomes
- Targeting Precision: The shift to contextual and cohort-based targeting led to a measurable decline in precision. Internal studies showed that conversion rates for personalized ads dropped approximately 15–20% compared with the pre-DSA period. The company acknowledged that while cohort targeting recovered some precision, it was not yet as effective as individual behavioral targeting.
- Revenue Impact: The company reported a short-term decrease in ad revenue of approximately 10%, mainly due to fewer users consenting to behavioral targeting and a reduced ability to micro-segment audiences. Over the medium term, revenue stabilized as advertisers adapted campaigns to contextual and first-party-data-driven methods.
- User Trust and Compliance Benefits: A positive side effect was an increase in user trust and satisfaction. Surveys indicated that users appreciated the clearer transparency and control mechanisms, and the company avoided costly regulatory fines and reputational damage by proactively embracing the DSA rules.
Technology & Process Adaptations
- The company invested heavily in machine learning models that predict relevant ad placements without needing individual user profiles (a toy version is sketched after this list).
- Enhanced data governance frameworks and regular audits ensured compliance with both GDPR and the DSA's stringent requirements on targeted-advertising transparency.
- Internal teams worked closely with legal counsel and regulators to ensure new consent flows met evolving interpretations of "freely given" and "informed" consent.
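As a toy illustration of placement prediction without user profiles, the matcher below scores ads purely against page content. Real systems use trained classifiers or embeddings; the ad names and keyword lists here are invented:

```python
# Score candidate ads against the words on the page; no user data involved.
AD_KEYWORDS = {
    "running_shoes": {"marathon", "running", "trainers", "5k"},
    "coffee_maker": {"espresso", "brew", "coffee", "barista"},
}

def best_ad(page_text: str) -> str:
    tokens = set(page_text.lower().split())
    scores = {ad: len(tokens & keywords) for ad, keywords in AD_KEYWORDS.items()}
    return max(scores, key=scores.get)

print(best_ad("training plan for your first marathon running tips and trainers"))
# -> "running_shoes"
```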
Broader EU Context
Several other EU companies in e-commerce, media, and publishing report similar patterns: sharp declines in third-party cookie consent, rapid pivot to contextual and first-party targeting, and efforts to increase transparency through enhanced user interfaces. The DSA’s emphasis on ad transparency combined with GDPR’s data protection framework is forcing a fundamental reconfiguration of the European digital advertising ecosystem.
Example 2: A US Tech-Company’s Strategy to Adapt Globally
A leading US-based technology company, operating a major global advertising platform, offers a compelling example of how multinational corporations adapt their ad targeting strategies to comply with increasingly diverse and stringent data privacy laws across jurisdictions.
Background
The company’s ad platform traditionally relied heavily on third-party cookies, cross-site tracking, and extensive profiling to deliver targeted advertising. However, the introduction of GDPR, CCPA, and other global privacy laws — plus platform-level changes like Google Chrome’s planned third-party cookie deprecation — forced the company to develop a global, privacy-centric strategy.
Strategy Components
1. Global Compliance Framework
Recognizing the complexity of cross-border data flows and differing legal definitions, the company created a centralized privacy compliance office responsible for aligning data practices globally. The approach included:
- Implementing a modular consent management system adaptable to local requirements (GDPR's granular consent, CCPA's opt-out, Brazil's LGPD nuances).
- Developing localized privacy notices and consent forms in multiple languages, ensuring compliance with regional laws and cultural expectations.
2. Phasing Out Third-Party Cookies
- The company announced a phased sunset of third-party cookies in its own browser, aligning with the published industry timeline while anticipating regulatory scrutiny.
- It invested in privacy sandbox technologies, including cohort-based advertising, on-device processing, and differential privacy techniques, to allow some level of ad targeting without exposing individual data (a cohort sketch follows this list).
- Collaboration with regulators and privacy advocacy groups helped shape the design of these technologies so they met evolving regulatory expectations.
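The sketch below illustrates the on-device cohort idea behind such proposals: browsing history never leaves the device, and only a coarse, frequently observed topic is exposed to ad tech. The taxonomy, threshold, and function are illustrative assumptions, not the actual Privacy Sandbox APIs:

```python
from collections import Counter

# Coarse taxonomy: far less revealing than raw browsing history.
SITE_TO_TOPIC = {
    "runnersworld.example": "fitness",
    "marathon-training.example": "fitness",
    "espresso-geeks.example": "food_and_drink",
}

def on_device_topic(history: list[str], min_visits: int = 2) -> str | None:
    """Runs on the user's device; only the winning coarse topic leaves it."""
    counts = Counter(SITE_TO_TOPIC.get(site) for site in history)
    counts.pop(None, None)  # unknown sites contribute nothing
    if not counts:
        return None
    topic, n = counts.most_common(1)[0]
    return topic if n >= min_visits else None

history = ["runnersworld.example", "marathon-training.example", "news.example"]
print(on_device_topic(history))  # -> "fitness"
```

The `min_visits` floor is the privacy lever: a topic observed only once says more about one page view than about the person, so it is withheld.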
3. Emphasis on First-Party and Zero-Party Data
- The company expanded tools for advertisers and publishers to collect and use first-party data gathered directly from consumers, such as email lists, loyalty programs, and logged-in behavior.
- It introduced features enabling zero-party data collection, in which users voluntarily, and with explicit consent, share preferences or interests in exchange for personalized experiences (a minimal data model is sketched below).
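To make the zero-party idea concrete, here is a hypothetical sketch of a preference-center record: interests are only ever declared by the user, and the consent event is captured alongside them. All field names are assumptions:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ZeroPartyProfile:
    """Preferences the user volunteered, with the consent event recorded."""
    user_id: str
    declared_interests: list[str] = field(default_factory=list)
    consented: bool = False
    consent_timestamp: datetime | None = None

    def declare(self, interests: list[str]) -> None:
        """Invoked only from an explicit preference-center action;
        nothing here is inferred from behavior."""
        self.declared_interests = list(interests)
        self.consented = True
        self.consent_timestamp = datetime.now(timezone.utc)

profile = ZeroPartyProfile(user_id="u123")
profile.declare(["trail running", "espresso gear"])
```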
4. Investment in Contextual and Predictive Modeling
- Given reduced availability of behavioral data, the company enhanced its contextual advertising algorithms, improving its ability to target ads based on page content, geography, time of day, device type, and other non-personal signals.
- It developed advanced machine learning models to infer potential audience interests in aggregate without compromising individual privacy; the sketch below shows the aggregate-only idea in miniature.
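A minimal sketch of aggregate-only interest reporting: segments are counted over groups of users, and any segment too small to hide an individual is suppressed. The threshold is an illustrative, k-anonymity-style assumption:

```python
from collections import Counter

MIN_SEGMENT_SIZE = 50  # illustrative suppression threshold

def aggregate_interests(events: list[tuple[str, str]]) -> dict[str, int]:
    """events are (user_id, interest) pairs; returns interest -> distinct
    users, dropping segments too small to report safely."""
    distinct = {(user, interest) for user, interest in events}
    counts = Counter(interest for _, interest in distinct)
    return {i: n for i, n in counts.items() if n >= MIN_SEGMENT_SIZE}
```

Only segment sizes, never user identifiers, leave the function, and rare (hence potentially identifying) segments are dropped entirely.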
5. Transparency and User Control Enhancements
- The platform increased the granularity and visibility of the ad-targeting information available to users.
- Users received clear explanations of why they were shown a particular ad, along with simple options to opt out of behavioral targeting.
Outcomes and Challenges
- Advertiser Adaptation: Advertisers initially struggled to adapt campaigns to the new targeting methods, requiring education and tooling. The company supported them with training and with analytics demonstrating the effectiveness of contextual and first-party-data approaches.
- Performance Impact: While there was an initial dip in targeting precision, predictive modeling and cohort-based techniques recovered some performance. Overall revenue remained robust thanks to the company's diversified revenue streams and large user base.
- Regulatory Relations: By proactively engaging with regulators and adopting best practices, the company avoided major fines and positioned itself as a leader in privacy-centric advertising.
This case exemplifies how a global tech company must balance innovation, regulatory compliance, and user trust across multiple jurisdictions with diverse privacy frameworks.
Example 3: Emerging Markets (India, Brazil) and Ad Targeting under New Law Regimes
Emerging markets such as India and Brazil have recently introduced or are advancing new data privacy laws (India’s Digital Personal Data Protection Act, Brazil’s Lei Geral de Proteção de Dados – LGPD) that reshape digital advertising landscapes in these fast-growing digital economies. Their unique socio-economic, regulatory, and infrastructural contexts lead to distinct challenges and adaptations in ad targeting practices.
Brazil: LGPD in Practice
Brazil’s LGPD, inspired by GDPR, came into force in 2020, applying broadly to personal data processing.
- Impact on Ad Targeting: Brazilian companies initially struggled to adapt legacy systems to LGPD's requirements for explicit user consent, purpose limitation, and data subject rights (e.g., access, correction, deletion). Many had to overhaul data collection practices, implement consent management platforms, and revisit third-party data sharing.
- Advertising Industry Response: Major Brazilian ad tech firms shifted toward greater reliance on first-party data, increased use of contextual advertising, and enhanced transparency. There was a notable surge in partnerships with consent management providers to ensure compliance.
- Empirical Outcomes: Studies indicate a decline in tracking-cookie acceptance rates in Brazil comparable to GDPR's impact in Europe, though slightly lower (roughly 55% consent rates on average). Consequently, targeting precision dropped, with advertisers reporting around 12–15% decreases in click-through rates on behavioral ads.
- Opportunities & Challenges: Brazil's growing digital economy means advertisers and platforms continue investing in privacy-compliant innovation to maintain effectiveness. However, enforcement is still maturing, and companies face uncertainty over evolving interpretations.
India: Emerging Data Privacy Framework
India’s data privacy laws are in development, with the Digital Personal Data Protection Act enacted recently but enforcement still evolving.
- Current State of Targeting: Indian digital advertising is vibrant and growing rapidly, driven by a large, young, mobile-first population. Enforcement of the new law is still limited, and many platforms are voluntarily adopting GDPR-style consent management to prepare for stricter regulation.
- Unique Challenges:
  - High smartphone penetration combined with wide variation in literacy and digital awareness affects user understanding and consent rates.
  - Limited digital infrastructure in rural areas makes reliance on third-party cookies less consistent.
  - A cultural preference for localized-language content necessitates regionally tailored ad targeting, complicating uniform privacy compliance.
- Industry Adaptations: Major Indian ad platforms and publishers have begun piloting consent frameworks and investing in first-party data collection strategies (e.g., via app logins and messaging platforms). Contextual targeting is growing as a safer alternative.
- Prospective Impact: Once the regulations become fully enforceable, Indian advertisers and platforms may face shifts similar to those seen in Europe and Brazil, including reduced third-party cookie use and moves toward privacy-preserving targeting methods.
Common Themes Across Emerging Markets
- Balancing Growth and Privacy: Rapid digital adoption encourages innovation and investment, but privacy laws introduce new operational and technological costs.
- Need for Education and Infrastructure: Both users and companies need education on privacy rights and compliant data handling, plus tools that work despite variable connectivity and digital literacy.
- Global vs Local Compliance: Companies operating in these markets must reconcile global standards (e.g., GDPR, CCPA) with local laws and practices, complicating their targeting strategies.
- Innovation Opportunity: Emerging markets provide fertile ground for privacy-preserving ad tech innovations that could leapfrog older models, integrating mobile-first, AI-driven contextual, and cohort-based targeting.
Summary
These case studies highlight the complex, multifaceted impacts of data privacy laws on ad targeting across geographies. EU companies adjusting to the DSA show the challenges of balancing compliance with targeting effectiveness, US tech giants illustrate global adaptation strategies blending technology and legal compliance, and emerging markets reveal how nascent regulatory environments interact with unique local conditions.
Together, they underscore that:
- Consent and transparency are now cornerstones of ad targeting.
- The decline of third-party tracking accelerates innovation in first-party data and privacy-preserving technologies.
- The business models of ad networks, platforms, and publishers must evolve.
- Cross-border complexity demands nuanced, locally sensitive approaches.