The CCPA Illusion: California's Privacy Law, the Loopholes, and Why 'Do Not Sell' Is a Trap

By TIAMAT | March 7, 2026


It takes about four seconds. You scroll to the bottom of a major retailer's website, find the link labeled "Your Privacy Choices" or "Do Not Sell My Personal Information," click through two confirmation screens, and feel — there's no other word for it — cleaner. Like you've reclaimed something. Like your data is now yours again.

Here is what actually happened in those four seconds.

The retailer's opt-out mechanism logged your preference in their consent management platform. Going forward, they will no longer receive direct monetary compensation for transferring your data to third-party advertising networks. That specific transaction — data for dollars — is now off the table. The California Consumer Privacy Act says so, and the retailer is technically compliant.

Here is what did not happen: the retailer did not stop sharing your data with its 200-plus advertising partners. It did not stop transmitting your browsing history, purchase patterns, device identifiers, and inferred demographic profile to the ad tech ecosystem that follows you across the internet. It did not stop participating in real-time bidding auctions where your profile is matched to advertisers in 80 milliseconds every time you load a webpage. The data still moves. The targeting still works. The profile is still sold, just now through a different legal container called "sharing for cross-context behavioral advertising," a category that the original CCPA, passed in 2018 and effective January 1, 2020, did not cover under the do-not-sell right at all.

You opted out of a legal category. You did not opt out of surveillance.

This is the CCPA illusion: a law that created new vocabulary for consumer rights without creating the infrastructure to enforce them, that defined a harmful practice narrowly enough for industry lawyers to drive a cargo ship through, and that gave millions of Californians a button to click while the underlying data economy continued largely undisturbed. California's experiment in privacy regulation is important — it was the first comprehensive state privacy law in the United States, and it forced companies to at least think about what they were collecting. But six years into its existence, the CCPA stands as a case study in what happens when you try to regulate an entrenched industry with disclosure requirements instead of collection limits.


1. What CCPA Actually Did (and Didn't Do)

The California Consumer Privacy Act was signed into law in June 2018, the product of a rushed legislative compromise to head off a ballot initiative that privacy advocates considered stronger. It became effective January 1, 2020 — the first comprehensive state privacy statute in American history. For that alone, it deserves recognition. It established a framework that more than a dozen states have since copied, adapted, and built upon. It forced companies that had never publicly acknowledged their data practices to publish privacy policies with actual content.

Its core rights were real: Californians gained the right to know what personal information a business collected about them and how it was used, the right to request deletion of that information, the right to opt out of the "sale" of their data, and the right to not be discriminated against — charged more, denied service — for exercising any of these rights. These were not nothing. Prior to CCPA, American consumers had no statutory right to any of this at the federal level.

But the law's scope excluded most of the companies most aggressively monetizing consumer data. The thresholds were set to cover businesses that either had annual gross revenues over $25 million, bought or sold the personal information of 50,000 or more consumers, households, or devices per year (a figure CPRA later raised to 100,000 consumers or households), or derived 50 percent or more of their annual revenues from selling consumers' personal information. Small EdTech companies harvesting children's learning data — exempt. Mobile apps with hundreds of thousands of users but modest revenue — often exempt. The sprawling ecosystem of data brokers with sub-$25M revenue that structure their transfers to avoid the other two thresholds — outside the law's reach. The companies most willing to cut ethical corners to monetize data were structurally least likely to meet the thresholds triggering compliance.

Then there was the definitional problem. CCPA defined "sale" as the disclosure of personal information to a third party "for monetary or other valuable consideration." Industry lawyers read that phrase, noted the word "monetary," and immediately began restructuring data transfer agreements. If no direct payment changed hands — if the data moved as part of an advertising partnership, a data cooperative, a research consortium, or any arrangement where the consideration was indirect — then arguably there was no "sale." Companies began labeling their data transfers "sharing for business purposes." This was not a subtle legal argument. It was a mass industry rebranding of the same practices CCPA was designed to regulate, and it worked because the statute's drafters had defined the problem too narrowly.

The California Privacy Rights Act — Proposition 24, passed by California voters in November 2020 and effective January 1, 2023 — was a direct acknowledgment that CCPA had failed on this point. CPRA explicitly added "sharing for cross-context behavioral advertising" to the category of data transfers consumers can opt out of, closing the loophole the industry had exploited for three years. CPRA also created a dedicated enforcement agency, the California Privacy Protection Agency, added sensitive data categories requiring opt-in consent rather than opt-out, introduced a data minimization requirement, and extended privacy rights to employees and job applicants — categories CCPA had explicitly exempted.

CPRA was a meaningful improvement. It was also an admission that the original law had been insufficient, and the industries that spent three years inside that insufficiency profited from every day of it.


2. The Enforcement Reality

Creating rights without enforcement is theater. The CCPA's enforcement history illustrates this in dollar amounts.

For the first two and a half years of the law's existence, the California Attorney General was responsible for enforcement. The office produced exactly one major penalty: in August 2022, AG Rob Bonta announced a $1.2 million settlement with Sephora — the cosmetics retailer — making it the first significant CCPA enforcement action in the law's history. Two and a half years. One significant penalty. $1.2 million against a company that generated over $3 billion in annual revenue.

The Sephora case mattered for what it established, not just what it penalized. Investigators found that Sephora had been selling customer data to third-party analytics and advertising companies through its mobile app — transmitting personal information to advertising networks through the app's SDK integrations — without disclosing this as a "sale" in its privacy policy and without providing a functioning opt-out mechanism. Critically, Sephora was also failing to honor the Global Privacy Control signal, a browser-level opt-out mechanism that California regulators had determined constituted a valid do-not-sell request under CCPA. Companies were required to honor it. Sephora did not.

The GPC determination was important because it represented the state's attempt to make the opt-out right functional. Rather than requiring every user to find every website's individual opt-out link, GPC allows browsers and extensions to broadcast a universal opt-out preference automatically. The Sephora enforcement made clear that ignoring GPC was a CCPA violation. The problem is that enforcement of this requirement remains the exception rather than the rule — most businesses continue to disregard GPC signals with minimal consequence, because the CPPA lacks the resources to audit the millions of websites operating under California's jurisdiction.

The enforcement picture since 2023, when CPPA took over from the AG, has improved marginally but remains thin. In February 2024, DoorDash reached a $375,000 settlement with the AG's office — a case that illustrated a different kind of corporate data laundering. DoorDash had shared customer personal information with a food delivery industry trade association. The trade association then shared that data with its member companies, some of which were selling it to third parties. DoorDash's defense was essentially organizational: we shared data with a trade group, not with advertisers directly. The settlement rejected this reasoning, establishing that downstream data sales by recipients still implicate the originating company's CCPA obligations. $375,000 for a company with approximately $10 billion in annual revenue.

The cure period problem defined CCPA enforcement for its first three years. The original statute required the AG to give companies 30 days' notice before imposing penalties, allowing them to fix violations and avoid fines entirely. Businesses treated this as a cost-free compliance mechanism: violate, receive a cure letter, patch the specific identified behavior, repeat the pattern slightly differently elsewhere. CPRA eliminated the cure period for most violations starting in 2023, but the CPPA's enforcement bandwidth remains severely limited. The agency was created by ballot initiative with funding mechanisms that proved insufficient for the regulatory scope it inherited. Draft regulations on cybersecurity audits, mandatory risk assessments, and automated decision-making rules remained in various stages of finalization as of early 2026 — three years after CPRA's effective date. A regulator that cannot finalize its own rules cannot meaningfully enforce them.


3. The Do Not Sell Button: Theater

The "Do Not Sell My Personal Information" link — later updated to "Do Not Sell or Share My Personal Information" after CPRA — was supposed to be one of CCPA's most visible consumer protections. The statute required covered businesses to include it "clearly and conspicuously" on their homepages. In practice, it became an exercise in strategic inconvenience.

The Interactive Advertising Bureau developed a CCPA compliance framework that consumer advocates immediately criticized as designed to frustrate rather than facilitate opt-outs. Under the framework, a user opting out of data sale might face a 12-step process — multiple confirmation screens, purpose-by-purpose toggles, explanations of why certain "business purposes" were exempt from the opt-out right and therefore couldn't be turned off — while opting into data collection on the same site required two clicks. This asymmetry was not accidental. Dark patterns in consent interfaces are a design choice, not a technical limitation.

The button itself migrated. Major companies began labeling their opt-out links "Your Privacy Choices" — technically compliant with the statute's requirement for a clear link but stripped of the direct language that might prompt users to actually click it. Others buried the link in footer menus, displayed it in gray text on gray backgrounds, or designed the opt-out flow to time out and reset users' preferences. Independent research repeatedly demonstrated that most users who intended to opt out did not successfully complete the process due to interface friction.

Even when users successfully opted out, the protection was narrower than it appeared. CCPA and CPRA both preserve businesses' ability to use personal information for "business purposes" — a category that includes fraud prevention, security activities, auditing, legal compliance, and short-term transient use. These exemptions are written broadly enough to permit significant continued processing of data that users believe they've opted out of sharing. A consumer who opts out of sale or sharing can still have their data used to detect fraud — a category that ad tech companies have argued encompasses ad frequency capping, cross-site authentication, and audience verification. The opt-out right has edges, and the edges are where the industry operates.

The GPC signal remains the most technically robust implementation of the CCPA opt-out right. Brave Browser enables it by default. Firefox supports it. Chrome extensions can add it. When honored, GPC automatically transmits opt-out preferences to every website a user visits without requiring them to find and click individual links on hundreds of different sites. The Sephora enforcement established that businesses must honor it. But adoption among major publishers and advertisers remains inconsistent, and the CPPA's enforcement capacity is not sufficient to audit compliance at scale.
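Honoring GPC is mechanically trivial, which makes widespread non-compliance harder to excuse: the signal arrives on every request as a `Sec-GPC: 1` header, and browsers also expose it to scripts as `navigator.globalPrivacyControl`. A minimal server-side sketch, with function and consent-store names that are illustrative rather than taken from any particular framework:

```python
# Minimal sketch of honoring the Global Privacy Control signal server-side.
# The GPC spec sends "Sec-GPC: 1" on requests from opted-out users. Real
# frameworks normalize header case; this sketch assumes it is already done.

def gpc_opt_out_requested(headers: dict) -> bool:
    """Return True if the request carries a valid GPC opt-out signal."""
    # The spec defines only the value "1"; anything else is not a signal.
    return headers.get("Sec-GPC", "").strip() == "1"

def record_consent(headers: dict, user_id: str, consent_db: dict) -> None:
    """Persist the opt-out so downstream ad-tech calls can be suppressed."""
    if gpc_opt_out_requested(headers):
        consent_db[user_id] = {"sale_opt_out": True, "share_opt_out": True}

# Example: a request from a GPC-enabled browser (e.g. Brave by default)
db: dict = {}
record_consent({"Sec-GPC": "1"}, user_id="u123", consent_db=db)
# db["u123"] now marks the user as opted out of both sale and sharing
```

The hard part was never the parsing; it is wiring the recorded preference into every downstream data flow, which is exactly what the Sephora enforcement found missing.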


4. What CCPA Doesn't Cover (and Who Exploited Those Gaps)

Even within its intended scope, CCPA was riddled with exemptions that transformed it from a comprehensive privacy framework into a targeted intervention covering some data, for some people, from some companies.

The employee data exemption was among the most significant. CCPA originally excluded from its protections all personal information collected from employees, job applicants, contractors, and business owners in the context of their employment relationship. The legislative history suggests this was a concession to business lobby groups who argued that HR data processing was categorically different from consumer data collection. CPRA addressed this by extending CCPA rights to employees beginning January 1, 2023 — but the extension came with carve-outs, compliance timelines, and HR tech industry lobbying that significantly narrowed its practical effect.

The business-to-business exemption similarly survived CCPA's initial implementation. Personal information collected in the context of a business-to-business transaction — a company employee's business email address, for example — was excluded from most CCPA requirements. CPRA eliminated most of this exemption but with delayed timelines that allowed B2B data brokers and sales intelligence platforms to continue operating under the old rules for years.

The sensitive data categories introduced by CPRA — government identification numbers, financial account credentials, health and medical information, genetic and biometric data, precise geolocation, racial and ethnic origin, religious beliefs, sexual orientation, citizenship and immigration status, private communications — represent a genuine improvement over CCPA's original framework. Businesses processing these categories are generally required to obtain opt-in consent rather than merely offering an opt-out. But the implementing regulations for these provisions were still being finalized as of early 2026, creating a period of genuine legal ambiguity about what "processing" sensitive data actually requires in specific contexts.

The publicly available information exemption has become a primary mechanism for data broker industry survival under CCPA. The statute exempts from most of its requirements information that is "lawfully made available from federal, state, or local government records." Data brokers have argued — with some legal support — that this exemption extends to any information that was ever publicly accessible, including scraped LinkedIn profiles, property records, court filings, social media posts, and voter registration data. The argument is that if information was publicly posted, it is "lawfully made available" and therefore outside CCPA's protections. California courts and regulators have not definitively resolved the scope of this exemption, and data brokers have operated inside the ambiguity.

The revenue threshold remains perhaps the most structurally damaging gap. In principle, a broker whose entire business is selling personal information should trip the 50-percent-of-revenue threshold regardless of size. In practice, a sub-$25 million broker that recharacterizes its transfers as licensing or "sharing" (the same definitional games described above) can argue it derives no revenue from "selling" at all, and if it also keeps its counted California consumers under the numeric threshold, it falls outside CCPA entirely: it can collect everything, transfer everything, and ignore every deletion request. This was not an oversight in the drafting process. It was a deliberate compromise. And it means that the companies whose entire business model is the aggregation and sale of personal information are frequently the ones best positioned to argue themselves below the regulatory floor that was ostensibly designed to cover them.

The "service provider" loophole deserves particular attention for how it has been weaponized. CCPA distinguishes between "third parties" — entities that receive data and can use it for their own purposes — and "service providers," which receive data and are contractually restricted to processing it only on behalf of the disclosing business. A business sharing data with a service provider is not "selling" it, even if the service provider is a major ad tech company. The contractual restriction is legally required, but enforcement of those restrictions falls on the business, not the CPPA. The CPPA has no visibility into vendor compliance. When DoorDash shared data with the food delivery trade association, it presumably had contractual restrictions in place too — and we know how that ended.


5. CCPA vs. GDPR: The Comparison That Should Embarrass Us

The General Data Protection Regulation became enforceable in the European Union in May 2018 — the same year California passed CCPA. Comparing the two frameworks is instructive precisely because they address the same problem and arrived at radically different solutions.

GDPR begins with a requirement that CCPA never established: lawful basis. Before a company can collect or process personal data about an EU resident, it must identify a legal justification for doing so. The six lawful bases — consent, contract, legal obligation, vital interests, public task, legitimate interests — are not unlimited. Consent must be freely given, specific, informed, and unambiguous. Legitimate interests must be weighed against the individual's privacy rights in a documented assessment. The default position under GDPR is that you cannot process personal data unless you have a reason. The default position under CCPA is that you can collect and process anything, as long as you disclose it and honor opt-out requests.

This foundational difference cascades through every other aspect of the frameworks. GDPR's data minimization principle — Article 5 — requires that personal data be "adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed." You cannot collect a person's precise geolocation for a service that requires only their city. CPRA introduced a data minimization requirement, but it is weaker than GDPR's standard and its enforcement has been limited while implementing regulations remain in draft.

GDPR's right to erasure under Article 17 extends beyond the original data controller. When a company erases data in response to a valid request, it must also notify every third party to whom it disclosed that data that erasure has been requested. CCPA's deletion right requires businesses to notify their service providers. The chain is shorter. The obligation is weaker. The data stays in more places.
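The difference between the two chains can be made concrete. The following hypothetical sketch models the propagation obligation exactly as the paragraph above describes it; the function, the data shapes, and the simplistic regime switch are illustrative, not any real API:

```python
# Illustrative model of deletion-notification chains under GDPR vs. CCPA,
# as characterized in the text above. Not legal advice, not a real API.

def erase_and_notify(user_id: str, store: dict, downstream: dict, regime: str) -> list:
    """Delete the record locally, then return who must be told to erase too."""
    store.pop(user_id, None)  # local deletion happens under both regimes
    if regime == "GDPR":
        # Notify every recipient the data was disclosed to, full chain.
        return downstream["service_providers"] + downstream["third_parties"]
    if regime == "CCPA":
        # The notification duty reaches service providers, not the broader
        # set of third parties the data may have flowed to.
        return downstream["service_providers"]
    raise ValueError(f"unknown regime: {regime}")

downstream = {
    "service_providers": ["analytics-vendor"],
    "third_parties": ["ad-exchange", "data-coop"],
}
print(erase_and_notify("u1", {"u1": "profile"}, downstream, "GDPR"))  # three recipients
print(erase_and_notify("u1", {"u1": "profile"}, downstream, "CCPA"))  # one recipient
```

Same deletion request, two different blast radii: the third parties fall out of the CCPA branch entirely, which is where the data keeps living.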

The enforcement gap is where the comparison becomes genuinely embarrassing. Ireland's Data Protection Commission — a single country's national authority — fined Meta 1.2 billion euros in 2023 for unlawfully transferring EU user data to the United States. The same DPC fined TikTok 345 million euros in 2023 for violations of children's data protection rules. Luxembourg's DPC fined Amazon 746 million euros in 2021. The CPPA's largest enforcement action — Sephora — produced $1.2 million. The GDPR's largest fine is roughly one thousand times larger than CCPA's largest fine, applied against companies operating in a market roughly comparable in size to California's economy.

This is not a difference in the severity of violations. It is a difference in political will and institutional design. GDPR created national supervisory authorities with genuine staffing, subpoena power, and political independence. CCPA created a right of private action only for data breaches — not for the broader privacy violations the law addresses — leaving enforcement almost entirely to a single underfunded agency and an attorney general's office that had other priorities.

The EU-US Data Privacy Framework, signed in July 2023 following the Schrems II decision that invalidated Privacy Shield, allows US companies to receive EU personal data through a certification program administered by the Department of Commerce. Privacy advocates immediately challenged it, and the framework faces ongoing legal scrutiny in European courts. The underlying tension — that US surveillance law is incompatible with GDPR's standards for third-country adequacy — has not been resolved.


6. The State Patchwork

California's experiment inspired imitation. By early 2026, at least seventeen states had enacted comprehensive consumer privacy laws, among them Virginia, Colorado, Connecticut, Texas, Oregon, Montana, Iowa, Indiana, Tennessee, Florida, Delaware, New Hampshire, Nebraska, New Jersey, Kentucky, Maryland, and Rhode Island. Every one of them is different from CCPA. Most of them are different from each other.

The threshold structures vary. Virginia's Consumer Data Protection Act applies to businesses controlling or processing the personal data of 100,000 or more Virginia residents annually, or 25,000 residents if at least 50 percent of gross revenue comes from selling personal data — different numbers, different structure than CCPA. Texas's Data Privacy and Security Act takes yet another approach: it has no numeric consumer threshold at all, applying instead to any business operating in Texas that processes or sells personal data and is not a small business as defined by the Small Business Administration. Some states exempt nonprofits. Some don't. Some exempt financial institutions already covered by GLBA. Some extend privacy rights to employees from day one. Some created cure periods; some didn't; some created them and then eliminated them.
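To see how the same business can sit inside one regime and outside another, here is an illustrative encoding of California's post-CPRA applicability tests alongside Virginia's, as described above. This is a simplification for intuition, not legal advice, and the figures are the headline statutory numbers only:

```python
# Illustrative applicability checks: California (post-CPRA) vs. Virginia (VCDPA).
# Statutes carry many exemptions and definitional subtleties omitted here.

def ccpa_applies(revenue_usd: float, ca_consumers: int, pct_rev_from_selling: float) -> bool:
    """California: any one of three disjunctive thresholds triggers coverage."""
    return (revenue_usd > 25_000_000
            or ca_consumers >= 100_000
            or pct_rev_from_selling >= 0.50)

def vcdpa_applies(va_consumers: int, pct_rev_from_selling: float) -> bool:
    """Virginia: 100k consumers, or 25k consumers plus 50% revenue from sale."""
    return (va_consumers >= 100_000
            or (va_consumers >= 25_000 and pct_rev_from_selling >= 0.50))

# A mid-sized business: $30M revenue, 40,000 consumers' data, 10% from sales.
print(ccpa_applies(30_000_000, 40_000, 0.10))  # True  (revenue test alone)
print(vcdpa_applies(40_000, 0.10))             # False (no revenue test exists)
```

Multiply that divergence across seventeen statutes and the compliance-mapping problem for a national business becomes obvious.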

The enforcement mechanisms are more divergent still. Most state privacy laws provide no private right of action — consumers cannot sue companies directly for violations. Florida is an exception for certain violations. California's CCPA provides a private right of action only for data breaches. This means that the entire enforcement burden falls on state attorneys general and the CPPA, agencies with finite staff and infinite potential violations.

For a company operating nationally, navigating seventeen different frameworks simultaneously is a legitimate compliance burden. But the burden is not distributed equally. Large corporations with legal and compliance teams can build systems to manage jurisdictional variations — they likely already had counsel engaged in CCPA compliance and can extend that infrastructure. Smaller privacy-focused startups, which often compete precisely by handling data more responsibly, frequently lack the legal resources to manage multi-state compliance complexity and may be technically out of compliance in states where they haven't done formal legal review.

The American Data Privacy and Protection Act — the first serious federal comprehensive privacy bill in years — passed out of the House Energy and Commerce Committee in 2022 with bipartisan support, a genuinely rare occurrence in American data politics. It died because California objected to its federal preemption clause, which would have superseded CCPA and CPRA. The state that created the privacy law model resisted federal standardization because it believed its own standards were higher and worth preserving. The result: no federal law, a growing patchwork, and the same data brokers operating freely in the thirty-plus states that still have no comprehensive privacy statute at all.

The EU's structural advantage here is worth stating plainly: one regulation, 450 million people, 27 member states, one framework. American companies building privacy-compliant products face seventeen different legal answers to the same question. Data brokers incorporating in non-privacy states face no legal question at all.


7. What Actually Works

None of this means consumers are powerless or that reform is hopeless. Some interventions have produced genuine effects.

The Global Privacy Control signal is the most technically effective available tool. Brave Browser enables it by default; Firefox supports it through settings; several Chrome extensions implement it. When a website honors GPC — and California businesses are legally required to — it functions as a universal opt-out signal that requires no per-site interaction from the user. The Sephora case established enforcement precedent. If you live in California and care about this, enable GPC in your browser. It won't work everywhere, but it works where it's honored, and each enforcement action the CPPA brings for GPC non-compliance makes it work in more places.

Data broker opt-out services — Privacy Bee, DeleteMe, and several newer entrants — provide manual submission of opt-out requests to the major data broker databases. They're imperfect: opt-outs expire and must be re-submitted, the databases that accept opt-outs aren't comprehensive, and new brokers enter the market continuously. But they represent real reductions in exposure, particularly from people-search sites that aggregate home addresses and phone numbers.

Vermont's data broker registry — the first in the United States — required data brokers operating in the state to register with the Secretary of State's office and accept opt-out requests. The registry created, for the first time, a visible list of which companies are in the business of selling personal data. Several states have followed Vermont's model. It's a foundation for accountability, not a solution, but it's a real foundation.

At the technical level, the most reliable CCPA compliance path for developers building applications is what practitioners call privacy by architecture: don't collect the regulated data in the first place. TIAMAT's Privacy Proxy takes this approach — scrubbing personally identifiable information from data streams before they reach third-party APIs and analytics services. You cannot violate CCPA for data you never collected, never transmitted, and never retained. For developers building CCPA/CPRA-regulated applications in California, PII scrubbing at the architecture layer is both technically elegant and legally straightforward: minimize the data surface, minimize the regulatory exposure.
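As a toy illustration of the pattern (not TIAMAT's actual implementation), a regex-based scrubber that strips common identifiers before an event leaves your boundary might look like the following. Production systems layer many more patterns and typically combine regexes with NER models for names and addresses:

```python
import re

# Toy illustration of privacy-by-architecture: scrub PII from outbound data
# streams so third-party APIs and analytics never receive it. The pattern set
# here is deliberately minimal; real scrubbers cover far more identifiers.

PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\(?\d{3}\)?[-.\s]?\d{3}[-.\s]?\d{4}\b"),
}

def scrub(text: str) -> str:
    """Replace recognized PII with typed placeholders before transmission."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label.upper()}]", text)
    return text

event = "user jane.doe@example.com called from 415-555-0134 about order 8812"
print(scrub(event))
# → "user [EMAIL] called from [PHONE] about order 8812"
```

The scrubbed event retains its analytical value (a support call happened about an order) while the identifiers never cross the boundary — which is the whole point: there is no deletion request to honor for data that was never sent.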

The CPPA's pending automated decision-making regulations represent the next major battleground. California has taken the position that consumers have rights regarding algorithmic decisions that significantly affect them — credit decisions, insurance pricing, employment screening, housing applications. If an algorithm denies your loan application, California intends to give you the right to know why, to contest the decision, and potentially to opt out of automated processing in favor of human review. The regulations have been in draft for over a year. When they finalize, they will be the most consequential CCPA development since CPRA itself.


Closing

California's CCPA was a milestone, not a destination. It proved that a state could assert regulatory authority over tech giants that had operated as if they were ungovernable, that a legislature could define consumer privacy rights and compel corporate disclosure, and that the industry's argument — "any regulation will break the internet" — was false. These were genuine achievements. They mattered.

But the six years since CCPA's implementation have also revealed what the law was and wasn't built to do. It was built to create transparency — to force companies to say, in their privacy policies, what they were collecting and what they were doing with it. It was not built to stop collection. It defined the "sale" of data so narrowly that the industry restructured itself around the definition within months of the law's passage. It created a right to opt out without creating the enforcement infrastructure to make that right meaningful. It gave millions of Californians a button to click and a feeling of control while the underlying machinery of the surveillance economy continued running.

The "Do Not Sell" button was supposed to be power returned to consumers. It became a liability shield for companies — proof of a compliant opt-out mechanism, evidence of good faith, a documented basis for arguing that violations were technical rather than systemic — and a false sense of security for users who believed that clicking it had actually changed something.

The actual lesson from California's experiment is structural: disclosure requirements without collection limits don't work. When the legal obligation is to tell people you're collecting their data rather than to stop collecting it, the industry tells people and collects their data. The next generation of privacy law — at the federal level, if American politics ever permits it — needs to start from a different premise. Not "you must tell people you collected this," but "you cannot collect this." Not "you must offer an opt-out," but "you must obtain consent before collection begins." Not "disclose your purposes," but "justify your purposes to a regulator with the power to say no."

CCPA was California doing the best it could within the political constraints of 2018. It deserves recognition for that. But the surveillance economy that CCPA was designed to constrain is larger, more profitable, and more technically sophisticated than it was six years ago. The milestone was real. The destination is still somewhere ahead.


Sources and methodology: Case facts drawn from California AG press releases, CPPA enforcement announcements, and court filings in the Sephora and DoorDash settlements. GDPR fine amounts from DPC and Luxembourg CNPD official decisions. Legislative history of CCPA and CPRA from California Legislative Information. State privacy law survey based on IAPP comprehensive tracker. All dollar amounts in USD.
