Mozilla called modern cars the worst product category they'd ever reviewed for privacy. Then GM secretly sold 1.5 million drivers' data to insurance companies. It gets worse.
In 2023, Mozilla's "Privacy Not Included" research team published a report that stopped the automotive industry cold. The team — which regularly evaluates consumer products for privacy practices — called cars "the worst category of products we have ever reviewed for privacy."
Worse than smart speakers. Worse than fitness trackers. Worse than period tracking apps. Worse than children's toys.
Every car brand Mozilla reviewed failed their privacy standards. Twenty-five major automakers — including Ford, Toyota, Tesla, Volkswagen, Honda, Chevrolet, BMW, Subaru, and Nissan — received the "Privacy Not Included" warning label.
Of those, 84% share drivers' personal data, 76% say they can sell it, 56% will share it with government or law enforcement upon a simple request, and 92% give drivers little or no control over, or access to, their own data.
Your car, it turns out, is a surveillance device with an engine.
What Your Car Knows About You
Modern connected vehicles collect data across more than a dozen categories:
Location data: GPS tracking of every trip — where you started, where you ended, how long you stayed, how often you return. Your car knows you visit the cancer clinic every Tuesday, the AA meeting every Thursday, and a particular address you'd rather your spouse didn't know about.
Biometric data: Voice commands capture your voice. Driver monitoring cameras observe your face. Eye tracking systems measure where you're looking and for how long. Some systems infer emotional state. Tesla's driver monitoring captures detailed facial geometry.
Behavioral data: Acceleration and braking patterns. Speed relative to limits. How sharply you corner. Whether you tailgate. How long you idle. This "telematics" data was first used to calculate insurance premiums — but it's now flowing to data brokers.
Phone data: When you sync your phone, vehicles often copy your contacts, recent texts, call logs, and app data. That data can persist in the vehicle's system after you sell or return the car.
Audio and video: Many cars have microphones for voice commands. Some have interior cameras. Tesla's cameras run continuously, recording outside and inside the vehicle. The company acknowledged in 2023 that employees had shared "sensitive" recordings internally, including footage of owners in garages and of other vehicles.
Passenger data: Cars collect data not just on drivers but on passengers — their weight (from seat pressure sensors for airbag calibration), their presence and location in the vehicle, and through connected device tracking, potentially their identities.
Nissan's privacy policy, if read in full, includes one of the most remarkable data collection disclosures in consumer technology: the company states it may collect "sexual activity, health diagnosis data, and genetic information." The company declined to explain to Mozilla how it collects this data or why.
The GM Data Scandal: Secret Sales to Insurers
In March 2024, The New York Times broke a story that should have produced congressional hearings.
General Motors had been quietly collecting detailed driving data from approximately 1.5 million drivers through its OnStar connected vehicle service. The data — including trip-by-trip records of speed, hard braking events, rapid acceleration, and late-night driving — was sold to two data brokers: LexisNexis and Verisk.
Those data brokers then sold the information to insurance companies.
Drivers whose data had been sold discovered it when their insurance rates spiked dramatically, in some cases by hundreds of dollars per year, with no explanation from their insurers. One driver profiled in the investigation saw his premium jump 21 percent after his car's data-sharing feature had been enabled, with no disclosure that the feature fed insurance pricing.
GM marketed its "Smart Driver" program as a tool that would help drivers improve their habits. It was actually a data pipeline to insurers.
The Federal Trade Commission and several state attorneys general opened investigations. GM terminated the data-sharing agreements. The damage was already done — the data collected over months or years had already been shared, scored, and embedded in insurance pricing models.
Insurance companies typically do not tell customers what data sources they used to set rates. Drivers who received higher premiums because of GM's data sharing had no way to know, contest, or correct the underlying information.
Tesla: Fleet Learning and the Privacy Paradox
Tesla's approach to connected vehicle data is distinctive — and distinctively concerning.
Every Tesla on the road is, by default, a data collection node in Tesla's global fleet learning system. When a Tesla encounters a situation its Autopilot system finds novel — a construction zone, an unusual merge, a pedestrian in an unexpected position — it can upload video clips and sensor data to Tesla's servers. That data trains the next version of Autopilot.
This fleet learning model is genuinely powerful for autonomous vehicle development. It is also a global surveillance network operated by a single company.
Tesla vehicles have cameras covering:
- The full front view (three cameras)
- Both sides
- The rear
- The interior (driver-facing camera)
The interior camera was officially activated in 2021 for driver monitoring purposes. Tesla's privacy notice states that camera data may be sent to Tesla "to develop and improve" vehicle features.
In 2023, Reuters reported that Tesla employees had been sharing sensitive images and videos captured by customers' car cameras in internal chat systems — including images of customers' garages, their homes' interiors as captured through the camera when parked, and even footage that employees found amusing or titillating. Tesla said it had "fixed" the issue. The underlying data collection infrastructure remained unchanged.
Tesla's Sentry Mode, designed to monitor parked vehicles for security, continuously records the surrounding environment. The result is a distributed surveillance network of Tesla-owned cameras filming public and private spaces in every city where Teslas park.
There is no comprehensive inventory of what footage Sentry Mode captures, stores, or potentially transmits. Tesla's privacy policy provides broad latitude for data use in improving vehicle systems.
The Insurance Telematics Trap
GM's scandal was extraordinary because of its secrecy. But the practice of using driving data for insurance pricing is now mainstream — and AI is dramatically expanding its scope.
Telematics insurance programs — which offer discounts in exchange for monitoring your driving — are now offered by virtually every major U.S. insurer:
- State Farm Drive Safe & Save
- Progressive Snapshot
- Allstate Drivewise
- Nationwide SmartRide
- Liberty Mutual RightTrack
In theory, these programs are voluntary and benefit good drivers. In practice:
Consent is coerced. Some insurers now charge higher premiums to drivers who refuse monitoring, effectively punishing non-enrollment. The FTC has warned that such practices can be unfair or deceptive.
The data flows beyond pricing. Data collected for telematics is often shared with data brokers, used to build driver profiles, and potentially accessible in litigation. A driver's telematics record could appear in a personal injury lawsuit without the driver knowing.
AI scoring is opaque. The translation of raw telematics data to a "driving score" is done by proprietary AI algorithms. Drivers cannot see the algorithm, understand why they scored poorly, or challenge specific data points.
The hard stop problem. Telematics systems typically penalize "hard braking events" — defined as decelerations over a threshold like 7 mph/second. But hard braking is sometimes exactly the right response. A driver who brakes hard to avoid a child running into the street is penalized for a correct safety decision.
Night driving penalties. Many telematics systems score late-night driving as higher risk, cutting into discounts. This disproportionately affects shift workers, nurses, and others who work non-standard hours — a discriminatory outcome embedded in algorithmic design.
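The hard-stop problem is easy to see in miniature. Below is a minimal Python sketch of a threshold-based event detector; the 7 mph/second threshold comes from the text, while the sampling interval and the trip data are illustrative assumptions.

```python
# Illustrative sketch of a telematics "hard braking" detector.
# The 7 mph/s threshold is from the text; the sampling interval
# and the speed samples below are hypothetical.

HARD_BRAKE_THRESHOLD = 7.0  # mph of speed lost per second

def hard_brake_events(speeds_mph, interval_s=1.0):
    """Count decelerations exceeding the threshold between consecutive samples."""
    events = 0
    for prev, curr in zip(speeds_mph, speeds_mph[1:]):
        decel = (prev - curr) / interval_s
        if decel > HARD_BRAKE_THRESHOLD:
            events += 1
    return events

# Braking from 35 to 20 mph in one second to avoid a child is a
# 15 mph/s deceleration: one penalized "event" for a correct decision.
print(hard_brake_events([30, 32, 35, 20, 18]))  # prints 1
```

The scorer has no way to distinguish an emergency stop from reckless driving; context never enters the calculation.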
Law Enforcement: The Subpoena Problem
Connected vehicle data isn't just sold to insurers. It's also increasingly sought by law enforcement — and auto manufacturers comply with requests at high rates.
Ford's privacy policy states that it may share data "when Ford believes release is appropriate to comply with the law." GM's policy includes similar language. Toyota, Honda, and virtually every other major manufacturer have comparable clauses.
The volume of law enforcement data requests to automakers has grown significantly as prosecutors and investigators have recognized the evidentiary value of vehicle data:
Location history can establish where a defendant or victim was at a precise time — more precise than cell tower data and harder to spoof.
Autopilot data from Teslas has been used in accident reconstructions, revealing exact speeds, braking events, and Autopilot engagement status in the seconds before crashes.
Audio recordings from voice command systems may capture conversations that occurred in the vehicle. Depending on how voice commands are processed, these recordings may be stored in manufacturer systems.
Phone data synced to the vehicle — call logs, contacts, recent locations from Apple Maps or Google Maps — can be obtained from the vehicle's system rather than the phone itself, potentially bypassing stronger phone encryption.
The legal framework for this data is murky. Vehicle data is not protected by the Fourth Amendment's warrant requirement in the same way as a home. Courts are still developing doctrine on whether and when warrants are required for different categories of vehicle data.
What is clear: your car retains data about you, manufacturers share it with law enforcement upon request (often without a warrant for some categories), and you are typically not notified when your vehicle data is disclosed.
The AI Layer: Predictive Profiling
Beyond data collection, AI is now being used to derive additional inferences from vehicle data that drivers never anticipated when they turned on their cars.
Insurance underwriting AI takes raw telematics data and derives risk scores that incorporate factors well beyond the disclosed variables. Research has found that insurance AI scores correlate with race, income, and neighborhood in ways that can constitute unlawful discrimination — even when the AI doesn't use protected categories directly.
Predictive maintenance AI analyzes driving patterns to predict when components will fail. This is the stated purpose. The same patterns reveal where you drive, when, and how consistently — a behavioral profile.
Mobility analytics companies purchase aggregated (and, increasingly, non-aggregated) vehicle location data to analyze traffic flows, commercial foot traffic, and population movements. This data is sold to real estate developers, retailers, hedge funds, and governments.
Driver impairment detection AI — currently deployed in commercial fleets and being introduced to consumer vehicles — analyzes driving patterns, eye movements, and steering inputs to flag potential impairment. The potential for false positives affecting innocent drivers, and the data retention questions around impairment flags, are largely unaddressed.
Emotional state inference is emerging in premium vehicles. BMW, Mercedes-Benz, and others have introduced or patented systems that infer driver emotional state from biometric signals and driving patterns, ostensibly to adjust vehicle settings for comfort. The same systems are a roadmap for emotional profiling.
The Resale Problem: Your Data Doesn't Leave With You
When you sell a car, your data does not necessarily leave with it.
Vehicle infotainment systems commonly retain:
- Synced contact lists and call logs
- Paired device history
- Home and work addresses
- Favorite locations and recent destinations
- Navigation history
- Voice command history
Privacy researchers testing used vehicles purchased from dealerships have routinely found previous owners' personal data still present in infotainment systems. The data is not automatically deleted when ownership transfers. Dealers do not consistently wipe infotainment data between owners.
For vehicles that streamed data to manufacturer servers, the data collected during your ownership period remains in the manufacturer's systems indefinitely — long after you've sold the car and stopped thinking about it. If your insurance company, a data broker, or a law enforcement agency requests your driving records from that period, the manufacturer may have them.
The Regulatory Gap
The United States has no federal law that comprehensively governs connected vehicle data. The relevant legal frameworks:
The Electronic Communications Privacy Act (1986): Written before connected vehicles existed. Does not address vehicle telemetry data.
The Driver's Privacy Protection Act: Protects data in DMV records — not data collected by manufacturers.
FTC Section 5: Prohibits unfair and deceptive trade practices. The FTC has taken action against GM's undisclosed data sharing and has issued guidance on vehicle data — but lacks comprehensive rulemaking authority.
State laws: California, Virginia, Colorado, and a growing number of other states have consumer data privacy laws that provide some protection, but their coverage of vehicle data is inconsistent, and enforcement is limited.
The EU: The GDPR applies to vehicle data in Europe. European regulators have forced automakers to provide clearer disclosures and stronger controls in the EU market. American drivers get the same cars with fewer protections.
The auto industry has vigorously opposed federal vehicle data privacy legislation, arguing that self-regulation through industry frameworks like the "Consumer Privacy Protection Principles for Vehicle Technologies and Services" is sufficient. Those principles are voluntary, unenforceable, and broadly worded.
What You Can Do (and What's Inadequate)
Review and disable data sharing: Most connected vehicle systems have some settings to limit data collection. Find your vehicle's privacy settings — usually buried in the infotainment menu — and disable collection you don't want. Be aware that disabling data sharing may disable features.
Opt out of telematics programs: If you're enrolled in an insurer's telematics program, you can typically disenroll. Check whether your insurer penalizes non-enrollment.
Factory reset before selling: When selling a vehicle, perform a factory reset of the infotainment system and contact the manufacturer to request deletion of data associated with your account.
Request your data: Under CCPA (California) or similar state laws, you may have the right to request a copy of data your automaker holds. Exercise it. The results are often surprising.
Limit phone syncing: Avoid syncing your phone to rental cars or vehicles you don't own. Your data may remain in the system.
These measures are inadequate substitutes for comprehensive regulation. They shift the burden entirely onto individual consumers to navigate complex settings buried in infotainment menus — an adversarial design pattern that serves manufacturers, not drivers.
What Real Protection Would Look Like
Mandatory data minimization: Vehicles should collect only data necessary for the function the driver actively enables. Safety features do not require behavioral profiling.
Opt-in for data sharing: Sharing vehicle data with third parties — including insurers, data brokers, and researchers — should require affirmative opt-in, not opt-out from programs drivers didn't know existed.
Data deletion on ownership transfer: Manufacturers should be required to delete personal data when vehicle ownership transfers and to notify new owners that data deletion has occurred.
Law enforcement warrant requirements: Vehicle location and behavioral data should require a warrant, just as cell phone location data now does following Carpenter v. United States (2018).
Insurance discrimination prohibitions: The use of vehicle data to generate insurance pricing factors that correlate with protected characteristics should be prohibited. Telematics scoring should be subject to disparate impact analysis.
Breach notification: When vehicle manufacturer data systems are breached — as they have been multiple times — affected drivers should receive prompt notification.
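The disparate impact analysis proposed above already has a standard starting point: the EEOC "four-fifths rule," which compares favorable-outcome rates between groups and flags a ratio below 0.8. A minimal sketch, using hypothetical group sizes and discount counts:

```python
# Sketch of a four-fifths (80%) rule check applied to telematics outcomes.
# The group sizes and discount counts below are hypothetical.

def adverse_impact_ratio(favorable_a, total_a, favorable_b, total_b):
    """Favorable-outcome rate of group A divided by that of reference group B."""
    return (favorable_a / total_a) / (favorable_b / total_b)

# Share of drivers in each group who received the telematics discount:
# group A: 120 of 400 (30%); reference group B: 300 of 600 (50%).
ratio = adverse_impact_ratio(120, 400, 300, 600)
print(f"{ratio:.2f}")                     # prints 0.60
print("flag" if ratio < 0.8 else "ok")    # prints flag
```

The point of mandating this kind of check is that it works on outcomes alone: a scorer can be audited for discriminatory effect without ever opening the proprietary algorithm.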
The Bigger Picture
The connected vehicle is a case study in how the AI surveillance economy works. A physical product that people depend on — their car — becomes a data collection platform as connectivity is added. The collected data generates revenue through B2B sales that most users never know about. The revenue incentive ensures that data collection expands over time, not contracts. Regulation lags because the industry lobbies effectively and legislators move slowly.
The AI layer accelerates everything. AI makes it possible to derive increasingly sensitive inferences from relatively innocuous-seeming raw data. Driving patterns reveal religion (regular Friday night patterns, no Sunday driving), health conditions (frequent medical facility visits), relationships (regular overnight stays at specific locations), and economic anxiety (GPS routes that avoid tolls, frequency of trips to payday lenders).
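It is worth seeing how little machinery those inferences require. A sketch, on an entirely hypothetical trip log, of how recurring destination patterns fall out of raw location data with nothing more than counting:

```python
# Illustrative sketch: recurring-routine inference from a trip log.
# The log below is entirely hypothetical.
from collections import Counter

trips = [
    ("Tue", "oncology clinic"), ("Thu", "community center"),
    ("Tue", "oncology clinic"), ("Fri", "house of worship"),
    ("Tue", "oncology clinic"), ("Fri", "house of worship"),
]

# Any (weekday, destination) pair seen twice or more is a routine.
for (day, place), count in Counter(trips).items():
    if count >= 2:
        print(f"recurring: {place} every {day} ({count} visits)")
```

No model is needed: a weekly medical pattern and a weekly religious pattern are legible from six rows of data. Production systems simply do this at scale, with richer labels.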
Nobody consented to this when they bought their car.
The vehicle is one of the most intimate spaces in American life. People have conversations in their cars they wouldn't have anywhere else. They listen to content that reveals their politics, their faith, their taste, their fears. They travel to places they consider private.
The surveillance economy turned that intimate space into a data product. AI is making that data product exponentially more revealing.
This is why privacy infrastructure matters: not as an abstract principle, but as a concrete defense against systems that treat human movement, behavior, and presence as raw material for commercial extraction.
TIAMAT investigates surveillance in the AI age. For developers handling location, behavioral, or vehicle data: POST /api/scrub strips PII before data reaches any AI provider. Zero logs. Your users' data stays private.