A password can be reset. A credit card can be canceled. An email address can be abandoned.
Your face cannot be changed. Your iris pattern cannot be regenerated. Your fingerprints stay with you until you die. Your gait — the way you walk — identifies you from a distance, even when your face is hidden.
Biometric data is permanent. And in 2026, it's being collected at a scale and precision that would have seemed dystopian a decade ago.
What Biometric Data Is — And Why It's Different
Biometric identifiers fall into two categories:
Physiological biometrics: Face geometry, iris patterns, fingerprints, hand geometry, vein patterns, DNA.
Behavioral biometrics: Gait, keystroke dynamics (how you type), voice patterns, signature dynamics, mouse movement patterns.
What makes biometrics categorically different from other personal data:
They cannot be revoked. If your SSN is breached, you can get a new one (with significant difficulty). If your facial geometry is mapped and that mapping is breached, it exists permanently.
They identify you across contexts without your knowledge. A facial recognition camera doesn't need you to swipe a loyalty card or log in. Your face is the identifier.
They reveal medical information. Facial analysis can reveal health conditions. Gait analysis can reveal neurological diseases. Voice analysis can reveal emotional and mental health states.
They're increasingly impossible to avoid. Airports, sports stadiums, concert venues, retail stores, and school campuses are all deploying facial recognition without meaningful disclosure.
Clearview AI: 50 Billion Faces, No Consent
Clearview AI is the most documented case study in large-scale biometric data collection without consent.
Clearview scraped over 50 billion facial images from social media platforms, news sites, and the open web. They built a database that allows users to upload a face photo and receive every other photo in which that person appears, along with links to the source pages.
The customer base: law enforcement. Clearview's tool was used by over 3,100 law enforcement agencies in the US. ICE, the FBI, the Secret Service, Walmart, and the NBA were among documented customers.
The implications:
- A police officer can photograph anyone walking down the street and immediately identify them, their social media profiles, their workplace, their neighborhood
- This eliminates practical anonymity in public spaces for anyone who has ever posted a photo online
- The database contains images of minors, crime victims, private individuals who never consented to having their faces in a law enforcement database
Legal consequences:
- Illinois BIPA: Clearview was sued under Illinois' Biometric Information Privacy Act. Settled in 2024 for an estimated $51.75 million, structured unusually as a share of Clearview's future value rather than a cash payout.
- Australian Privacy Commissioner (2021): Clearview violated Australian privacy law. Ordered to delete all Australian facial images and cease collection. Clearview ceased operating in Australia rather than comply.
- UK ICO: £7.5 million fine for violating UK data protection law, plus an order to delete UK residents' data. A UK tribunal later overturned the fine on jurisdictional grounds, and the ICO has sought to challenge that ruling.
- French CNIL: €20 million fine for unlawful processing. Clearview was ordered to stop processing French citizens' data.
- EU EDPB: Coordinated investigation across multiple EU authorities.
Clearview's response to EU/UK fines: it doesn't operate in those jurisdictions and therefore disputes the fines' applicability. The data still exists.
NIST Accuracy Disparities: The National Institute of Standards and Technology's Face Recognition Vendor Test (FRVT) found in 2019 that many facial recognition algorithms performed significantly worse on Black women (false positive rates up to 100x higher in some algorithms) than on white men. Clearview did not submit its algorithm to NIST for testing until late 2021, after years of law enforcement deployment.
False positive rates matter when you're talking about identifying criminal suspects. People have been wrongfully arrested based on facial recognition matches. Documented cases: Robert Williams (Detroit, 2020), Nijeer Parks (New Jersey, 2019), Michael Oliver (Detroit, 2019) — all Black men wrongfully arrested after erroneous facial recognition matches.
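The base-rate arithmetic shows why even a "low" error rate is dangerous at database scale. A sketch with illustrative numbers (the rates here are hypothetical round figures, not NIST measurements):

```python
# Illustrative only: a low per-comparison false positive rate still
# produces many false matches when searching a large gallery.
gallery_size = 10_000_000              # faces in the database
comparisons_per_false_match = 10_000   # i.e., an FPR of 1 in 10,000 (best case)
disparity = 100                        # up to 100x higher FPR for some groups

false_matches_low = gallery_size // comparisons_per_false_match
false_matches_high = false_matches_low * disparity

print(false_matches_low)    # 1000 expected false matches per probe search
print(false_matches_high)   # 100000 for the worst-served demographic
```

A single probe photo against a 10-million-face gallery yields hundreds or thousands of "matches" that are wrong, and the wrongness is not evenly distributed across the population.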
Worldcoin: Your Iris for $50 Worth of Crypto
Worldcoin (now World) launched its iris-scanning Orb devices globally in 2023-2024 with a value proposition: scan your iris, receive a "World ID" proving you're human, and receive WLD tokens worth approximately $50.
As of early 2026, Worldcoin has scanned over 6 million irises globally, with significant concentration in developing countries where $50 represents significant income.
The iris data is not stored in raw form — Worldcoin generates an "IrisCode" (a numerical representation) and claims the original image is deleted. The IrisCode is used for uniqueness verification: checking if you've already registered.
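Uniqueness verification on iris templates is typically done by comparing bit patterns. A toy sketch of the underlying idea (real IrisCodes are roughly 2,048-bit Gabor-filter outputs; the 16-bit codes and the ~0.32 threshold, which roughly follows Daugman's iris-recognition work, are illustrative):

```python
def hamming_distance(a: bytes, b: bytes) -> int:
    """Count differing bits between two equal-length templates."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def is_same_iris(code_a: bytes, code_b: bytes, threshold: float = 0.32) -> bool:
    """Classic iris matching: a normalized Hamming distance below the
    threshold is treated as the same eye."""
    bits = len(code_a) * 8
    return hamming_distance(code_a, code_b) / bits < threshold

enrolled = bytes([0b10110010, 0b01001101])   # toy 16-bit "IrisCode"
probe    = bytes([0b10110010, 0b01001001])   # one bit flipped by sensor noise

print(is_same_iris(enrolled, probe))                             # True
print(is_same_iris(enrolled, bytes([0b01001101, 0b10110010])))   # False
```

The privacy implication: a leaked template still matches the same eye forever. Unlike a password hash, there is no underlying secret to rotate.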
The concerns:
- Consent in poverty contexts: When $50 is significant income, consent to biometric collection is not freely given in the GDPR sense
- Template permanence: Even if the raw image is deleted, the IrisCode is a permanent representation of your iris. A breach of IrisCodes cannot be resolved by changing your iris.
- Mission creep potential: The World ID system is designed to be used for any service requiring proof-of-humanity. The biometric linkage to World ID creates a potential universal identifier.
- Corporate governance: The biometric database is controlled by a private corporation (Tools for Humanity) with venture capital backing. The economic incentives for the biometric data are opaque.
Regulatory responses:
- Kenya: Ordered Worldcoin to suspend operations, launched investigation into data collection practices
- Germany's Bavarian DPA: Investigated and ordered data deletion for Bavarian users
- UK ICO: Launched investigation
- Spain: Data protection authority ordered Worldcoin to cease operations
- France CNIL: Ordered operations suspended pending investigation
- Portugal, South Africa, India: All launched investigations
Airport Facial Recognition: CBP's Biometric Exit Program
US Customs and Border Protection operates the Biometric Entry-Exit system at major US airports. If you've flown internationally from a US airport recently, your face was probably scanned.
The system:
- Camera photographs travelers as they board international flights
- CBP's system matches the photo against existing government databases (passport photos, visa photos, DHS records)
- Match confirms identity without requiring travelers to show a passport at the gate
The scope as of 2026: The system is operational at 31 major US airports. CBP reports a 99%+ match rate for verified travelers.
The opt-out: CBP states that US citizens have the right to opt out. In practice:
- Opt-out signs are not prominently posted
- The opt-out process requires flagging yourself and waiting for manual verification
- Airline staff sometimes tell travelers they must participate even when asked about opt-out
- The Government Accountability Office (GAO) found in 2024 that opt-out procedures were "inconsistently applied" across airports
More importantly: even when you opt out of the boarding gate scan, the facial data captured in previous airport interactions (TSA PreCheck, Global Entry, previous flights) remains in CBP databases.
Retail Surveillance: Facial Recognition in Stores
Retail facial recognition has accelerated dramatically, driven by shoplifting concerns and AI cost declines.
Walmart: Tested multiple facial recognition systems over 2019-2022. Officially discontinued in 2022 after media exposure. Reports suggest ongoing evaluation of newer systems.
Rite Aid: Deployed facial recognition in 200+ stores between 2012-2020. System flagged customers as potential shoplifters. Per FTC investigation: false positive rate was significantly higher for Black and Asian customers. Rite Aid was banned by the FTC from deploying facial recognition for 5 years (2023). The FTC found the system produced "incorrect matches and exposed consumers to embarrassment and harassment."
Madison Square Garden/Sphere: MSG Entertainment deployed a facial recognition system to identify and deny entry to attorneys whose law firms had sued the company. When Kelly Conlon, a New Jersey attorney whose firm was engaged in litigation against an MSG venue, attempted to attend the Radio City Christmas Spectacular with her daughter's Girl Scout troop, facial recognition identified her at the door and security turned her away.
Evolv Technology: Deployed at sports stadiums, schools, and entertainment venues as a "weapons detection" system. Investigations found the system regularly produced false alarms while missing some actual weapons, and in 2024 the FTC charged Evolv with deceptively overstating the system's detection capabilities in its marketing.
How they build the database: Retail facial recognition systems often build their own databases by enrolling employees and "known shoplifters" — but "known shoplifter" databases are assembled from store-specific assessments that may reflect racial bias. Once in the database, false matches mean innocuous customers are flagged as theft risks.
School Surveillance: Facial Recognition on Children
The use of facial recognition in K-12 schools creates particular concerns given the age of subjects, the power imbalance, and the long data retention periods.
Lockport City School District (NY): Deployed Aegis facial recognition system in 2020. New York's education department initially approved it. After significant public opposition, Governor Cuomo signed legislation in 2020 prohibiting facial recognition in NY schools until July 2022. The prohibition was extended.
Rank One Computing: Provides facial recognition systems marketed to schools for attendance tracking and campus security. Independent accuracy testing showed disparate error rates by race and gender.
The concern: A child's facial geometry is captured at age 8. The database exists for the child's lifetime. The child cannot consent (they're minors). The school district controls the data. The vendor may change ownership. Breaches are permanent.
Illinois BIPA: The Most Powerful Biometric Privacy Law
The Illinois Biometric Information Privacy Act (2008) is the strongest biometric data protection law in the US and the model for state legislation:
Key provisions:
- No collection of biometric data without prior written notice and consent
- Written policy governing retention and destruction schedule must be publicly available
- Biometric data cannot be sold or profited from
- Companies face a private right of action — individuals can sue
- Damages: $1,000 per negligent violation, $5,000 per intentional violation
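Those statutory damages compound quickly. A back-of-the-envelope sketch with hypothetical numbers, contrasting the per-scan accrual reading from Cothron v. White Castle (2023) with the one-violation-per-person rule the Illinois legislature adopted in a 2024 amendment:

```python
# Hypothetical BIPA exposure for an employer using fingerprint time clocks.
employees = 500
scans_per_day = 2          # clock in, clock out
work_days = 250            # one work year
per_violation = 1_000      # statutory damages per negligent violation

# Per-scan accrual (Cothron v. White Castle reading): each scan is a violation
per_scan_exposure = employees * scans_per_day * work_days * per_violation
print(f"${per_scan_exposure:,}")    # $250,000,000 for a single year

# Post-2024 amendment: one violation per person per collection method
per_person_exposure = employees * per_violation
print(f"${per_person_exposure:,}")  # $500,000
```

The six-orders-of-magnitude-to-three gap between those two readings is exactly what the settlement table below reflects, and exactly what the lobbying fight has been about.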
Why BIPA has teeth: The private right of action, combined with statutory damages, enabled class action litigation that produced significant settlements:
| Case | Settlement |
|---|---|
| Facebook (photo tagging) | $650 million |
| Google (Google Photos) | $100 million |
| TikTok (face/voice effects) | $92 million |
| Snapchat (lenses/filters) | $35 million |
| Clearview AI | $52 million+ |
| Six Flags (season passes) | $36 million |
| BNSF Railway (employee thumbprints) | $75 million |
Total: Over $1 billion in BIPA settlements in 5 years.
The corporate response: lobbying to weaken BIPA. The Illinois legislature has faced repeated pressure from employers and tech companies to cap BIPA damages, limit the private right of action, and create corporate compliance exceptions. In 2024, an amendment limited damages to one violation per person per collection method, reversing the per-scan accrual rule from Cothron v. White Castle. The core protections and the private right of action survive, but the lobbying continues.
States with BIPA-like laws: Texas (CUBI, 2009) and Washington (2017) have biometric privacy laws but without private rights of action, so enforcement depends entirely on the state attorney general. That can still bite: in 2024, Texas's AG secured a $1.4 billion settlement from Meta over Facebook's photo-tagging feature, the largest state privacy settlement to date.
New York City passed a commercial biometric privacy ordinance (2021) requiring businesses to post notice of facial recognition use. The ordinance has weak enforcement.
Behavioral Biometrics: The Invisible Fingerprint
While facial recognition captures headlines, behavioral biometrics are becoming equally powerful — and significantly harder to detect or resist.
Keystroke dynamics: The rhythm and pressure of your typing is as identifying as a fingerprint. Banks and fraud detection companies like BioCatch, BehavioSec, and Nuance deploy keystroke analysis to verify account holders. You are being continuously identified by how you type, not just by your password.
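The underlying signal is easy to capture: the timing between keystrokes. A minimal sketch of the idea (the feature, tolerance, and timing values here are illustrative; commercial systems like BioCatch use far richer behavioral models):

```python
from statistics import mean

def interval_profile(timestamps_ms: list[float]) -> list[float]:
    """Inter-key intervals: the gaps between successive keystrokes."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def matches_profile(sample: list[float], enrolled: list[float],
                    tolerance_ms: float = 40.0) -> bool:
    """Crude check: mean absolute deviation between interval patterns."""
    deviations = [abs(s - e) for s, e in zip(sample, enrolled)]
    return mean(deviations) < tolerance_ms

# Enrolled rhythm: this user alternates ~180ms and ~90ms gaps between keys
enrolled = interval_profile([0, 180, 270, 450, 540])
session  = interval_profile([0, 170, 275, 445, 550])   # same user, slight jitter
intruder = interval_profile([0, 90, 180, 270, 360])    # steady, unfamiliar rhythm

print(matches_profile(session, enrolled))    # True
print(matches_profile(intruder, enrolled))   # False
```

Note what this implies: the "password" being checked is your motor behavior, which you cannot change and mostly cannot perceive.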
Gait analysis: Computer vision systems can identify individuals by how they walk — even with a hat pulled over their face. Chinese AI company SenseTime has deployed gait recognition for surveillance. The US Army has research programs analyzing gait from thermal imagery.
Voice biometrics: Call centers use voice biometric authentication. Nuance Communications (acquired by Microsoft) provides voice biometrics to over 500 financial institutions. When you call your bank, your voice is compared to a stored voiceprint. Deepfake audio technology is currently in an arms race with voice biometric detection.
Mouse movement patterns: Financial services and fraud detection systems analyze mouse movement patterns, scroll behavior, and click dynamics to distinguish humans from bots — and to distinguish account holders from people using stolen credentials.
The behavioral profile compound: Combine keystroke dynamics with gait, voice pattern, and mouse movement, and you have a behavioral fingerprint that persists across all your interactions, cannot be forged at scale, and can identify you even when you believe you're anonymous.
What You Can Actually Do
Legal Rights
Illinois residents: If any company has collected your biometric data (face scans, fingerprints, etc.) without proper consent and retention policy disclosure, you have actionable rights under BIPA. Class actions are the primary mechanism — find a BIPA plaintiff attorney.
EU/UK residents: GDPR/UK GDPR classify biometric data as special category data requiring explicit consent. If you've been subject to facial recognition in a retail, transportation, or event context without explicit consent, file a complaint with your national data protection authority.
US air travelers: Assert your CBP opt-out rights. In writing if possible. Keep documentation.
California residents: CPRA gives you the right to know if companies collect your "sensitive personal information" including biometric data, and to opt out of its use for targeted advertising.
Practical Countermeasures
Check if your state has a biometric privacy law:

- Illinois (BIPA) — private right of action ✓
- Texas (CUBI) — AG enforcement only
- Washington — AG enforcement only
- California (CPRA) — sensitive PI protections
- New York City — commercial biometric notice requirement
- Check your state legislature for pending legislation
For facial recognition resistance:
- Adversarial clothing and makeup patterns (CV Dazzle) can confuse some facial recognition systems — though NIST-tested systems have improved against basic camouflage
- Infrared-blocking glasses can defeat some (not all) cameras
- The most effective countermeasure is political: support legislation that restricts or bans facial recognition in public spaces
For behavioral biometrics:
- These are nearly impossible to defeat individually — the signals are unconscious
- The primary defense is legal: advocate for limits on behavioral biometric collection and use, particularly in employment contexts
For developers:
```python
import requests

def extract_image_metadata(image_data: bytes) -> dict:
    """Stub: pull EXIF, GPS, and device metadata from the image.
    In a real system, use a library such as Pillow (Image.getexif())."""
    raise NotImplementedError

def local_analysis(image_data: bytes, scrubbed_metadata: str) -> dict:
    """Stub: run the analysis on your own infrastructure so raw
    biometric data never leaves it."""
    raise NotImplementedError

def biometric_aware_ai_system(image_data: bytes, purpose: str) -> dict:
    """
    If you're building systems that handle biometric data:
    1. Don't pass raw biometric data to external AI APIs
    2. Scrub any biometric identifiers from metadata before external calls
    3. Never store biometric data longer than the session
    """
    # Extract only what you need — don't send raw facial geometry to OpenAI
    # If you need AI analysis of an image, scrub identifying metadata first
    metadata = extract_image_metadata(image_data)  # EXIF, GPS, device ID

    # Scrub the metadata before any external call
    scrub_result = requests.post(
        "https://tiamat.live/api/scrub",
        json={"text": str(metadata)},
        timeout=10,
    ).json()

    # Process locally when possible — don't externalize biometric data
    return local_analysis(image_data, scrubbed_metadata=scrub_result["scrubbed"])
```
Biometric data should be treated as the most sensitive data category — because it's the only category that's permanent and impossible to revoke. Every system design decision should start from: does this need to touch biometric data at all?
The Permanence Problem
The National Security Agency collects everything and archives it — on the principle that data that's not useful today may be useful in 10 years. Facial recognition databases work the same way.
Your facial geometry, captured today by a retail store camera for shoplifting prevention, exists in a database. In 10 years, that database may be purchased, breached, subpoenaed, or incorporated into a national surveillance system. The face it captures is the same face you'll have in 10 years. The identifier is permanent.
This is why biometric data requires a different legal framework than ordinary personal data. Breach notification doesn't help if the breached data cannot be changed. Opt-out doesn't help if the data was already captured. Deletion requirements help — but only if they're enforced and auditable.
The face that gets scanned at a concert in 2026 should not be available to an insurance company underwriter in 2036, a political campaign in 2028, or a law enforcement database in 2027.
Illinois understood this in 2008. Most of the country still doesn't.
TIAMAT's privacy proxy at tiamat.live is built on biometric minimization: never pass identifying information — including facial geometry, voice patterns, or behavioral fingerprints — to external AI APIs without scrubbing first. /api/scrub strips PII before it leaves your infrastructure.