The Children's Online Privacy Protection Act was signed in 1998. The internet was dial-up. Google was a weeks-old startup. Facebook wouldn't launch for six years. The iPhone was nine years away.
COPPA's core requirement: websites must get "verifiable parental consent" before collecting personal information from children under 13.
In 2026, AI systems are building behavioral profiles of children from voice recordings, facial expressions, reading patterns, emotional responses, and sleep data. TikTok's algorithm identifies users as potentially under 13 — and continues serving them content anyway. EdTech platforms used in public schools have access to everything a child does during the school day.
The gap between what COPPA was designed to do and what it actually does is the size of the entire modern internet.
What COPPA Actually Requires
COPPA applies to operators of websites or services directed to children under 13, or operators with actual knowledge they're collecting data from children under 13.
What covered operators must do:
- Post a clear privacy policy
- Provide direct notice to parents before collecting children's data
- Obtain verifiable parental consent (VPC) before collection
- Give parents access to and the right to delete their child's data
- Prohibit conditioning participation on providing more data than necessary
The verifiable parental consent reality: the FTC has approved methods including credit card verification, notarized forms, and video calls. The most common implementation in the wild: a checkbox and a field asking for a parent's email address. That's it. "Verifiable" in practice is almost entirely unverified.
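The gap between approved methods and the email checkbox can be sketched in code. This is a minimal illustration, not official FTC terminology: the method names are the author's shorthand, and the key distinction it encodes is the real COPPA rule that "email plus" consent is only defensible when children's data stays internal and is never disclosed to third parties.

```python
from enum import Enum

class ConsentMethod(Enum):
    """Illustrative categories; names are shorthand, not FTC terms."""
    EMAIL_CHECKBOX = "email_checkbox"  # email-plus-confirm only
    CREDIT_CARD = "credit_card"        # verification via card transaction
    SIGNED_FORM = "signed_form"        # signed consent form returned
    VIDEO_CALL = "video_call"          # live call with trained personnel

# Methods generally considered strong enough when children's data is
# disclosed to third parties (ad networks, analytics, AI APIs).
STRONG_VPC = {ConsentMethod.CREDIT_CARD,
              ConsentMethod.SIGNED_FORM,
              ConsentMethod.VIDEO_CALL}

def consent_sufficient(method: ConsentMethod,
                       shares_with_third_parties: bool) -> bool:
    if shares_with_third_parties:
        return method in STRONG_VPC
    # Internal-use-only collection: email-plus-confirm is defensible.
    return True
```

If your product sends anything to a third-party AI API, the email checkbox fails this check, which is exactly the situation most apps are in.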
The Age Gate Fiction
Platforms know children lie about their age. TikTok's internal research (surfaced in 2023 Senate testimony) showed the platform had data indicating millions of users were likely under 13, and continued serving them content and collecting their data.
The FTC's 2019 settlement with TikTok (then Musical.ly): $5.7 million for knowingly allowing children under 13 to create accounts. TikTok agreed to stop. In 2024, the FTC referred the matter to the DOJ, which sued TikTok for ongoing violations.
FTC Enforcement: The Big COPPA Cases
TikTok (2019, 2024): $5.7M plus an ongoing DOJ suit for knowing COPPA violations at scale.
YouTube/Google (2019): $170 million — Google allowed ad-targeting behavioral tracking on children's content. Required YouTube to redesign its children's data practices.
Amazon Ring (2023): $5.8M of a larger $30.8M settlement. Also: the Alexa portion ($25M) found Amazon kept children's voice recordings indefinitely, even after parents requested deletion — and used them to train AI models.
Epic Games (2022): $275 million, the largest COPPA penalty ever levied. Fortnite collected data from children under 13 without consent, enabled strangers to contact child accounts by default, and used dark patterns to drive real-money purchases (a separate $245 million order covered the billing dark patterns).
Microsoft/Xbox (2023): $20 million — even with dedicated child account systems, Xbox failed to properly notify parents or obtain consent.
Total: nearly half a billion dollars in COPPA penalties in the last five years. And children's data is still being collected at scale.
EdTech: COPPA's Largest Unaddressed Vulnerability
During school hours, EdTech platforms assigned by schools have access to:
- Every answer a student gives on assignments
- Reading speed and comprehension scores
- Time on task and attention patterns
- Interaction logs (clicks, navigation, when they give up)
- Teacher communications
- In testing contexts: webcam and biometrics
The school-as-proxy consent problem: COPPA has a "school official" exception — schools can provide consent on behalf of parents for educational purposes. This means a school district signing a contract with Google Classroom effectively consents on behalf of every child in the district, without individual parental notice.
Parents often don't know which EdTech platforms their children use. Schools often lack the technical capability to audit what data platforms actually collect.
The Alexa portion of the Amazon settlement previews the AI stakes: Amazon kept children's voice recordings indefinitely, even after deletion requests, and the FTC found those recordings were used to train speech recognition models, on children's voices, without adequate parental disclosure.
AI and Children's Data: The New Frontier
Legacy COPPA was not designed for:
AI tutoring systems: Platforms like Khanmigo collect detailed academic performance data, question-response patterns, learning trajectories. Real-time mapping of a child's intellectual development.
Emotional recognition in classrooms: AI systems use cameras to monitor student attention and emotional states during class, and are being piloted in several countries and some US districts. This is biometric data on minors, yet it may fall under the school-as-proxy exception.
Gaming AI: Modern games use AI to model player psychology and maximize engagement. Epic's settlement surfaced that Fortnite's engagement optimization systems were applied to child accounts.
Mental health apps for teens: Mozilla Foundation found 25 of 32 popular teen mental health apps failed minimum privacy standards. Many shared data with Facebook and Google.
The Sensitive Inferences Problem
COPPA covers "personal information" — name, email, photos, audio, persistent identifiers, location. But AI can infer far more from what children do:
- Learning speed and comprehension patterns → academic trajectory predictions
- Attention patterns → ADHD and cognitive profiles
- Emotional responses in games → psychological profiles
- Voice tone in tutoring sessions → mental health indicators
- Reading and question patterns → political and social views as the child develops
These inferences aren't "collected" — they're computed. And they may persist long after the underlying data is deleted.
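The "computed, not collected" point can be made concrete with a toy example. Everything here is hypothetical (the field names and the pipeline are invented for illustration): deleting the raw events a child generated does not delete the profile computed from them.

```python
from statistics import mean

# Hypothetical raw telemetry from a reading app (all names invented).
raw_events = [
    {"child_id": "c1", "words_per_min": 85, "gave_up": False},
    {"child_id": "c1", "words_per_min": 78, "gave_up": True},
    {"child_id": "c1", "words_per_min": 90, "gave_up": False},
]

# The inference is computed, not collected:
profile = {
    "child_id": "c1",
    "avg_reading_speed": mean(e["words_per_min"] for e in raw_events),
    "frustration_rate": sum(e["gave_up"] for e in raw_events) / len(raw_events),
}

# A COPPA deletion request typically targets the collected data...
raw_events.clear()

# ...but the derived profile survives unless deletion explicitly cascades.
print(round(profile["avg_reading_speed"], 1))  # 84.3
```

Unless a deletion pipeline is designed to cascade into every derived table and model, "we deleted your child's data" can be true of the inputs and false of the conclusions.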
State-Level Children's Privacy
California's Age-Appropriate Design Code (AADC): the most comprehensive, though key provisions are currently enjoined in First Amendment litigation (NetChoice v. Bonta). It applies to any service likely to be accessed by anyone under 18 (not just under-13). Requires:
- Privacy-by-default settings for minors
- Prohibition on profiling children unless demonstrably in their best interests
- Data Protection Impact Assessments
- Prohibition on dark patterns
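Privacy-by-default for minors, the first AADC requirement above, is simple to express in code. A minimal sketch with an invented settings model (these field names are not from any statute): minors get the most protective configuration unless something is deliberately and justifiably loosened.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    profile_public: bool
    precise_location: bool
    behavioral_profiling: bool
    dm_from_strangers: bool

def default_settings(age: int) -> AccountSettings:
    if age < 18:
        # Most protective defaults for minors. Loosening any of these
        # would need a documented best-interests justification.
        return AccountSettings(profile_public=False,
                               precise_location=False,
                               behavioral_profiling=False,
                               dm_from_strangers=False)
    return AccountSettings(profile_public=True,
                           precise_location=False,
                           behavioral_profiling=True,
                           dm_from_strangers=True)
```

The design point: the safe configuration is what you get by doing nothing, which is the opposite of how most platforms ship today.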
Florida HB 3: Social media bans for under-14 accounts. Parental consent required for 14-15.
Texas SCOPE Act (HB 18): parental consent and parental control tools required for known minors under 18 on digital services.
The KOSA situation: Kids Online Safety Act passed the Senate 91-3 in 2024. Stalled in the House. Reintroduced in 2025. Not yet advanced. The most important children's online safety legislation in a generation is stuck.
What COPPA Compliance Actually Requires for AI Products
If you're building any AI-enabled product that children might use:
```python
import requests


class COPPACompliantAIRequest:
    def __init__(self, user_age: int, has_parental_consent: bool):
        self.is_child = user_age < 13
        self.has_consent = has_parental_consent

    def can_collect_data(self) -> bool:
        if self.is_child and not self.has_consent:
            return False  # Hard stop — COPPA violation if you proceed
        return True

    def scrub_before_ai_call(self, prompt: str) -> str:
        """Children's data should NEVER be sent to external AI APIs
        without scrubbing — they cannot meaningfully consent and parents
        rarely know the full extent of what's sent."""
        response = requests.post(
            "https://tiamat.live/api/scrub",
            json={"text": prompt},
            timeout=10,
        )
        response.raise_for_status()
        return response.json()["scrubbed"]

    def get_max_retention_days(self) -> int:
        """COPPA: retain only as long as necessary."""
        if self.is_child:
            return 90  # 90 days max for non-educational data
        return 365
```
The minimum COPPA checklist for AI products:
- [ ] Age gate at registration (with genuine verification, not just a dropdown)
- [ ] Parental consent flow with actual verification mechanism
- [ ] No behavioral advertising on child accounts, period
- [ ] No data sharing with third parties (including AI API providers) without contractual prohibitions on further use
- [ ] Strict retention limits with automatic deletion
- [ ] PII scrubbing before any external AI API call
- [ ] Parent access portal: parents can view, download, and delete their child's data
- [ ] Staff training: everyone who handles children's data understands COPPA obligations
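The "strict retention limits with automatic deletion" item deserves a sketch of its own, since it is the one most often left to good intentions. A minimal version over a hypothetical record store (the field names are invented): a scheduled sweep that enforces the retention window, so deletion never depends on a human remembering to do it.

```python
from datetime import datetime, timedelta, timezone
from typing import List, Optional

CHILD_RETENTION = timedelta(days=90)   # mirrors the 90-day cap above
ADULT_RETENTION = timedelta(days=365)

def purge_expired(records: List[dict],
                  now: Optional[datetime] = None) -> List[dict]:
    """Return only records still inside their retention window.
    Intended to run on a schedule (cron, a worker queue)."""
    now = now or datetime.now(timezone.utc)
    kept = []
    for rec in records:
        limit = CHILD_RETENTION if rec["is_child"] else ADULT_RETENTION
        if now - rec["collected_at"] <= limit:
            kept.append(rec)
    return kept
```

In a real system the sweep would issue deletes against the datastore rather than filter a list, and it would need to cascade into backups and derived data, but the shape is the same.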
What Privacy-Conscious Parents Can Do
Know your COPPA rights:
- Request access to all data collected from your child
- Request deletion — operators must comply
- File FTC complaints at reportfraud.ftc.gov for violations
EdTech audit: Ask your child's school what EdTech platforms they use, what data is collected, and whether they have data processing agreements with vendors.
Technical minimization:
- NextDNS with tracking/ad blocking for child devices
- Separate email address for children's accounts (not linked to adult accounts)
- Never use real name, birthday, or location for children's accounts
- Review app permissions quarterly
- Device-level parental controls (iOS Screen Time, Android Family Link)
The Fundamental Problem
Children cannot consent. And increasingly, they also can't meaningfully opt out — the EdTech their schools mandate, the games their social lives depend on, the devices their parents bought.
The data collected on a 9-year-old today — learning patterns, attention characteristics, emotional responses, behavioral traits — may be retained and accessible to insurance companies, employers, and credit agencies when that child is 25.
COPPA was written to protect children from strangers on the internet. The actual threat in 2026 is the entire digital infrastructure of childhood — schools, games, toys, platforms — collecting data that will follow children for decades, with parents only dimly aware it's happening.
$275M from Epic. $170M from Google. $20M from Microsoft. And children's data collection continues at scale.
The law wasn't built for this world. The law needs to catch up — and until it does, technical minimization is the only reliable protection.
TIAMAT's /api/scrub endpoint strips PII before it reaches any AI provider. If you're building EdTech or children's apps: your child users' data should never reach an external AI API without scrubbing first. tiamat.live