The enforcement trend is clear. Epic Games: $275 million. Microsoft (Xbox): $20 million. YouTube: $170 million. These settlements are the FTC's signal that COPPA enforcement is no longer theoretical.
If your platform has users (or could have users) under 13 — a game, a forum, an educational tool, a community app — COPPA applies to you. "We don't target kids" is not a defense once you have actual knowledge of under-13 users. In 2026, between state-level laws proliferating and the FTC's increased scrutiny of the gaming sector specifically, the "too small to worry about it" era is over.
This post covers what COPPA actually requires and how to implement it as engineering infrastructure — not legal theory, but the specific systems you need to build.
What COPPA actually covers
COPPA (Children's Online Privacy Protection Act, 15 U.S.C. § 6501 et seq.) applies to two categories of operators:
- Online services "directed to children under 13" — determined by the FTC based on subject matter, visual content, use of animated characters, music, and similar factors.
- Any operator with "actual knowledge" that they're collecting personal information from children under 13.
The "actual knowledge" standard is the one that catches most platforms. If a user tells you they're 10, you have actual knowledge. If your platform's content obviously attracts children (Minecraft mods, Roblox plugins, educational tools), the FTC may find constructive knowledge even without explicit statements.
Personal information under COPPA is broader than most developers expect. It includes: name, address, email, phone number, screen name that can identify a child, persistent identifiers (device IDs, cookies), geolocation data, photos or video or audio with a child's image or voice, and any information combined with the above that allows individual identification.
The 5 core COPPA obligations
1. Verifiable parental consent before collection
Before collecting, using, or disclosing any personal information from a child under 13, you must obtain verifiable consent from the parent. "Verifiable" means using a method reasonably calculated to ensure the person providing consent is actually the parent.
Critically: you cannot collect personal information from an under-13 user and then seek consent afterward. Consent must precede collection.
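One way to make "consent precedes collection" an invariant rather than a policy document is to enforce it at the code path that writes PII. The sketch below is illustrative, not from any particular framework: `VERIFIED_CONSENT` is a hypothetical in-memory stand-in for whatever durable store your parental consent flow writes to, and `save_profile` is a made-up handler.

```python
from functools import wraps

class ConsentRequired(Exception):
    """Raised when PII collection is attempted without verified parental consent."""

# Hypothetical in-memory registry; a real system would back this with the
# durable record written by the parental consent flow.
VERIFIED_CONSENT: set[str] = set()

def requires_parental_consent(handler):
    """Refuse any PII-collecting handler for under-13 users without consent on file."""
    @wraps(handler)
    def wrapper(user_id: str, is_under_13: bool, **pii):
        if is_under_13 and user_id not in VERIFIED_CONSENT:
            # COPPA: consent must precede collection, so refuse -- never
            # collect now and queue a consent request for later.
            raise ConsentRequired(f"no verified consent on file for {user_id}")
        return handler(user_id, is_under_13, **pii)
    return wrapper

@requires_parental_consent
def save_profile(user_id, is_under_13, **pii):
    # Stand-in for a real persistence call.
    return {"user_id": user_id, **pii}
```

The point of the decorator shape is that a handler without the guard is visibly missing it in code review, which is easier to audit than scattered `if` checks.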
2. A COPPA-compliant privacy policy
Your privacy policy must clearly describe what personal information you collect from children, how you use it, whether you disclose it to third parties (and who those parties are), and how parents can review and delete their child's information. Linking to a general privacy policy buried in your footer is not sufficient.
3. Data minimization
You may not condition a child's participation in an activity on collecting more personal information than is reasonably necessary. If a child wants to play a game, you cannot require their birthdate, phone number, and photo as a condition of participation.
4. The parental dashboard: review, delete, and withdraw consent
Parents must be able to review what personal information you've collected about their child, delete it, and withdraw consent (with deletion following withdrawal). This requires a parental identity verification flow and a dashboard with real deletion capability — not soft deletion, but actual erasure from your systems and backups with a documented retention schedule.
5. Data security
You must maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from children. The FTC expects encryption in transit and at rest, access controls, and documented security practices.
The age determination problem
This is the hardest engineering problem in COPPA compliance — and where most platforms fail.
Self-declaration ("enter your age") does not work. Children know to lie about their age. Self-declaration provides no protection against actual knowledge. If your platform attracts children and you're relying on users to truthfully report their age, you're exposed.
Age gates are a speed bump, not a barrier. Asking users to confirm they're 13+ before account creation provides minimal legal cover and no actual protection.
What actually works:
Option A: Default to children-first design. Implement COPPA's protections for all users. No personal information collection, no behavioral tracking, parental consent for account creation. This eliminates the classification problem at the cost of reduced functionality for adult users.
Option B: Age-neutral data collection. Don't collect personal information that would trigger COPPA from anyone. Increasingly common for smaller platforms.
Option C: Age verification gate with parental consent flow. Users who provide a birthdate indicating under-13 are routed to a parental consent flow before any personal information is collected. This works only if you treat any indication of under-13 status as a trigger — including a user who claims to be 13+ at signup and later says they're 11 while using the platform.
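The routing in Option C reduces to a small, testable decision function. This is a minimal sketch under assumed names (`route_signup`, `parental_consent_flow` are hypothetical); the one real subtlety it captures is computing whole-year age correctly around the birthday boundary.

```python
from datetime import date

COPPA_AGE = 13

def age_on(birthdate: date, today: date) -> int:
    """Whole years elapsed between birthdate and today."""
    years = today.year - birthdate.year
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1  # birthday hasn't occurred yet this year
    return years

def route_signup(birthdate: date, today: date) -> str:
    """Route under-13 signups to the consent flow before any PII is stored."""
    if age_on(birthdate, today) < COPPA_AGE:
        return "parental_consent_flow"   # no personal information collected yet
    return "standard_signup"
```

Keep this decision server-side: a client-side age gate can be bypassed by editing the request, and the birthdate itself should not be persisted for the under-13 branch until consent is verified.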
Verifiable parental consent: 6 FTC-approved methods
The FTC's COPPA Rule (16 C.F.R. § 312.5(b)(2)) approves these consent mechanisms:
1. Signed consent form. Email the form, require a signed copy returned by mail, fax, or scan. Low conversion, high friction, legally bulletproof.
2. Credit or debit card verification. Use a card transaction with real-time notification to the cardholder. The assumption: only adults hold credit cards. Commonly implemented via Stripe with a $0.50 hold that's immediately refunded.
3. Toll-free number staffed by trained personnel. Parent calls, speaks with a human who verifies consent. High cost, doesn't scale, but explicitly FTC-approved.
4. Video conference. Live video session with trained staff. Same scaling constraints.
5. Government-issued photo ID with destruction guarantee. Parent submits ID, you verify age and destroy the image. High friction, significant data liability.
6. Knowledge-based authentication. The parent answers questions about their financial history or public records (similar to bank ID verification). Services like LexisNexis Risk Solutions provide these flows.
For internal operations only (no disclosure to third parties, no behavioral profiling), COPPA allows a simplified path: email plus confirmation. This is the lowest-friction compliant option for read-only or minimal-data platforms.
Practical recommendation for most indie platforms: implement credit card verification (Option 2) as your primary path, with signed form as fallback. Defensible consent with reasonable UX friction.
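Whichever method you choose, consent is a lifecycle, not a boolean: it starts pending, becomes verified through one of the approved mechanisms, and can be withdrawn (which must trigger deletion). A small state machine makes illegal transitions fail loudly. This is an illustrative sketch — the state and event names are assumptions, and the actual card verification would happen out-of-band through your payment processor before the `card_verified` event is emitted.

```python
from enum import Enum, auto

class Consent(Enum):
    PENDING = auto()     # parent identified, consent not yet verified
    VERIFIED = auto()    # e.g. card transaction or signed form confirmed
    WITHDRAWN = auto()   # parent revoked; deletion flow must follow

# Legal transitions only; anything else is a bug, not a state change.
TRANSITIONS = {
    (Consent.PENDING, "card_verified"): Consent.VERIFIED,
    (Consent.PENDING, "form_received"): Consent.VERIFIED,
    (Consent.VERIFIED, "withdrawn"): Consent.WITHDRAWN,
}

def advance(state: Consent, event: str) -> Consent:
    """Apply an event to the consent state, rejecting illegal transitions."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"illegal consent transition: {state.name} + {event!r}")
```

In practice you'd persist every transition with a timestamp and the verification method used — that audit trail is what you show the FTC if a consent decision is ever questioned.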
Data minimization in practice
Separate your data pipelines. Under-13 users must have data flows that exclude analytics, behavioral tracking, advertising profiling, and any third-party data sharing. If you're using a third-party analytics SDK, that SDK must not receive under-13 users' data — which means instrumenting your analytics to exclude COPPA-protected users, not just abstractly promising not to share their data.
Persistent identifiers are personal information. Device IDs, advertising IDs, session tokens that persist across sessions — all are personal information under COPPA when associated with a known or suspected child user. Your under-13 user architecture may need different identifier strategies than your adult user architecture.
Session isolation. Conversation data, gameplay data, and behavioral signals from under-13 users have different retention obligations. Build these as separate data classes with separate deletion triggers from the start.
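The pipeline separation above can be enforced at the instrumentation layer: protected users' events never reach the SDK call, and their session identifiers never persist across sessions. A minimal sketch, with assumed names throughout (`COPPA_PROTECTED` stands in for your age-classification lookup, and `sink` for whatever the analytics vendor's client exposes):

```python
import hashlib
import secrets

# Hypothetical registry of user_ids classified as under 13.
COPPA_PROTECTED: set[str] = set()

def emit_analytics(event: dict, sink: list) -> bool:
    """Forward an event to the analytics sink only for non-protected users.

    Exclusion happens at instrumentation time: protected users' events never
    leave the process, rather than being filtered downstream by the vendor.
    Returns True if the event was forwarded.
    """
    if event["user_id"] in COPPA_PROTECTED:
        return False  # dropped before any SDK call is made
    sink.append(event)
    return True

def session_identifier(user_id: str) -> str:
    """Ephemeral per-session identifier for protected users.

    A fresh random salt each session means the ID cannot be used to track
    the user across sessions -- avoiding the 'persistent identifier' class
    of personal information under COPPA.
    """
    salt = secrets.token_hex(16)
    return hashlib.sha256((user_id + salt).encode()).hexdigest()[:16]
```

The drop-before-SDK design matters because most third-party analytics SDKs batch and transmit autonomously once handed an event; filtering after the handoff is a contractual promise, not an engineering control.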
The deletion obligation engineering spec
The parental review and deletion right requires real infrastructure:
Parental account linking — a system that cryptographically links the parent's verified identity to the child's account.
Complete inventory — you must know everything you've collected. If a child's data is in your analytics platform, your moderation logs, your behavioral model training sets, your backups, your CDN cache, and your primary database — your deletion flow must reach all of it.
Deletion vs. anonymization — COPPA requires deletion of personal information, but allows retention of de-identified aggregate data. Build your architecture to support true deletion of PII while preserving anonymized behavioral aggregates that don't re-identify.
Documented retention schedule — some data (moderation records, abuse reports, NCMEC CyberTipline evidence packages) may have legal retention obligations that override COPPA's deletion right. Document these exceptions explicitly.
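The inventory, deletion, and retention-exception requirements above compose into a deletion orchestrator: iterate every store you know about, delete, and record the outcome — skipping only where a documented legal hold applies. A sketch under assumed names (the `DataStore` protocol, `LEGAL_HOLDS`, and the store names are all illustrative):

```python
from typing import Protocol

class DataStore(Protocol):
    """Anything in your data inventory that can erase a user's records."""
    name: str
    def delete_user(self, user_id: str) -> None: ...

# Documented legal-hold exceptions (e.g. a CyberTipline evidence package)
# that override the COPPA deletion right for a specific store and user.
LEGAL_HOLDS: set = set()   # entries: (store_name, user_id)

def delete_everywhere(user_id: str, stores: list) -> dict:
    """Run deletion across the full store inventory and report per-store outcomes.

    The returned report doubles as the audit record proving the deletion
    reached every store -- or documenting exactly why it didn't.
    """
    report = {}
    for store in stores:
        if (store.name, user_id) in LEGAL_HOLDS:
            report[store.name] = "retained:legal_hold"
            continue
        store.delete_user(user_id)
        report[store.name] = "deleted"
    return report
```

The critical operational property is that the store list is the same inventory you maintain for requirement two above — if a new analytics table or cache is added without registering a `DataStore` adapter, your deletion flow silently misses it, so treat inventory registration as a code-review gate.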
The safety gap COPPA doesn't close
COPPA is a privacy and data handling law. It governs what you may collect, how you must obtain consent before collecting it, and how you must delete it. It says almost nothing about what you must do to detect harm while that data is active.
This is the gap where children are most at risk. A grooming predator targeting a child on your platform:
- Generates data you're permitted to collect under COPPA
- Operates through conversations that appear benign in isolation
- Takes weeks or months to escalate — well after any individual message has been reviewed and cleared
COPPA compliance is necessary. It is not sufficient for child safety.
The platforms taking this seriously are building behavioral detection infrastructure alongside COPPA compliance — using the data they're permitted to collect to identify escalation patterns, relationship dynamics, and temporal signals that individual message review misses entirely.
State law: the layer on top of COPPA
COPPA is federal minimum. State laws add to it:
- California AADC (Age-Appropriate Design Code): Requires design choices that protect minors broadly (up to 18), including privacy by default, no profiling without opt-in, and accessible privacy controls.
- Utah, Arkansas, Texas, Florida: All passed laws in 2023-2024 adding parental consent or age verification requirements, some extending to 13-17 users.
- UK Children's Code: Applies to platforms accessible to UK users under 18. Requires privacy by default, no profiling, no nudge techniques.
These don't replace COPPA — they layer on top. Build your architecture to support user classification by age and jurisdiction with different data handling rules per class.
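That classification can be a single function mapping age and jurisdiction to a data-handling class that the rest of the system keys off. The classes and jurisdiction codes below are illustrative assumptions, not a complete legal mapping — the point is the shape: COPPA as the floor, stricter regimes layered on top.

```python
def policy_class(age: int, jurisdiction: str) -> str:
    """Map a user to a data-handling class (illustrative rules only).

    COPPA is the federal floor for under-13 users everywhere; California's
    AADC and the UK Children's Code add protections for all minors in
    those jurisdictions.
    """
    if age < 13:
        return "coppa_protected"      # parental consent, no profiling, no ads
    if age < 18 and jurisdiction in {"US-CA", "UK"}:
        return "minor_enhanced"       # privacy by default, no profiling without opt-in
    if age < 18:
        return "minor_baseline"       # state-law parental consent rules may apply
    return "adult"
```

Centralizing this in one function means new state laws become a diff to the mapping rather than a hunt through every feature for scattered age checks.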
Implementation checklist
- [ ] Age gate with routing: under-13 routes to parental consent flow; 13+ proceeds normally
- [ ] Parental consent mechanism (credit card verification + signed form fallback)
- [ ] Parental account linking with cryptographic verification
- [ ] COPPA-segregated data pipelines (separate analytics, no third-party SDK data for under-13 users)
- [ ] Parental dashboard: review, delete, withdraw consent
- [ ] Documented deletion flow that reaches all data stores (primary DB, analytics, moderation logs, backups)
- [ ] Data retention schedule with legal exceptions documented
- [ ] NCMEC CyberTipline integration for mandatory reporting
- [ ] Privacy policy meeting FTC COPPA requirements (not your general privacy policy)
- [ ] Annual review cycle (requirements evolve; compliance must too)
SENTINEL provides reference implementations for several items on this list as open-source infrastructure: parental consent state tracking, COPPA-segregated data handling, GDPR/COPPA-compliant erasure that preserves audit log integrity, and NCMEC CyberTipline evidence package generation.
The behavioral detection layer — the part that identifies grooming patterns while COPPA-compliant data is active — is the other half of the picture.
GitHub: https://github.com/sentinel-safety/SENTINEL
Free for platforms under $100k annual revenue. Apache 2.0 in 2046.