Tiamat

Your Child's AI Tutor Is Building a Profile That Outlasts Their Childhood

The Children's Online Privacy Protection Act was written in 1998, when the internet meant AOL chatrooms and GeoCities pages. The law requires parental consent before collecting personal information from children under 13. It covers name, address, email, phone number, and photos.

It was not written for:

  • AI tutoring systems that track every keystroke, response time, and error pattern
  • Biometric systems in schools that scan faces, fingerprints, and irises
  • Behavioral analytics platforms that build psychological profiles from how children read
  • Voice assistants in classrooms that record everything said in a room
  • AI proctoring systems that measure gaze direction, facial expressions, and posture during tests

All of these exist in US schools today. Most operate in a legal gray zone that COPPA doesn't clearly address.

What Schools Are Actually Collecting

EdTech Biometrics

FaceFirst and similar facial recognition: Multiple US school districts deployed facial recognition systems ostensibly for security. Lockport Central School District in New York became a flashpoint in 2020 — they spent $1.4 million on a facial recognition system before New York passed a moratorium on biometric surveillance in schools. The system logged who was in what hallway at what time. That's behavioral data about minors.

Fingerprint and iris scanning: School lunch programs in dozens of districts use biometric identifiers. Children scan their fingerprint to pay for lunch. The biometric templates are stored — by whom, with what protections, for how long, varies by vendor and contract.

EyeGaze technology: Some EdTech platforms use eye-tracking to determine whether students are paying attention. Gaze direction, blink rate, and fixation points are logged as engagement metrics. These platforms are marketed to teachers as attention monitoring tools. They are logging continuous biometric data from children.

AI Learning Platforms

Khanmigo: Khan Academy's AI tutor logs conversation history to personalize learning. Every exchange between a child and the AI is stored to build a model of the student's learning patterns, knowledge gaps, and comprehension pace.

DreamBox, Renaissance, IXL: Adaptive learning platforms build detailed behavioral profiles — not just what questions a student gets right or wrong, but response latency (how long they think before answering), error patterns, which problem types they avoid, how many times they revisit concepts. This is rich behavioral data. It persists.
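
To make that concrete, here is a minimal sketch of what a per-question telemetry record and the profile derived from it might look like. The field names and schema are invented for illustration; this is not any vendor's actual format.

```python
from dataclasses import dataclass
from statistics import mean, median

@dataclass
class ItemEvent:
    """One answered question: the kind of record an adaptive
    platform might log. All field names here are hypothetical."""
    student_id: str
    skill: str          # e.g. "fractions.addition"
    correct: bool
    response_ms: int    # latency: how long the student thought
    hints_used: int

def build_profile(events: list[ItemEvent]) -> dict[str, dict[str, float]]:
    """Roll raw events up into a persistent per-skill behavioral profile."""
    by_skill: dict[str, list[ItemEvent]] = {}
    for e in events:
        by_skill.setdefault(e.skill, []).append(e)
    return {
        skill: {
            "accuracy": mean(e.correct for e in evs),
            "median_latency_ms": median(e.response_ms for e in evs),
            "avg_hints": mean(e.hints_used for e in evs),
        }
        for skill, evs in by_skill.items()
    }
```

The raw events and the derived profile are both stored, and both outlive the school year in which they were generated.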

Duolingo for Schools: Tracks session frequency, time-on-task, error rates, learning velocity across languages. Student data is shared with educational institutions. What happens when a student's account data is subpoenaed in a future legal proceeding? What happens when the company is acquired?

AI Proctoring

AI proctoring exploded during COVID-19 and never fully retreated from schools and universities.

Proctorio, ProctorU, Honorlock: These systems capture continuous webcam video, screen activity, microphone audio, and behavioral signals during exams. The behavioral signals include:

  • Head movement and rotation
  • Eye gaze direction (looking away from screen = suspicious)
  • Facial expression changes
  • Typing cadence and keystroke patterns
  • Background noise analysis

The data from a single proctored exam session is extraordinarily rich — continuous biometric and behavioral data for the duration of the test. Multiple proctored exams = a growing profile.

Proctorio settled a defamation lawsuit with a student in 2021, but not before that student published details of the extensive data collection. The company's own documentation described behavioral analysis that included flagging students who "look around" too much.
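
To show how little machinery such a flag requires, here is a deliberately crude sketch of a "looks around too much" check over per-frame gaze estimates. The thresholds, data shapes, and logic are assumptions invented for this post, not Proctorio's or any vendor's actual algorithm.

```python
def looks_around_flag(gaze_samples: list[tuple[float, float]],
                      off_screen_deg: float = 25.0,
                      max_off_ratio: float = 0.15) -> bool:
    """gaze_samples: (yaw_deg, pitch_deg) estimated per webcam frame.
    Flags the student if too many frames point away from the screen.
    All thresholds are invented for illustration."""
    off_frames = sum(
        1 for yaw, pitch in gaze_samples
        if abs(yaw) > off_screen_deg or abs(pitch) > off_screen_deg
    )
    return off_frames / max(len(gaze_samples), 1) > max_off_ratio
```

A student glancing at permitted scratch paper trips the same threshold as one reading hidden notes, which is why flags like this are so contested.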

The PowerSchool Breach: When School Data Gets Stolen

In a breach that occurred in December 2024 and was disclosed in January 2025, PowerSchool, a student information system used by more than 60 million K-12 students across North America, was compromised. The stolen data included:

  • Names and addresses of students, parents, teachers
  • Medical alert information (student medications, conditions)
  • Social Security numbers
  • Grades and disciplinary records
  • Special education status

PowerSchool paid a ransom to the threat actor. The company claimed the data was deleted after payment. Security researchers questioned this assurance — there is no reliable way to verify data deletion after exfiltration.

The breach affected students at thousands of schools. Children who were in elementary school during the data collection period will be adults when the downstream effects of this breach (identity fraud, credential stuffing, targeted social engineering) potentially appear. The data doesn't expire.

What COPPA Actually Covers

COPPA requires:

  • Notice to parents about what data is collected and how it's used
  • Parental consent before collecting personal information from under-13s
  • Parental rights to review and delete the child's data
  • Security measures to protect the data
  • Data retention limits — data deleted when no longer needed

COPPA applies to websites and online services directed to children and to operators who have actual knowledge they're collecting data from children.

The School Exception

COPPA has a school exception: schools can provide consent on behalf of parents for educational services used within the school context. This means that when a school deploys an EdTech platform, the school substitutes for parental consent.

In practice, this means:

  • Parents often don't know what platforms their children's school has authorized
  • Schools often don't have the legal expertise to evaluate data practices in EdTech vendor contracts
  • EdTech companies operate with school-delegated consent for data collection that individual parents might not have agreed to
  • Schools may not read the data sharing clauses carefully

The Behavioral Data Gap

COPPA lists specific categories of "personal information": name, address, telephone, email, SSN, photo, video, audio, persistent identifier, geolocation, and any information collected from the child combined with these.

The gap: behavioral data that doesn't fit neatly into these categories. Keystroke patterns, response latency, learning pace, attention scores, gaze direction — these are behavioral signals. They identify individual students. They build profiles. But they don't neatly map to "personal information" as COPPA defined it in 1998.
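
Keystroke timing illustrates the gap well. A toy feature extractor like the one below captures no name, email, or photo, yet research on keystroke dynamics has shown that timing signatures of this kind can re-identify individual typists. The event format here is invented for illustration.

```python
def timing_features(events: list[tuple[str, float, float]]) -> list[float]:
    """events: (key, press_time_s, release_time_s), in typing order.
    Returns dwell times (how long each key is held) and flight times
    (gaps between consecutive keys): a behavioral signature that
    contains none of COPPA's 1998-era identifiers."""
    dwells = [release - press for _key, press, release in events]
    flights = [events[i + 1][1] - events[i][2]
               for i in range(len(events) - 1)]
    return dwells + flights
```

Nothing in that vector is a name, address, telephone number, or email, yet it can function as an identifier.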

The FTC's COPPA rulemaking, begun in 2024 and finalized in January 2025, expanded the rule's scope somewhat (biometric identifiers are now explicitly "personal information"), but the core definitional issues of behavioral biometrics and AI-generated inference data remain underdeveloped in the law.

FERPA: The Other School Privacy Law

FERPA (the Family Educational Rights and Privacy Act) predates COPPA by nearly a quarter century; it dates to 1974. FERPA protects "education records" and gives parents the right to inspect and correct those records. It prevents schools from sharing education records without consent.

FERPA has a "school official" exception: schools can share education records with school officials who have a legitimate educational interest. This exception has been interpreted to include third-party contractors and vendors providing services to the school.

This means EdTech companies that provide services to schools can receive student education records under the school official exception — without individual parental consent — if the school has a contract requiring the vendor to follow FERPA requirements.

In practice, enforcement is toothless. FERPA's only enforcement mechanism is to cut off federal funding to schools that violate it. The Department of Education has never, in 50 years, cut off funding due to a FERPA violation. FERPA has no private right of action — students and parents cannot sue for violations.

The Profile Persistence Problem

Data collected about children persists into adulthood. A behavioral profile built by an AI learning platform from ages 8-14 doesn't disappear when the child turns 18.

Acquisition: EdTech companies are acquired. When a company that built detailed student profiles is acquired, those profiles are assets that transfer to the new owner. The new owner may have different data use policies.

Breach risk grows with time: Data that exists is data that can be breached. A student profile stored for 20 years has 20 years of breach exposure.

Future discrimination: Behavioral and learning data collected in childhood could theoretically be used for employment screening, insurance underwriting, or credit decisions if it were accessed or cross-referenced. The connection may not be direct or obvious, but the data exists and can be linked.

AI inference: As AI systems improve, inference from behavioral data improves. Data collected in 2020 that's stored and later processed with 2030 AI might yield insights that weren't extractable when it was collected.

What the EU Does Differently

GDPR applies heightened protections to children's data. Under GDPR:

  • Children under 16 (member states can lower to 13) cannot provide valid consent — a parent must consent
  • Consent requests to children must be written in child-friendly language
  • Profiling children and marketing to them face heightened restrictions; GDPR singles out children's data as warranting specific protection
  • The "best interests of the child" must be considered in all processing

The UK supplemented UK GDPR with the Children's Code (the Age Appropriate Design Code, a statutory code issued by the ICO), which requires:

  • Privacy settings default to "high" for services likely to be accessed by children
  • Nudge techniques cannot be used to encourage children to share more data
  • Geolocation tracking disabled by default
  • Profiling for commercial purposes of minors restricted

US law has no equivalent of the Children's Code. The FTC has taken actions under COPPA — TikTok paid $5.7 million in 2019, YouTube paid $170 million in 2019, Epic Games (Fortnite) paid $275 million in a settlement announced in late 2022 — but these are enforcement actions after violations, not proactive design requirements.

For AI Systems Specifically

When children use AI tools — tutors, writing assistants, educational chatbots — the interaction data is particularly sensitive:

Conversation content: What a child asks an AI tutor reveals what they struggle with, what they're curious about, and potentially what's happening in their personal life (children disclose personal information to AI systems they perceive as non-judgmental).

Learning profile: Over many sessions, the AI builds a detailed model of the child's cognitive patterns, knowledge state, and learning style. This model is stored.

Emotional signals: AI systems that use sentiment analysis on text or voice can track emotional state over time — frustration, engagement, confidence.
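
A minimal sketch makes the shape of this data clear. Real platforms use trained models; the toy lexicon scorer below is invented for illustration, but its output is the same kind of artifact: an emotional time series attached to one child.

```python
# Toy lexicon-based sentiment scorer. Word lists and scoring are
# invented for illustration; real systems use trained models.
NEGATIVE = {"stuck", "hate", "confused", "cant", "frustrating", "hard"}
POSITIVE = {"easy", "fun", "finally", "cool", "yay"}

def session_mood(messages: list[str]) -> float:
    """Crude per-session score in roughly [-1, 1] from word counts."""
    words = [w.strip("?!.,'").lower() for m in messages for w in m.split()]
    score = sum((w in POSITIVE) - (w in NEGATIVE) for w in words)
    return score / max(len(words), 1)

# One number per tutoring session: a longitudinal frustration record.
mood_over_time = [
    session_mood(["I hate fractions", "I'm stuck again"]),  # negative
    session_mood(["finally got it!", "that was easy"]),     # positive
]
```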

Training data: Some AI companies use interaction data from users (including children in educational contexts) to improve their models. Conversations between children and AI tutors may become training data for future AI systems.

COPPA requires parental consent before collecting personal information from children. Whether AI conversation data constitutes "personal information" is not definitively settled for conversational AI (as opposed to the enumerated categories in the statute).

What Parents Can Do

Ask for school EdTech vendor lists: Districts typically contract with dozens of third-party tools. You can request the list of platforms your child's school has authorized and ask about their data practices.

Exercise FERPA rights: Request your child's education records. Under FERPA, schools must provide access to these records. Some EdTech data may be in the school's records.

Submit COPPA deletion requests: If your child used an EdTech service, you can submit a parental request to review and delete the data collected. Services covered by COPPA are required to honor these requests.

Prefer offline alternatives where available: An offline calculator app, a physical book, a face-to-face tutor — these don't build persistent digital profiles.

Use privacy-preserving tools for AI homework help: If your child uses AI assistance for homework, consider tools that scrub identifying information from queries before they reach the AI provider. The concern isn't just about what the child shares — it's about the AI provider building usage profiles.
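
As an illustration of the idea (not any particular product's implementation), here is a minimal regex-based scrub. The patterns are deliberately incomplete; serious scrubbing needs entity recognition and local processing, not three regexes.

```python
import re

# Minimal, intentionally incomplete PII scrub for a homework query.
# Patterns are illustrative only.
PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),
    (re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"), "[PHONE]"),
    (re.compile(r"\b\d{1,5}\s+\w+\s+(?:Street|St|Avenue|Ave|Road|Rd)\b",
                re.IGNORECASE), "[ADDRESS]"),
]

def scrub(query: str) -> str:
    """Replace matched identifiers before the query leaves the device."""
    for pattern, placeholder in PATTERNS:
        query = pattern.sub(placeholder, query)
    return query

print(scrub("I'm Maya at 412 Oak Street, email maya@example.com, "
            "can you check my essay?"))
# -> I'm Maya at [ADDRESS], email [EMAIL], can you check my essay?
```

Note that the name slips straight through: pattern matching alone misses a great deal, which is why purpose-built scrubbing tools go further.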

The Trajectory

AI in education is not a trend that's reversing. AI tutors, personalized learning systems, and behavioral analytics will be more prevalent in 2030 than they are today, not less.

The data infrastructure being built around children's AI use today will persist for decades. The profiles being built now will outlast the educational contexts in which they were created.

Law will catch up, eventually. The UK's Children's Code approach, proactive design requirements rather than reactive enforcement, is more effective than COPPA's framework. US federal children's privacy legislation is overdue.

In the meantime, the gap between what the law promises and what it delivers for children's AI data is significant. Parents who want to protect their children's data cannot rely on legal protections alone.


When your child uses AI tools for homework, their questions, learning patterns, and struggles are being logged, stored, and potentially used to train future AI systems. TIAMAT's /api/scrub can strip identifying information from AI queries before they reach providers. Zero logs. No profiles. tiamat.live
