GDPR for EdTech: Protecting Student Data and Staying Compliant
EdTech platforms collect data on learners who cannot meaningfully consent. A ten-year-old cannot weigh up the trade-off between personalised learning recommendations and the long-term use of their behavioural profile. GDPR takes this seriously — and regulators are increasingly taking GDPR edtech enforcement seriously too. If your platform processes the data of children or young people in the EU or UK, you are operating in one of the most tightly regulated corners of privacy law.
This guide covers what GDPR edtech compliance requires: from lawful basis and parental consent to analytics restrictions, data retention, and the comparison with FERPA for platforms serving both markets.
What EdTech Platforms Actually Collect
Before examining the rules, it is worth being honest about the data. EdTech platforms typically collect:
- Learning progress data — lessons completed, time on task, mastery scores, module completion rates
- Assessment results — quiz scores, exam performance, written responses, graded submissions
- Engagement data — click-through rates, video watch time, re-attempt rates, drop-off points
- Parent and guardian information — names, email addresses, consent records, payment details
- Device and technical data — IP addresses, device identifiers, browser type, operating system, session timestamps
- Communications — messages between students and teachers, support tickets, feedback submissions
Much of this data is educationally necessary. But the aggregate picture — a granular, longitudinal record of a child's intellectual development, engagement patterns, and academic struggles — is sensitive in a way that goes beyond any individual data point. GDPR treats it accordingly.
Children's Data Under GDPR: Article 8
Article 8 of GDPR sets specific rules for information society services (which covers most EdTech platforms) when they process children's data based on consent.
Age thresholds matter. The default GDPR age is 16 — children under that age cannot provide valid consent for digital services without parental or guardian authorisation. But Member States can set a lower threshold, down to a minimum of 13. The result is a patchwork:
- Germany, the Netherlands, Ireland: 16 years (the GDPR default)
- France: 15 years
- Italy, Spain: 14 years
- Denmark, Sweden: 13 years
- UK (post-Brexit, under UK GDPR): 13 years
For GDPR edtech platforms operating across multiple EU markets, this means you cannot assume a single age threshold applies. You need to either operate at the highest threshold (16) or implement jurisdiction-specific verification.
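A minimal sketch of the jurisdiction-specific approach is shown below. The thresholds mirror the list above, but the values, country codes, and function names are illustrative and should be checked against current national law before being relied on.

```typescript
// Digital age of consent per jurisdiction (illustrative values; verify
// against current national law before using in production).
const CONSENT_AGE: Record<string, number> = {
  DE: 16, NL: 16, IE: 16, // GDPR default retained
  FR: 15,
  IT: 14, ES: 14,
  DK: 13, SE: 13,
  GB: 13, // UK GDPR
};

const GDPR_DEFAULT_AGE = 16;

// Returns true when the user is below the applicable threshold and
// verifiable parental consent is therefore required before processing.
function requiresParentalConsent(countryCode: string, age: number): boolean {
  const threshold = CONSENT_AGE[countryCode] ?? GDPR_DEFAULT_AGE;
  return age < threshold;
}
```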
Age verification is required. GDPR does not specify how to verify age, but it requires that you make "reasonable efforts" to verify that a child has parental consent. Relying solely on a checkbox — or a self-reported date of birth — is unlikely to satisfy regulators. Reasonable efforts depend on the technology available and the nature of the service, but at minimum you should document your verification approach and why you consider it adequate.
Parental consent must be verifiable. Where a child is below the applicable age threshold, you need consent from a person with parental responsibility. This consent must meet the same standard as any GDPR consent: freely given, specific, informed, unambiguous, and withdrawable. The parent or guardian must be given a clear description of what data you will collect and how it will be used.
The Best Interests of the Child
The "best interests of the child" principle does not appear in Article 8 itself, but Recital 38 recognises that children merit specific protection, and the UK ICO's Children's Code makes the best interests of the child (a concept drawn from the UN Convention on the Rights of the Child) its first standard. This is not just aspirational language — it is a substantive expectation that should inform every data processing decision you make about children's data.
The best interests principle means that when you are weighing whether to collect a particular piece of data, or how long to retain it, or whether to share it with a third party, the question is not just whether processing is technically lawful. The question is whether it serves the child's interests.
This has practical implications:
- Engagement data used to improve learning outcomes may serve the child's interests; engagement data used to optimise ad targeting does not
- Detailed behavioural profiles may help teachers intervene early with struggling students; those same profiles shared with data brokers do not
- Retention of progress data during an active subscription may be appropriate; retention of that data for 10 years after the child leaves the platform is not
Schools vs Direct Consumer EdTech
GDPR edtech compliance looks different depending on your business model.
School-as-controller model: Many EdTech platforms sell to schools, which then deploy the platform to students. In this model, the school is the data controller — it determines the purposes and means of processing. The EdTech platform is a data processor, acting on the school's instructions. The school is responsible for obtaining appropriate consent (or establishing another lawful basis), informing students and parents, and handling data subject rights requests. The EdTech platform's obligations are to process only as instructed, maintain security, provide audit trails, and sign a data processing agreement with the school.
Direct consumer model: Platforms that sign up individual students or parents directly are data controllers themselves. They bear full responsibility for lawful basis, consent management, privacy notices, data subject rights, and all other GDPR obligations. This is the harder compliance path, particularly when serving children.
The distinction matters because GDPR edtech enforcement has gone after both: schools for deploying non-compliant platforms, and platforms themselves for processing children's data without adequate safeguards.
Lawful Basis for EdTech Processing
Consent is the most obvious lawful basis for GDPR edtech, but it is not the only one — and for platforms deployed through schools, it may not be the primary one.
Consent (Article 6(1)(a)): Valid for direct consumer platforms when children are above the age threshold, or with parental consent for those below it. Consent must be specific to each processing purpose. You cannot obtain a single consent for "all data processing" — you need separate consent for, say, sharing data with third-party analytics tools.
Contract (Article 6(1)(b)): Applies when processing is necessary to perform a contract with the data subject (or, for children, their parents). For a subscription EdTech service, the contract lawful basis covers processing that is strictly necessary to deliver the service — storing progress data, enabling logins, generating reports.
Legitimate interest (Article 6(1)(f)): This is where GDPR edtech gets complicated. For adults, legitimate interest can be used for a wide range of processing activities where the business has a genuine need and the individual's rights are not overridden. For children's data, the legitimate interest test is significantly harder to pass. Regulators have made clear that you should not rely on legitimate interest where consent would be the appropriate basis, and that children's vulnerability increases the weight of their interests in the balancing test.
Legal obligation (Article 6(1)(c)): Some EdTech processing may be required by law — for example, maintaining assessment records under national education legislation.
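On the consent basis specifically, one way to keep consent tied to individual purposes is to record it per purpose rather than as a single flag. A minimal sketch follows; the purpose names and record shape are hypothetical.

```typescript
// Each processing purpose gets its own consent record; there is no
// blanket "all data processing" flag. Purpose names are illustrative.
type Purpose = 'service_delivery' | 'third_party_analytics' | 'product_research';

interface ConsentRecord {
  userId: string;
  purpose: Purpose;
  grantedBy: 'parent' | 'student'; // who gave the consent
  grantedAt: Date;
  withdrawnAt?: Date;              // consent must remain withdrawable
}

// Processing for a purpose is only allowed while an unwithdrawn
// consent record exists for that exact purpose.
function hasConsent(records: ConsentRecord[], userId: string, purpose: Purpose): boolean {
  return records.some(
    (r) => r.userId === userId && r.purpose === purpose && !r.withdrawnAt
  );
}
```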
Analytics and Profiling of Children: The Hard Cases
Analytics is where GDPR edtech compliance gets contentious. Platforms need data to improve their products. Personalisation engines need data to adapt learning paths. Dashboards for teachers need data to surface insight. But the rules on profiling children are strict.
Legitimate interest for analytics is harder. When you run a legitimate interest assessment (LIA) for analytics on children's data, the balancing test tips against you. Children are a vulnerable group. The information asymmetry between a child and a sophisticated technology platform is extreme. Regulators expect greater care.
Automated decision-making restrictions apply. Article 22 restricts decisions based solely on automated processing that produce legal effects or similarly significant effects on individuals. Using an algorithm to decide that a student needs intervention, to rank their academic performance, or to recommend educational tracks without meaningful human review could engage Article 22.
Pseudonymisation and aggregation reduce risk. Where possible, analytics should operate on pseudonymised or aggregated data rather than identified individual records. A teacher dashboard showing class-level trends is less privacy-invasive than one showing each student's individual engagement timeline.
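As an illustration of that aggregation point, a class-level metric can be computed without exposing individual records to the dashboard. The record shape here is an assumption, not a prescribed schema.

```typescript
// Hypothetical per-student engagement record.
interface EngagementRecord {
  studentId: string;
  classId: string;
  minutesOnTask: number;
}

// Class-level aggregate: the dashboard receives a single average per class,
// not each child's individual engagement timeline.
function classAverageMinutes(records: EngagementRecord[], classId: string): number {
  const inClass = records.filter((r) => r.classId === classId);
  if (inClass.length === 0) return 0;
  const total = inClass.reduce((sum, r) => sum + r.minutesOnTask, 0);
  return total / inClass.length;
}
```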
Behavioural Advertising to Children: Generally Prohibited
This one is straightforward. Behavioural advertising — advertising based on tracking a user's online behaviour across websites and services — is not an appropriate processing activity for children's data under GDPR.
The Information Commissioner's Office (ICO) in the UK has been explicit: platforms covered by the Children's Code should not use children's data for targeted advertising. The same principle applies under EU GDPR. The legitimate interest basis does not support behavioural advertising to children, and obtaining valid consent from a child for advertising profiling is practically impossible given the age threshold requirements.
If your EdTech platform runs advertising, it should be contextual — based on the content being consumed — not behavioural. Any third-party advertising SDK that profiles users should be disabled for users identified as children.
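A hedged sketch of that gating logic follows, assuming a hypothetical initAdSdk wrapper around whichever SDK your platform actually uses.

```typescript
interface UserContext {
  isIdentifiedAsChild: boolean; // from the age gate or school roster
  country: string;
}

// Hypothetical wrapper; the real initialisation call depends on your ad SDK.
function initAdSdk(options: { contextualOnly: boolean }): void {
  console.log('ad SDK configured', options);
}

function configureAdvertising(user: UserContext): void {
  if (user.isIdentifiedAsChild) {
    // No behavioural tracking or profiling for child users; at most,
    // contextual placements tied to the content being viewed.
    initAdSdk({ contextualOnly: true });
    return;
  }
  initAdSdk({ contextualOnly: false });
}
```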
Data Retention: Children's Learning Data Has a Shelf Life
The GDPR storage limitation principle requires that data is kept only as long as necessary for the purpose it was collected. For EdTech, this creates a specific obligation: children's learning data should not be kept indefinitely.
Guidance from regulators and the Article 29 Working Party (now the European Data Protection Board) suggests that:
- Active account data should be retained as long as the account is active and the service is being provided
- Data should be deleted or anonymised when a student leaves the platform or their account is closed
- Retention for product improvement or research should use aggregated, anonymised data — not individual identified records
- Any retention beyond the active subscription period requires a documented justification
The Children's Code in the UK includes a specific provision requiring that data is not retained for longer than necessary. GDPR edtech platforms should build automated deletion workflows triggered by account closure, particularly for child users.
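A minimal sketch of such a workflow is below, assuming hypothetical data stores and retention periods; a real pipeline would also need to cover backups and downstream processors.

```typescript
// Hypothetical retention periods (days after closure) per data category.
const RETENTION_DAYS: Record<string, number> = {
  progress_data: 0,       // delete immediately on account closure
  assessment_results: 90, // illustrative: short window for disputes, then delete
  support_tickets: 30,
};

interface DataStore {
  deleteByUser(userId: string, category: string): Promise<void>;
  scheduleDeletion(userId: string, category: string, deleteAt: Date): Promise<void>;
}

// Triggered when a child's account is closed: deletes or schedules deletion
// for each category according to the documented retention policy.
async function onAccountClosed(userId: string, store: DataStore, now: Date): Promise<void> {
  for (const [category, days] of Object.entries(RETENTION_DAYS)) {
    if (days === 0) {
      await store.deleteByUser(userId, category);
    } else {
      const deleteAt = new Date(now.getTime() + days * 24 * 60 * 60 * 1000);
      await store.scheduleDeletion(userId, category, deleteAt);
    }
  }
}
```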
Security: Extra Care for Children's Data
Article 32 requires "appropriate" technical and organisational security measures. For children's data, appropriate means more than the baseline. The sensitivity of the data — particularly when it includes assessment results, learning difficulties, or communications — requires elevated security standards.
Practical requirements include:
- Encryption at rest and in transit for all personal data
- Access controls limiting staff access to the minimum necessary
- Audit logs for access to student records
- Data isolation preventing mixing of child and adult user data
- Penetration testing and vulnerability management
- Incident response procedures specifically addressing breaches of children's data
- Vendor assessment for all third parties who receive student data
A breach of children's learning data is not just a regulatory problem — it carries reputational and ethical consequences that go beyond the financial penalties.
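For the access-control and audit-log items above, here is a minimal sketch of making every read of a student record attributable; the types and in-memory log are stand-ins for whatever storage you actually use.

```typescript
interface AccessEvent {
  staffId: string;
  studentId: string;
  action: 'read' | 'export' | 'update';
  at: Date;
}

// In practice this would be an append-only audit store, not an in-memory array.
const auditLog: AccessEvent[] = [];

// Wraps access to a student record so every read is attributable and logged.
function readStudentRecord<T>(
  staffId: string,
  studentId: string,
  fetchRecord: (id: string) => T
): T {
  auditLog.push({ staffId, studentId, action: 'read', at: new Date() });
  return fetchRecord(studentId);
}
```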
FERPA vs GDPR: Dual Compliance for EdTech Platforms in Both Markets
Many EdTech platforms operate in both the EU and the US. FERPA (the Family Educational Rights and Privacy Act) is the primary US federal law governing student data privacy — but it works very differently from GDPR.
| Dimension | GDPR (EU/UK) | FERPA (US) |
|---|---|---|
| Who it covers | Any platform processing EU/UK personal data | Schools receiving federal funding; applies to education records |
| Who bears the obligation | The platform (as controller or processor) | Educational institutions primarily |
| Basis for processing | Requires lawful basis (consent, contract, legitimate interest, etc.) | Permits processing without consent where school has "legitimate educational interest" |
| Children's consent | Age-gated parental consent required below the national threshold | Rights sit with parents until the student turns 18 (or enters postsecondary education), then transfer to the student |
| Individual rights | Right to access, erasure, restriction, portability | Right to access and amend education records |
| Penalties | Up to €20M or 4% of global annual turnover, whichever is higher | Loss of federal funding; no private right of action |
The key practical difference: FERPA is largely permissive within the school context, while GDPR is prescriptive regardless of context. A platform that is FERPA-compliant is not necessarily GDPR-compliant.
For dual-compliance, EdTech platforms should treat GDPR as the higher standard and build FERPA compliance on top. The GDPR requirements — privacy notices, data minimisation, consent management, deletion rights — are broadly consistent with what FERPA-compliant institutions expect anyway.
Privacy Notices for Children: Plain Language Is Not Optional
GDPR requires that privacy information is provided in a "concise, transparent, intelligible and easily accessible form, using clear and plain language." For children, this has additional force — regulators expect notices that are actually comprehensible to the relevant age group.
The ICO's Children's Code requires "child-friendly" privacy information for services directed at children. This means:
- Short, visual formats rather than dense legal text
- Language calibrated to the youngest users of the service
- Layered notices — a short summary with links to more detail
- Avoiding legalese and jargon
A standard 3,000-word privacy policy in twelve-point legalese is not a compliant privacy notice for a platform used by eight-year-olds. You need a version that a child can actually understand.
Parental Rights: Access, Restriction, and Deletion
Parents (and children above the age of majority) have the full set of GDPR data subject rights with respect to children's data:
- Right of access (Article 15): Parents can request a copy of all personal data held about their child. You must respond within one month, free of charge, and in a commonly used electronic form where the request was made electronically.
- Right to erasure (Article 17): Parents can request deletion of their child's data. Grounds include withdrawal of consent, data no longer necessary for the original purpose, or objection to processing.
- Right to restriction (Article 18): Parents can request that processing is restricted while a dispute is resolved — for example, while contesting the accuracy of assessment records.
- Right to object (Article 21): Parents can object to processing based on legitimate interest. For children's data, where the legitimate interest test is already harder to pass, this objection right carries significant weight.
Your DSAR handling process must be able to identify and extract all data held about a specific child, across all your systems, within the one-month deadline.
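One way to make that deadline realistic is to put every system holding child data behind a common export interface. A minimal sketch, with hypothetical interfaces and no claims about your actual architecture:

```typescript
// Every system that holds personal data about students implements this interface.
interface DataSource {
  name: string;
  exportForChild(childId: string): Promise<Record<string, unknown>>;
}

// Collects everything held about one child across all registered systems,
// so a parental access request can be answered from a single place.
async function buildDsarExport(childId: string, sources: DataSource[]) {
  const data: Record<string, unknown> = {};
  for (const source of sources) {
    data[source.name] = await source.exportForChild(childId);
  }
  return { childId, generatedAt: new Date().toISOString(), data };
}
```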
GDPR EdTech Compliance Checklist: 8 Steps
1. Map every data processing activity involving children's data — document what data you collect, why, who has access, how long you keep it, and what third parties receive it. This is your foundation.
2. Establish and document lawful basis for each processing activity — contract for service delivery, consent (with age verification) for optional features, and remove any processing that relies on legitimate interest for core child-facing features.
3. Implement age verification and parental consent flows — build genuine verification mechanisms for under-age users and ensure parental consent meets the GDPR standard: freely given, specific, informed, unambiguous, and withdrawable.
4. Disable behavioural advertising and profiling SDKs for child users — audit every third-party tool integrated into your platform and ensure no behavioural tracking runs for users identified as children.
5. Write a child-friendly privacy notice — create a layered notice with a short, plain-language summary appropriate for your youngest users, plus a full notice for parents.
6. Build automated data deletion workflows — trigger deletion or anonymisation when a child's account closes, and enforce documented retention limits for all categories of student data.
7. Configure your security controls for elevated sensitivity — encryption at rest and in transit, access controls, audit logs, and incident response procedures specifically scoped to breaches of children's data.
8. Establish a DSAR process that covers parental requests — test your ability to identify, extract, and deliver all data about a specific child within 30 days, and document the process.
Scan Your Platform's Public Pages for Compliance Gaps
Your website and marketing pages are where regulators and parents look first. Cookie banners that fire before consent, third-party trackers embedded in your homepage, or privacy notices that do not mention children's data — these are visible, verifiable compliance failures.
Custodia scans your public-facing pages to surface the trackers, cookies, and data flows that a regulator or concerned parent would find. It also generates privacy policy content based on what your site actually does — not a generic template.
Scan your EdTech platform for compliance gaps at Custodia →
Last updated: March 27, 2026. This post provides general information about GDPR edtech compliance. It does not constitute legal advice. Privacy law varies by jurisdiction and changes frequently — consult a qualified privacy specialist for advice specific to your platform.