GDPR for EdTech Companies: Student Data, Parental Consent, and School Platform Compliance
EdTech platforms process children's data — the most protected category under GDPR and the UK Children's Code. Here's the compliance framework for education technology companies.
Why EdTech Is Extremely High-Risk Under GDPR
Education technology sits at the intersection of two of GDPR's most sensitive areas: children's data and special category information. When you build a platform for schools, you are almost certainly processing data about minors — some as young as four or five years old.
This matters because children cannot, under GDPR, give legally valid consent on their own for most forms of data processing. Article 8 of GDPR sets the default consent age for online services at 16, though member states may lower it to 13; the UK has set it at 13. Below that age, parental or guardian consent is required for consent-based processing. And for children under 13, the ICO and EU supervisory authorities apply the strictest scrutiny.
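As a sketch of how this threshold check might look in practice (illustrative Python; the jurisdiction table and function name are assumptions, and real per-market thresholds must be confirmed with local counsel):

```python
# Illustrative thresholds only: GDPR Article 8 defaults to 16 and lets
# member states lower it to 13; the UK has chosen 13.
DIGITAL_CONSENT_AGE = {"UK": 13}
DEFAULT_CONSENT_AGE = 16  # fall back to the strictest common case

def parental_consent_required(age: int, jurisdiction: str) -> bool:
    """True if consent-based processing needs a parent or guardian."""
    threshold = DIGITAL_CONSENT_AGE.get(jurisdiction, DEFAULT_CONSENT_AGE)
    return age < threshold
```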
Beyond the age threshold, schools hold some of the most sensitive data in existence: SEN records, mental health assessments, safeguarding notes, family circumstances, disciplinary histories. If your platform touches any of this, you're in special category territory with all the obligations that entails.
The regulatory environment has sharpened significantly. The UK's Age Appropriate Design Code (AADC), the EU's Digital Services Act, and a wave of enforcement actions against platforms popular with children have made child data protection a priority for regulators on both sides of the Atlantic.
The UK Children's Code (Age Appropriate Design Code)
The ICO's Children's Code — formally the Age Appropriate Design Code — came into force in September 2020, with full conformance required from September 2021, and applies to any online service "likely to be accessed by children." If your EdTech platform is used in UK schools, the Code applies to you.
The 15 standards include:
- Best interests of the child: Default settings must serve children's best interests, not commercial ones.
- Data minimisation: Collect only what is strictly necessary for the service.
- Profiling: Switch off profiling of children by default. If you must profile for educational purposes, document the legal basis carefully.
- Geolocation: Turn location tracking off by default.
- Parental controls: Provide tools for parental oversight where appropriate — but these must not undermine children's own privacy against abusive parents.
- Nudge techniques: Don't use design features that nudge children to provide more data than necessary.
- Connected toys and devices: Extend these principles to any hardware element of your service.
Non-compliance with the Children's Code is treated as a UK GDPR breach. Maximum fines are £17.5 million or 4% of global annual turnover, whichever is higher. The ICO has already taken enforcement action against TikTok and other platforms popular with children.
Parental Consent: When Is It Required?
Under GDPR Article 8, processing based on consent requires parental consent for children under 16 (or 13 in the UK). But there are critical nuances:
Consent is only one legal basis. Schools frequently rely on public task (Article 6(1)(e)) or legitimate interest (Article 6(1)(f)) to process student data for core educational purposes. When a school contracts with an EdTech platform, the school is typically the data controller and the EdTech company is the processor. In this model, the school authorises the processing — and the school's own legal basis (public task, for state schools) covers the data. Separate parental consent for the EdTech platform is not always required.
When separate consent IS required: If your platform processes data for purposes beyond the core educational service — analytics shared with advertisers, behavioural profiling for commercial purposes, optional enrichment features — you cannot rely on the school's authority. You need your own legal basis, and for optional features involving under-13s, that means verified parental consent.
Consent verification is hard. GDPR requires "reasonable efforts" to verify parental consent. What constitutes reasonable effort depends on the technology and risk. For low-stakes data this might be a simple checkbox. For high-risk processing it might require ID verification. The ICO has published guidance on this — read it carefully before launching any consent-dependent feature for under-13s.
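One way to operationalise this is a risk-tiered mapping from processing type to verification method. The tiers and method names below are hypothetical; the ICO's guidance and your own risk assessment, not this table, determine what counts as a reasonable effort:

```python
from enum import Enum

class ProcessingRisk(Enum):
    LOW = "low"        # e.g. a display name on a class page
    MEDIUM = "medium"  # e.g. optional enrichment features
    HIGH = "high"      # e.g. profiling or special category data

# Hypothetical mapping: calibrate against regulator guidance, not this table.
VERIFICATION_BY_RISK = {
    ProcessingRisk.LOW: "parent_email_confirmation",
    ProcessingRisk.MEDIUM: "credit_card_verification",
    ProcessingRisk.HIGH: "id_document_check",
}

def consent_verification_method(risk: ProcessingRisk) -> str:
    """Return the verification step to require before enabling the feature."""
    return VERIFICATION_BY_RISK[risk]
```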
Learning Analytics and Profiling: Article 22 Risks
Learning analytics — tracking student engagement, predicting performance, identifying at-risk learners — is a core feature of many EdTech platforms. It is also one of the most legally fraught areas.
Article 22 of GDPR restricts solely automated decisions that produce legal or similarly significant effects on individuals. Automated grading systems, AI-generated learning plans, or algorithms that flag students for additional support could all fall within this scope if they affect a student's educational pathway without meaningful human review.
For EdTech platforms, this means:
- Any AI-driven recommendation that a teacher acts on without independent review may trigger Article 22.
- You must document whether your analytics involve solely automated decision-making — or whether a teacher genuinely reviews and exercises judgment before acting.
- Where Article 22 applies to children, the standard is stricter: you can only process with the child's or parent's explicit consent, or where necessary for a contract, and even then safeguards must be in place.
- Your Data Protection Impact Assessment (DPIA) must specifically address algorithmic fairness and the risk of discriminatory outcomes — a child flagged as "low potential" based on biased training data has recourse under GDPR.
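A simple safeguard against accidental Article 22 exposure is to make recommendations structurally unapplyable until a human review is recorded. A minimal sketch, with hypothetical names:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LearningRecommendation:
    """A model-generated flag (hypothetical schema) that could affect a
    student's pathway, held until a teacher records an independent review."""
    student_id: str
    suggestion: str
    reviewed_by: Optional[str] = None
    review_note: str = ""

    def record_review(self, teacher_id: str, note: str) -> None:
        # Requiring a substantive note evidences genuine human judgment,
        # not a rubber-stamp click.
        if not note.strip():
            raise ValueError("a substantive review note is required")
        self.reviewed_by = teacher_id
        self.review_note = note

def apply_recommendation(rec: LearningRecommendation) -> str:
    """Refuse to act on any recommendation lacking a recorded review."""
    if rec.reviewed_by is None:
        raise PermissionError("blocked: no human review recorded (Article 22 risk)")
    return f"applied:{rec.student_id}"
```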
Special Category Data in Schools
Schools are saturated with special category data under Article 9. Any EdTech platform integrated into the school environment may encounter:
- SEN (Special Educational Needs) records: Learning disabilities, ADHD diagnoses, autism assessments — all health data, and therefore special category under Article 9.
- Mental health information: Counselling records, anxiety or depression notes, safeguarding referrals.
- Family circumstances: Data processed under safeguarding duties, free school meal eligibility, or looked-after child status may reveal racial or ethnic origin, economic status, or family structure.
- Religious or philosophical beliefs: Gathered for timetabling (RE opt-outs, prayer arrangements), assemblies, or dietary needs.
If your platform ingests any data from school management systems (MIS/SIS), you almost certainly receive special category data. This requires:
- An explicit lawful basis under Article 9(2) — most commonly explicit consent or substantial public interest.
- An appropriate policy document for special category processing under UK GDPR Schedule 1 conditions.
- A DPIA if you process special category data at scale or for new purposes.
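In practice this often means tagging special category fields at the point of MIS ingestion so they can be routed to a restricted store with its own Article 9 basis and DPIA coverage. A minimal sketch, with hypothetical field names (real MIS/SIS exports differ per supplier):

```python
# Hypothetical field names for illustration only.
SPECIAL_CATEGORY_FIELDS = {
    "sen_status", "health_notes", "safeguarding_flag",
    "religion", "ethnicity",
}

def classify_record(record: dict) -> dict:
    """Split an ingested MIS record so special category fields can be
    routed to a restricted store with separate controls."""
    special = {k: v for k, v in record.items() if k in SPECIAL_CATEGORY_FIELDS}
    ordinary = {k: v for k, v in record.items() if k not in SPECIAL_CATEGORY_FIELDS}
    return {"ordinary": ordinary, "special_category": special}
```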
School Data Processing Agreements: What EdTech Must Provide
When you sell to schools, the school is typically the data controller and you are the data processor under Article 28 of GDPR. The school is legally required to have a written Data Processing Agreement (DPA) with you. Many EdTech companies get this wrong.
Your DPA with schools must include:
- Subject matter and duration of the processing.
- Nature and purpose of the processing.
- Type of personal data and categories of data subjects.
- Sub-processor list: Every third-party service you use that touches the data (cloud hosts, analytics tools, support platforms).
- International transfer safeguards: If any data leaves the UK/EEA, the mechanism (Standard Contractual Clauses, Adequacy Decision, etc.) must be specified.
- Security obligations: Technical and organisational measures at an appropriate level for children's data.
- Audit rights: Schools must have the right to inspect your compliance.
- Data return and deletion: What happens to data when the contract ends.
Many schools now use the UK's standardised GDPR in Schools framework and expect EdTech suppliers to comply with the Keeping Children Safe in Education (KCSIE) guidance and the ICO's Education sector guidance. Being able to demonstrate compliance with these frameworks is a commercial necessity for UK EdTech.
Teacher and Staff Data vs Student Data
Don't conflate these two categories. Teachers and staff are adults whose data you process as employees or service users of the school. Their rights and the applicable legal bases differ from student data:
- Staff data typically uses employment contract, legal obligation, or legitimate interest as its lawful basis.
- Staff are entitled to the full range of data subject rights without the age-related complications.
- TUPE (Transfer of Undertakings) rules can affect staff data when school services transfer between providers — have a clear data transition process for platform changes.
Practically: keep staff and student data architecturally separated in your system. This simplifies DPIAs, breach notifications, and deletion workflows.
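That separation can be made explicit by routing each data subject type to a store with its own lawful basis, retention rule, and breach workflow. Illustrative values only; the real bases and periods come from your DPIA and each school's retention schedule:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class StorePolicy:
    lawful_basis: str
    retention_rule: str
    breach_escalation: str

# Illustrative policies, not legal advice.
STORES = {
    "student": StorePolicy("public_task_via_school_controller",
                           "delete_on_contract_end",
                           "DPO + designated safeguarding lead"),
    "staff": StorePolicy("contract_or_legitimate_interest",
                         "employment_retention_schedule",
                         "DPO"),
}

def policy_for(subject_type: str) -> StorePolicy:
    # Routing by subject type keeps the two populations in separate
    # stores with separate DPIAs, retention, and breach workflows.
    return STORES[subject_type]
```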
LMS Platforms as Sub-Processors
If your EdTech platform integrates with or is built on top of Google Workspace for Education, Microsoft 365 Education, or other LMS platforms, those providers are your sub-processors. GDPR Article 28(4) makes you responsible for their compliance.
Key obligations:
- Contractual chain: Your DPA with schools must flow down to your sub-processors. Google's and Microsoft's education-specific DPAs are designed to support this, but you must actually sign them and reference them in your documentation.
- Sub-processor notification: If you add or change a sub-processor, schools have the right to object. Your agreements must include a notification mechanism (typically 30 days' notice).
- Data location: Where does Google/Microsoft store the data? EU schools may have data residency requirements. Google and Microsoft both offer EU data boundaries for education customers — but you need to actively configure these.
- Liability: You are liable to the school for any sub-processor's failure as if it were your own failure.
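The notification mechanism above can be as simple as a registry that records each announcement and computes the earliest permissible go-live date. A sketch, assuming a 30-day objection window (match whatever period your DPA actually grants):

```python
from datetime import date, timedelta

OBJECTION_WINDOW_DAYS = 30  # assumption: align with your DPA's notice period

class SubProcessorRegistry:
    """Minimal registry: log each announcement and compute the earliest
    go-live date that respects the schools' right to object."""

    def __init__(self) -> None:
        self.entries: list[tuple[str, str, date]] = []

    def announce(self, name: str, purpose: str, announced_on: date) -> date:
        self.entries.append((name, purpose, announced_on))
        return announced_on + timedelta(days=OBJECTION_WINDOW_DAYS)
```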
Marketing to Schools: B2B Carve-Out and Its Limits
GDPR's B2B marketing carve-out (using legitimate interest for marketing to business contacts) applies to schools in limited circumstances. Schools are legal entities, and marketing to a "schools data protection officer" email address is arguably B2B. But:
- Marketing to personal named addresses of teachers or school staff requires a valid legal basis — legitimate interest can work but must withstand a balancing test.
- The Privacy and Electronic Communications Regulations (PECR) apply to electronic direct marketing regardless of GDPR. Cold email marketing to schools requires either consent or the soft opt-in (previous customer relationship).
- Marketing content directed at children, even indirectly through school accounts, must comply with the Children's Code's restrictions on nudge techniques and profiling.
Don't assume B2B means no rules. PECR enforcement in the education sector is active — the ICO has fined organisations for sending unsolicited marketing to school staff.
Student Data Portability and School Switching
Article 20 of GDPR grants data portability rights where processing is based on consent or contract and carried out by automated means. For schools switching EdTech platforms, this creates both an obligation and an opportunity:
- Students (or their parents, for under-16s) can request their data in a portable format.
- Schools themselves, as controllers, have a strong expectation of data portability when switching suppliers — even if it's not a strict Article 20 right, it's a commercial and ethical obligation.
- Build data export in machine-readable formats (JSON, CSV, XML) from day one. Being the platform that makes it easy to leave actually builds trust and wins contracts.
- Your DPA with schools should specify data return timelines (typically 30–60 days after contract termination) and data formats.
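A minimal export layer might emit the same records in both JSON and CSV. Sketch only; a real export needs field-level filtering so third-party and special category data are handled appropriately:

```python
import csv
import io
import json

def export_student_records(records: list[dict]) -> dict[str, str]:
    """Produce the same records as both JSON and CSV, so a school, or a
    parent exercising Article 20, receives a machine-readable copy."""
    as_json = json.dumps(records, indent=2, ensure_ascii=False)
    buf = io.StringIO()
    if records:
        writer = csv.DictWriter(buf, fieldnames=list(records[0].keys()))
        writer.writeheader()
        writer.writerows(records)
    return {"json": as_json, "csv": buf.getvalue()}
```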
Safeguarding Data and Its Special Handling
Safeguarding data — child protection records, abuse concerns, looked-after child status — is the most sensitive data a school holds. If your platform integrates with safeguarding workflows, you are in extremely high-risk territory.
GDPR permits processing of safeguarding data under Article 9(2)(g) (substantial public interest — in the UK, read with the safeguarding condition in Schedule 1 of the Data Protection Act 2018) and the Children Act 1989/2004 framework. But this is not a blank cheque:
- Safeguarding data must be strictly access-controlled — only designated safeguarding leads and relevant staff should see it.
- Retention periods are separate from general student data and are set by statutory guidance (Keeping Children Safe in Education specifies minimum retention periods).
- Any breach of safeguarding data must be assessed not just as a GDPR matter but as a child protection risk. Your breach response procedure must include escalation to the designated safeguarding lead.
- Under no circumstances should safeguarding data be used for analytics, product improvement, or commercial purposes.
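The access rule above can be enforced with a deny-by-default check plus an audit trail. A sketch with hypothetical role names:

```python
import logging

SAFEGUARDING_ROLES = {"designated_safeguarding_lead", "deputy_dsl"}
audit_log = logging.getLogger("safeguarding_access")

def read_safeguarding_record(user_roles: set[str], record_id: str) -> bool:
    """Deny by default: only designated safeguarding roles may read, and
    every attempt, allowed or refused, is written to the audit log."""
    allowed = bool(user_roles & SAFEGUARDING_ROLES)
    audit_log.info("allowed=%s record=%s roles=%s",
                   allowed, record_id, sorted(user_roles))
    return allowed
```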
Video Lessons and Recordings: Consent and Retention
Video recordings of lessons create a distinct compliance challenge. They may contain:
- Biometric data (faces, voices — Article 9 territory if used for identification).
- Special category data disclosed during lessons (health discussions, religious content, family circumstances).
- Data about children who did not know they were being recorded.
GDPR obligations for recorded lessons:
- Consent or alternative legal basis: If recording is not a core contracted service, obtain separate consent from schools (and consider whether parental notification is required).
- Retention policy: Define and enforce a retention period. Most schools operate a 30–90 day maximum for lesson recordings unless there is a specific reason to retain longer. Do not retain recordings indefinitely.
- Access controls: Recordings should be accessible only to the teacher and enrolled students. Prevent unauthorised sharing or download.
- Third-party processing: If your video infrastructure is provided by a third party (Zoom, Whereby, custom WebRTC), that provider is a sub-processor — document it in your DPA.
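Retention can then be enforced with a scheduled job that identifies expired recordings while exempting legal holds. A sketch, assuming a 90-day ceiling (the upper end of the range above):

```python
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 90  # assumption: upper end of the 30-90 day range

def expired_recordings(recordings: list[dict], now: datetime) -> list[str]:
    """Return IDs of recordings past the retention period, exempting any
    under a legal hold (e.g. an ongoing safeguarding investigation)."""
    cutoff = now - timedelta(days=RETENTION_DAYS)
    return [r["id"] for r in recordings
            if r["created_at"] < cutoff and not r.get("legal_hold", False)]
```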
US EdTech Companies and EU/UK Data Transfers
US-headquartered EdTech companies selling into EU and UK schools face a transfer problem. Post-Schrems II, transatlantic data transfers require a valid mechanism:
- EU-US Data Privacy Framework (DPF): The replacement for Privacy Shield, approved by the European Commission in July 2023. US companies can self-certify. However, Schrems III litigation is already in progress — companies relying solely on DPF should also maintain Standard Contractual Clauses as a backup.
- Standard Contractual Clauses (SCCs): The safest long-term mechanism. The 2021 EU SCCs and the UK's International Data Transfer Agreement (IDTA) are the relevant instruments.
- Transfer Impact Assessments (TIAs): Required for SCCs — you must assess whether US surveillance laws (FISA Section 702, Executive Order 12333) undermine the protection afforded by the SCCs for the specific data in question. Children's data warrants a heightened assessment.
- Data residency options: Increasingly, EU and UK schools require data to stay within the EU/UK. US EdTech companies should invest in EU data centre options rather than relying solely on transfer mechanisms.
The ICO's international transfer guidance and the EDPB's Recommendations 01/2020 are required reading before you sign a UK or EU school contract if your infrastructure is US-based.
10 Common GDPR Mistakes EdTech Companies Make
1. Treating the school contract as sufficient authorisation. A contract with a school is not a DPA. You need a compliant Article 28 agreement with each school client, covering all the required elements.
2. Not auditing sub-processors. Listing Google and Microsoft in your DPA but not auditing their sub-processing chains. The DPA requirement flows all the way down.
3. Using student data for product development. Analysing student behaviour to improve your product is processing for a new purpose. It requires its own legal basis, and consent from schools and potentially parents.
4. Indefinite data retention. No documented retention periods, or retention periods that exist on paper but aren't enforced technically. Student data must be deleted when the contract ends — and on schedule during the contract.
5. Ignoring the Children's Code. Assuming GDPR is enough. UK platforms likely to be accessed by children also need to comply with the 15 standards of the AADC.
6. No DPIA for learning analytics. Large-scale processing of children's data for profiling or analytics almost certainly requires a DPIA under Article 35. Many EdTech companies skip this.
7. Inadequate breach procedures. Not having a documented procedure for notifying schools of a breach without undue delay. Remember: schools must notify the ICO within 72 hours, and you are obligated to give them the information they need to do so.
8. Conflating teacher and student data handling. Using the same systems, retention periods, and access controls for staff and student data without considering the different legal bases and obligations.
9. No data export functionality. Making it difficult or impossible for schools to export their data when switching platforms. This creates legal exposure and loses contracts.
10. Ignoring safeguarding data classifications. Processing safeguarding notes, exclusion records, or child protection files through standard analytics pipelines. This data requires isolated, strictly access-controlled handling.
Start with a Compliance Baseline
Before you build out your full GDPR compliance programme, understand what your website and platform are already doing with visitor and user data. EdTech companies often have more third-party data flows than they realise — analytics platforms, marketing pixels, support tools — that need to be documented and disclosed in your privacy notices.
Scan your website free at app.custodia-privacy.com/scan →
Custodia scans your site for trackers, third-party scripts, and cookie consent issues in 60 seconds. No signup required. Use it as the starting point for your EdTech compliance audit.
Last updated: March 2026