Building a platform where kids might be present? The regulatory landscape changed substantially in 2024 and 2025, and the compliance obligations are more specific than many developers realize.
This is a practical breakdown of what the EU Digital Services Act and UK Online Safety Act actually require at the technical level, and what compliant infrastructure looks like.
Are you in scope?
The EU Digital Services Act's child safety obligations (Article 28) apply to any online platform accessible to minors in the EU. "Accessible to minors" is the operative phrase: if children can access your service, you are in scope. You do not have to specifically market to children. The DSA came into full application in February 2024.
The UK Online Safety Act takes a similar approach: services "likely to be accessed by children" in the UK fall under child safety duties. Ofcom is due to publish a categorization register in July 2026 that will explicitly list which services are in scope.
The practical implication: any platform with social features, chat, or user-generated content that children might encounter is likely subject to at least some of these obligations. The "we're too small to worry about it" era is over.
What "proactive safety" actually means
Both the DSA and UK OSA require proactive rather than reactive child safety measures. This is a meaningful distinction.
Reactive safety means: a child reports something harmful, the platform reviews it and takes action. This is the baseline that most platforms operate at today.
Proactive safety means: the platform has systems in place to identify and intervene before harm occurs, based on risk assessment and systematic monitoring.
Specifically, Article 28 of the DSA requires platforms to assess systemic risks to minors and implement mitigation measures. The UK OSA requires services to be "safe by design," with proactive systems rather than purely response-based moderation.
Keyword filters, even sophisticated ones, are primarily reactive. Predators have adapted to them. They avoid flagged terms, use coded language, and spend weeks or months establishing trust before anything overtly harmful appears in message content. By the time a keyword filter triggers, the grooming process has often already advanced significantly.
What satisfies proactive requirements is behavioral monitoring: watching how interactions evolve over time, identifying escalation patterns early, and surfacing risk before explicit content appears.
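To make that concrete, here is a minimal sketch in Python of the shape behavioral monitoring takes: aggregate per-conversation features over time and score the trend, rather than scanning individual messages for keywords. Every field name, weight, and threshold below is an illustrative placeholder, not a production model.

```python
from dataclasses import dataclass


@dataclass
class InteractionSnapshot:
    """Aggregated features for one conversation over a time window.
    All field names are illustrative, not a standard schema."""
    day: int                        # days since first contact
    messages_per_day: float         # contact frequency
    personal_question_rate: float   # share of messages probing age, school, family
    secrecy_score: float            # "don't tell anyone"-style language, 0..1
    platform_move_requests: int     # attempts to move the chat off-platform


def escalation_risk(history: list[InteractionSnapshot]) -> float:
    """Score how strongly a conversation is trending toward escalation:
    rising contact frequency plus rising secrecy and isolation signals.
    Returns 0..1. A real system would use a trained model; this linear
    trend heuristic only illustrates the shape of the approach."""
    if len(history) < 2:
        return 0.0
    first, last = history[0], history[-1]
    span = max(last.day - first.day, 1)
    # Positive slopes on each signal indicate escalation over time.
    freq_trend = max(last.messages_per_day - first.messages_per_day, 0.0) / span
    secrecy_trend = max(last.secrecy_score - first.secrecy_score, 0.0)
    migration = min(sum(s.platform_move_requests for s in history) / 3, 1.0)
    probing = max(s.personal_question_rate for s in history)
    # Weighted combination clamped to [0, 1]; the weights are placeholders.
    score = (0.3 * min(freq_trend, 1.0) + 0.3 * secrecy_trend
             + 0.2 * migration + 0.2 * probing)
    return min(score, 1.0)
```

The point is the inputs: time, trajectory, and relationship structure, none of which a per-message keyword filter sees.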
The audit trail requirement
Both regulations require platforms to demonstrate compliance, which means documentation and audit trails are mandatory, not optional.
DSA Article 28 requires platforms to produce documentation of their risk assessments and mitigation measures. Regulators can demand this evidence. The record-keeping obligation extends across multiple years.
The UK Online Safety Act requires similar audit readiness. Ofcom has enforcement powers including substantial fines, and audit evidence demonstrating proactive safety measures is central to establishing compliance.
For legal proceedings involving child exploitation or grooming, courts and law enforcement also require documentation: who was flagged, what behavioral evidence supported the flag, what action was taken, and when. This documentation needs to be tamper-evident, meaning the platform cannot alter records after the fact without detection.
Cryptographically chained audit logs, retained for at least seven years, satisfy both the regulatory audit requirements and the legal evidence standards.
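The chaining construction itself is simple. Here is a minimal sketch in Python: each record embeds the SHA-256 hash of the previous record, so editing any stored entry breaks verification of everything after it. This shows the general technique, not SENTINEL's or any regulator's mandated format.

```python
import hashlib
import json
import time

GENESIS = "0" * 64  # placeholder hash for the first record


def append_record(log: list[dict], event: dict) -> dict:
    """Append a tamper-evident record. `event` must be JSON-serializable."""
    record = {
        "timestamp": time.time(),
        "event": event,
        "prev_hash": log[-1]["hash"] if log else GENESIS,
    }
    # Hash the canonical JSON form (sorted keys) of the record body.
    payload = json.dumps(record, sort_keys=True).encode()
    record["hash"] = hashlib.sha256(payload).hexdigest()
    log.append(record)
    return record


def verify_chain(log: list[dict]) -> bool:
    """Recompute every hash; any after-the-fact edit is detectable."""
    prev_hash = GENESIS
    for record in log:
        if record["prev_hash"] != prev_hash:
            return False
        body = {k: v for k, v in record.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != record["hash"]:
            return False
        prev_hash = record["hash"]
    return True
```

Seven-year retention then means keeping these records in durable storage and periodically re-running verification.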
The mandatory reporting infrastructure
Platforms operating in the US have mandatory reporting obligations under 18 U.S.C. § 2258A: if a platform becomes aware of apparent child sexual exploitation material, it must report to the National Center for Missing and Exploited Children (NCMEC) CyberTipline. Failure to report is a criminal offense.
The NCMEC reporting process requires specific documentation: user information, timestamps, platform context, and the flagged content. Generating these evidence packages manually is error-prone and slow. Compliance infrastructure should automate this documentation so that when a platform files a report, the evidence package is ready.
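As a sketch of what "ready" means here, the structure below bundles the documentation at flag time. The field names are hypothetical, and the actual CyberTipline submission schema is defined by NCMEC; this only illustrates assembling the internal evidence bundle automatically rather than by hand.

```python
from dataclasses import dataclass, field, asdict
import json
import time


@dataclass
class EvidencePackage:
    """Internal bundle assembled the moment content is flagged as reportable.
    Illustrative fields only; the real submission format is NCMEC's."""
    reported_user_id: str
    flagged_content_ids: list[str]
    detection_timestamps: list[float]   # when each signal fired
    behavioral_signals: list[str]       # plain-language flag explanations
    audit_log_refs: list[str]           # hashes of the chained audit records
    assembled_at: float = field(default_factory=time.time)

    def to_json(self) -> str:
        """Serialize for human review and attachment to the filed report."""
        return json.dumps(asdict(self), indent=2)
```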
The GDPR and COPPA intersection
Platforms serving users across jurisdictions face an intersection problem. COPPA (US) applies to platforms collecting personal information from children under 13. GDPR (EU) applies to personal data of EU residents, with heightened protections for children's data. The UK GDPR, the post-Brexit equivalent, maintains similar protections.
These frameworks have different requirements around data retention, parental consent, and erasure. A platform operating internationally needs to satisfy all of them simultaneously. The infrastructure for this includes jurisdiction-aware data retention policies, automated erasure workflows for deletion requests, parental consent mechanisms and records, and separation of data handling for minors versus adult users.
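The retention and consent piece usually reduces to a policy table consulted at collection and deletion time. A minimal sketch, using the commonly cited consent-age defaults (COPPA: under 13; GDPR: 16 by default, with member states able to lower it to 13; UK GDPR: 13); treat the specific values as assumptions to confirm with counsel:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class MinorDataPolicy:
    """Per-jurisdiction rules for minors' data. Values are commonly cited
    defaults, encoded here as assumptions; confirm with counsel."""
    consent_age: int         # parental consent required below this age
    erasure_supported: bool  # automated erasure workflow is wired up


POLICIES: dict[str, MinorDataPolicy] = {
    "US-COPPA": MinorDataPolicy(consent_age=13, erasure_supported=True),
    "EU-GDPR":  MinorDataPolicy(consent_age=16, erasure_supported=True),
    "UK-GDPR":  MinorDataPolicy(consent_age=13, erasure_supported=True),
}


def requires_parental_consent(jurisdiction: str, age: int) -> bool:
    """Check whether collecting this user's data needs verified parental
    consent under the framework that applies to them."""
    return age < POLICIES[jurisdiction].consent_age
```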
What compliant infrastructure actually needs
Pulling this together, a platform taking its child safety compliance obligations seriously needs:
- Proactive behavioral detection that identifies escalation patterns before explicit harm occurs
- Tamper-evident audit logs retained for at least seven years, cryptographically chained so records cannot be altered
- Risk assessment documentation recording what the platform assessed and what mitigations were implemented
- NCMEC CyberTipline evidence packages generated automatically when reportable content is identified
- Jurisdiction-aware data handling covering GDPR, COPPA, and UK data protection requirements
- Explainable moderation decisions so human moderators and regulators can understand why a user was flagged (a sketch of such a record follows this list)
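For that last item, here is a hypothetical shape for an explainable flag record: one entry per triggered signal, rendered into the plain-language summary a moderator or auditor actually reads. Names and format are illustrative.

```python
from dataclasses import dataclass


@dataclass
class FlagExplanation:
    """One human-readable record per triggered behavioral signal."""
    signal: str     # e.g. "platform migration requests"
    evidence: str   # plain-language description of what was observed
    weight: float   # contribution to the overall risk score


def render_explanation(risk_score: float,
                       signals: list[FlagExplanation]) -> str:
    """Produce the summary shown to moderators and kept for audit."""
    lines = [f"Risk score {risk_score:.2f}, driven by:"]
    for s in sorted(signals, key=lambda s: s.weight, reverse=True):
        lines.append(f"  - {s.signal} (weight {s.weight:.2f}): {s.evidence}")
    return "\n".join(lines)
```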
The compliance gap
This infrastructure has historically been expensive to build and only accessible to large platforms. GDPR compliance consultants, behavioral detection systems, and audit infrastructure are not cheap.
This creates a genuine problem: the largest platforms have dedicated trust and safety teams and reasonable compliance budgets. Smaller platforms, often the ones children encounter in gaming, social, and creative communities, have almost nothing.
The DSA and UK OSA apply to smaller platforms too. The July 2026 Ofcom categorization register and continued DSA enforcement will make this increasingly difficult to ignore.
SENTINEL
We built SENTINEL as an open-source reference implementation of exactly this compliance stack. It ships with:
- Behavioral detection across four signal types (linguistic, graph, temporal, fairness)
- A demographic parity enforcement gate that blocks deployment if the detection model disproportionately flags any group (sketched below)
- Tamper-evident, cryptographically chained audit logs with seven-year default retention
- Automated NCMEC CyberTipline evidence package generation
- Jurisdiction-aware GDPR and COPPA data handling
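To illustrate what a parity gate checks (a sketch of the idea only, not SENTINEL's actual implementation): compare each group's flag rate against the mean across groups, and fail the deployment if any group deviates beyond a tolerance.

```python
def parity_gate(flag_rates: dict[str, float], tolerance: float = 0.2) -> bool:
    """Pass only if every group's flag rate is within `tolerance` (relative)
    of the mean rate across groups. The threshold is a policy choice."""
    overall = sum(flag_rates.values()) / len(flag_rates)
    return all(abs(rate - overall) <= tolerance * overall
               for rate in flag_rates.values())


# Roughly equal flag rates across groups pass the gate:
assert parity_gate({"group_a": 0.021, "group_b": 0.019, "group_c": 0.020})
```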
Every risk score comes with a plain-language explanation of the specific behavioral signals that triggered it, so moderators and regulators can understand and document the decision.
SENTINEL is free for platforms under $100k annual revenue and for all non-commercial and research use. It is fully open source.