The EU AI Act is often talked about as a technology problem. It isn't. It's a documentation and governance problem — and that lands squarely on your Data Protection Officer.
Here's what DPOs and compliance leads need to understand before the 2 August 2026 enforcement deadline.
You are now responsible for AI systems, not just data
Under the EU AI Act, high-risk AI systems require a designated person accountable for compliance. If your organisation already has a DPO structure, you're likely the closest thing to that person. That means understanding risk classification, maintaining technical documentation, and being able to demonstrate conformity to regulators on demand.
Risk classification is the first hurdle — and it's harder than it sounds
The Act lists eight areas of high-risk AI in Annex III. Systems used in recruitment, credit scoring, education, critical infrastructure, law enforcement, and biometric identification fall into them, while AI embedded in regulated products such as medical devices is captured separately under Annex I. But the classification isn't always obvious. A tool your company uses for "internal HR" might qualify. A scoring model built into your CRM might qualify.
If you can't classify your AI systems, you can't know what obligations apply to you.
What high-risk classification actually requires
If a system is classified as high-risk, your obligations include:
- A conformity assessment
- Technical documentation (Article 11)
- A risk management system (Article 9)
- Data governance requirements (Article 10)
- Logging and auditability (Article 12)
- Transparency requirements for users (Article 13)
- Human oversight mechanisms (Article 14)
This is not a checkbox exercise. Regulators will expect you to demonstrate these in practice.
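To make one of these obligations concrete: Article 12's logging requirement means decision events must be recorded automatically in a form you can hand to a regulator. The sketch below shows what a minimal log entry might look like; the field names and `log_decision` helper are assumptions for illustration, not a format mandated by the Act.

```python
# Hypothetical sketch of an Article 12-style event log entry for a
# high-risk AI system. Field names are illustrative assumptions.
import json
from datetime import datetime, timezone

def log_decision(system_id: str, input_ref: str, output: str,
                 overseer: str) -> str:
    """Serialise one automatically recorded decision event as JSON."""
    event = {
        "system_id": system_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "input_ref": input_ref,     # pointer to the input, not the raw data
        "output": output,
        "human_overseer": overseer, # supports Article 14 human oversight
    }
    return json.dumps(event)

entry = json.loads(log_decision("cv-screener-v2", "application/1234",
                                "shortlisted", "hr.lead@example.com"))
print(entry["system_id"])  # cv-screener-v2
```

Whatever shape your logs take, the point is auditability: timestamped, tamper-evident records that tie each output to a system version and an accountable human.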
The documentation gap is where most organisations will fail
In practice, most companies have no centralised record of which AI systems they operate, let alone documentation that meets Article 11 requirements. DPOs who've built GDPR record-of-processing-activities (RoPA) frameworks will recognise this problem — the EU AI Act requires a similar inventory exercise, but for AI systems rather than data.
Start with an AI system inventory. Map every tool, model, or automated decision system your organisation uses or provides. Then apply the risk classification criteria.
Practical starting point
ActComply (getactcomply.com) automates the classification step and generates draft Article 11 documentation. Free to try, no account required.
The 2 August 2026 deadline is not a soft launch: obligations for Annex III high-risk systems apply in full from that date, and national authorities can enforce them. DPOs who start now have time to do this properly. Those who wait until Q2 2026 won't.
Zac is the founder of ActComply, an EU AI Act compliance tool for technical teams and compliance professionals.