DEV Community

Custodia-Admin

Posted on • Originally published at app.custodia-privacy.com

GDPR for Legal Tech Companies: Client Matter Data, Document Management, and Legal Professional Privilege


Date: March 27, 2026
Read time: 9 min read
Tags: GDPR, Legal Tech, SaaS


Legal tech sits at one of the most legally complex intersections in data protection: a space where GDPR obligations collide head-on with legal professional privilege, solicitor–client confidentiality, and sector-specific regulation. If you build document management platforms, matter management software, e-discovery tools, or contract review systems for law firms, you need to understand this terrain.

This guide cuts through the complexity with a practical framework.


Why Legal Tech Is a Special Case Under GDPR

Most SaaS companies process personal data as a straightforward product function — user accounts, analytics, support tickets. Legal tech platforms are different. The core data they process is someone else's client matter data: names, addresses, financial circumstances, disputes, health conditions, criminal proceedings.

That data belongs to individuals who almost certainly have no idea their information sits on your infrastructure. They engaged a law firm. The law firm uses your platform. The chain of responsibility runs through Data Processing Agreements, sub-processor clauses, and retention schedules that most legal tech companies have not fully thought through.

The ICO (Information Commissioner's Office) and other data protection authorities are increasingly paying attention to legal tech vendors following high-profile ransomware attacks on law firms — attacks that exposed hundreds of thousands of client records.


What Data Legal Tech Platforms Actually Hold

Before you can build a compliant product, you need to be honest about what you process. Legal tech platforms typically hold:

  • Client identification data — names, dates of birth, addresses, ID document numbers
  • Matter details — case type, opposing party, dispute amounts, case outcomes
  • Contract content — commercial agreements containing personal data of counterparties and signatories
  • Court documents — witness statements, affidavits, medical reports, expert evidence
  • Correspondence — emails and letters between solicitor and client, often including sensitive personal circumstances
  • Financial information — billing records, disbursement details, client account balances
  • Conflict check data — databases of existing and former clients used to check for conflicts of interest

Much of this will qualify as special category data under Article 9 GDPR — health information, data revealing racial or ethnic origin, and similar — which can only be processed where a condition under Article 9(2) applies in addition to an ordinary Article 6 lawful basis. Data about criminal convictions and proceedings falls under Article 10, which imposes its own restrictions. Either way, the compliance burden is significantly higher than for ordinary personal data.


Controller vs Processor: Where Does Your Platform Sit?

This is the question legal tech companies get wrong most often.

The law firm is the data controller. It determines why client data is collected and how it's used. The firm has the direct relationship with the data subject.

Your platform is the data processor. You process that data on the firm's instructions — storing it, indexing it, enabling search, generating documents. You do not determine the purposes of processing.

This matters enormously because:

  • As a processor, you can only process data on documented instructions from the controller
  • You cannot use client matter data for your own purposes (training ML models, product analytics) without explicit written permission
  • You must assist the controller (the law firm) in responding to data subject rights requests
  • You are directly liable under GDPR Article 82 for breaches caused by your own failures

Some legal tech platforms blur this line by offering "benchmarking" or "market intelligence" features that aggregate client data across multiple firm customers. If you do this, you are acting as a joint controller for those processing activities — which requires a joint controller agreement and additional transparency to data subjects.


Data Processing Agreements with Law Firm Clients

Every law firm using your platform must sign a DPA. Under Article 28 GDPR, this is not optional — it is a legal requirement. Without one, both you and the firm are in breach.

Your DPA must cover:

  • The subject matter, duration, nature, and purpose of the processing
  • The type of personal data and categories of data subjects
  • Your obligations and rights as processor
  • Restrictions on sub-processing (see the section on electronic signature platforms below)
  • Security measures (Article 32)
  • Breach notification timelines (under Article 33(2), you must notify the firm without undue delay after becoming aware of a breach — the firm's own 72-hour ICO notification clock starts when it becomes aware, so any delay on your side eats directly into its deadline)
  • Post-termination deletion or return of data

Law firms are increasingly sophisticated about DPAs. Mid-size and larger firms will send you their DPA template rather than accepting yours. Have your legal terms reviewed by a data protection solicitor before you start selling into the enterprise legal market.


Legal Professional Privilege and Subject Access Requests

Here is where things get genuinely complicated.

Under GDPR Article 15, data subjects have the right to request a copy of all personal data held about them. For individuals who have been party to legal proceedings, this could include correspondence, witness statements, and privileged legal advice held on your platform.

Can law firms use legal professional privilege (LPP) to refuse a subject access request?

Yes — but narrowly. LPP is recognised as an exemption under Schedule 2, Part 4, paragraph 19 of the UK Data Protection Act 2018. The exemption applies to the content of confidential communications between solicitor and client made for the purpose of legal advice or litigation. It does not exempt all data about the individual — basic identification data and contact information cannot be withheld on LPP grounds.

Your platform's role: You must be able to extract data about specific individuals at the firm's request without inadvertently providing access to other clients' privileged material. Your search and export functionality needs to support granular, per-individual data extraction with appropriate access controls.
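A minimal sketch of that extraction logic, assuming hypothetical `MatterDocument` records tagged with subject IDs and a privilege flag. Privileged material is routed to solicitor review rather than exported automatically, since LPP covers the content of privileged communications, not the mere fact that data is held:

```python
from dataclasses import dataclass

@dataclass
class MatterDocument:
    subject_ids: frozenset  # data subjects referenced in the document
    privileged: bool        # flagged as privileged advice/litigation material
    doc_type: str           # e.g. "correspondence", "id_record"

def sar_export(documents, subject_id):
    """Partition a firm's documents for a SAR about one individual.

    Non-privileged documents mentioning the subject are disclosable;
    privileged material goes to solicitor review, never straight out
    the door.
    """
    disclosable, for_review = [], []
    for doc in documents:
        if subject_id not in doc.subject_ids:
            continue  # never sweep other clients' matter data into the export
        (for_review if doc.privileged else disclosable).append(doc)
    return disclosable, for_review
```

In a real platform the same partition would run as a filtered query with per-matter access controls, not an in-memory loop — the point is that privilege is a first-class attribute of every document, checked at export time.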


Conflict Checking Systems and Personal Data

Conflict checking databases are a legal tech product category with a specific GDPR problem: they maintain records of former clients, opposing parties, and related individuals — often indefinitely — for the purpose of identifying conflicts of interest.

This data is processed under legitimate interests (Article 6(1)(f)) — the firm's legitimate interest in complying with its professional obligations under the SRA Standards and Regulations. But legitimate interests processing still requires:

  • A legitimate interests assessment (LIA) to document the balancing test
  • Retention periods proportionate to the risk of a conflict arising (not "forever")
  • A privacy notice that informs individuals their data may be held for conflict checking purposes
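One way to make retention proportionate is to drive deletion dates from per-category periods agreed in the LIA. A sketch with a hypothetical `conflict_deletion_date` helper — the periods shown are purely illustrative, not recommendations:

```python
from datetime import date

# Illustrative retention periods per record category (years). Real values
# should come out of the firm's legitimate interests assessment, not code.
CONFLICT_RETENTION_YEARS = {
    "former_client": 15,      # long-tail conflict risk, e.g. probate work
    "opposing_party": 7,
    "prospective_client": 2,  # pitch that never converted
}

def _add_years(d, years):
    try:
        return d.replace(year=d.year + years)
    except ValueError:  # 29 Feb landing in a non-leap target year
        return d.replace(year=d.year + years, day=28)

def conflict_deletion_date(category, matter_closed_on):
    """Earliest date a conflict-check record becomes eligible for deletion."""
    return _add_years(matter_closed_on, CONFLICT_RETENTION_YEARS[category])
```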

If your platform provides centralised conflict checking across multiple firms — a network approach — you are likely acting as a controller for the conflict database itself, not a processor. This changes your obligations significantly.


E-Discovery, Litigation Hold, and GDPR Deletion Rights

Article 17 GDPR gives individuals the right to erasure. Article 18 gives them the right to restrict processing. Both rights create direct friction with litigation hold obligations.

When a client (or opposing party) requests deletion of their personal data, and that data is subject to a litigation hold, the competing obligations must be resolved. UK law provides the answer: Schedule 2, paragraph 5 of the DPA 2018 exempts processing that is necessary for the establishment, exercise, or defence of legal claims from data subject rights including erasure and restriction.

In practice, this means your platform should support litigation hold flags that temporarily suspend automated deletion rules for specific matters or specific data sets. When the hold is lifted (because litigation concludes or a limitation period expires), normal retention schedules should resume.

Build this into your product architecture. Law firms using your platform need to evidence that holds are in place, applied consistently, and lifted appropriately. This is an audit function, not just a toggle.
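A minimal sketch of such a hold mechanism, with hypothetical `Matter` and `deletion_sweep` names. The hold trail is append-only, so it doubles as the audit evidence firms need:

```python
class Matter:
    """Minimal matter record with an auditable litigation-hold trail."""

    def __init__(self, matter_id, delete_after):
        self.matter_id = matter_id
        self.delete_after = delete_after  # end of normal retention (a date)
        self.holds = []                   # every hold event, never deleted

    def apply_hold(self, reason, applied_by, applied_on):
        self.holds.append({"reason": reason, "applied_by": applied_by,
                           "applied_on": applied_on, "lifted_on": None})

    def lift_holds(self, lifted_on):
        for hold in self.holds:
            if hold["lifted_on"] is None:
                hold["lifted_on"] = lifted_on

    @property
    def on_hold(self):
        return any(h["lifted_on"] is None for h in self.holds)

def deletion_sweep(matters, today):
    """Return (deleted, skipped) matter IDs; a live hold suspends deletion."""
    deleted, skipped = [], []
    for m in matters:
        if m.delete_after > today:
            continue  # still inside normal retention
        (skipped if m.on_hold else deleted).append(m.matter_id)
    return deleted, skipped
```

Once the hold is lifted, the next sweep picks the matter up again under its normal schedule — no manual re-enabling, no orphaned data.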


AI Contract Review and Automated Decision-Making

AI contract review tools engage Article 22 GDPR where a decision producing legal effects, or similarly significant effects, for an individual is based solely on automated processing.

Most legal AI tools are assistive rather than decisional — a human solicitor reviews the AI's output and makes the final call. Provided that human review is genuine and not perfunctory, Article 22 does not apply.

But watch for two scenarios where it might:

  1. Automated risk scoring of contracts that gates whether a transaction proceeds without solicitor review
  2. Automated conflict checking that rejects instructions based on algorithmic output without human review

If your product does either of these, you need to provide data subjects with the right to request human review, explain the logic of the automated decision, and contest the outcome.

Your privacy documentation should clearly state whether your AI features involve automated decision-making, the logic involved, and the significance for data subjects.
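One way to keep the human review genuine is to make the AI output a recommendation that cannot become final without a named reviewer who has authority to overrule it. A sketch with hypothetical names and an arbitrary threshold:

```python
def finalise_contract_decision(ai_risk_score, threshold=0.8,
                               reviewer=None, reviewer_agrees=None):
    """Gate an AI risk score so no decision with legal effect is made
    solely by automated means.

    The score only ever produces a recommendation; a named human
    reviewer must sign off before the decision becomes final. If that
    review were perfunctory (always agreeing, no power to overrule),
    Article 22 could still bite despite this gate.
    """
    recommendation = "reject" if ai_risk_score >= threshold else "proceed"
    if reviewer is None or reviewer_agrees is None:
        return {"recommendation": recommendation, "final": False}
    if reviewer_agrees:
        outcome = recommendation
    else:  # reviewer overrules the model
        outcome = "proceed" if recommendation == "reject" else "reject"
    return {"recommendation": recommendation, "final": True,
            "outcome": outcome, "reviewer": reviewer}
```

Logging the reviewer identity alongside the model's recommendation also gives you the evidence trail to show the review is real, not a rubber stamp.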


Electronic Signature Platforms as Sub-Processors

If your document management platform integrates with DocuSign, Adobe Sign, or similar electronic signature services, those providers are your sub-processors under Article 28(4) GDPR.

You need:

  • Written authorisation from your law firm clients to engage those sub-processors (usually covered by a general authorisation clause in your DPA with a list of approved sub-processors)
  • Your own DPA with each sub-processor imposing equivalent obligations
  • A change notification mechanism — you must give firms advance notice when you add or replace sub-processors so they can object
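A sketch of that change notification mechanism, assuming a hypothetical in-memory change log and an illustrative 30-day notice period (your DPA sets the real figure):

```python
from datetime import date, timedelta

NOTICE_PERIOD_DAYS = 30  # illustrative; the notice period in your DPA governs

def propose_subprocessor(change_log, name, purpose, announced_on):
    """Record a proposed sub-processor change with an objection window.

    Firms are notified on `announced_on`, and the change cannot take
    effect before the notice period elapses — preserving their right
    to object under the general authorisation model of Article 28(2).
    """
    entry = {
        "name": name,
        "purpose": purpose,
        "announced_on": announced_on,
        "effective_from": announced_on + timedelta(days=NOTICE_PERIOD_DAYS),
        "objections": [],  # firm IDs objecting before effective_from
    }
    change_log.append(entry)
    return entry
```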

Check where your e-signature providers process data. Some route through US data centres. This triggers the international data transfer requirements discussed below.


Matter Archiving, Retention, and SRA Obligations

Law firms operate under SRA guidance that implies certain records should be kept for extended periods — files for residential conveyancing have traditionally been retained for 12–15 years, for example. This creates apparent tension with GDPR's storage limitation principle.

The tension is manageable:

  • GDPR does not prohibit retention for professional obligation purposes — Article 5(1)(e) explicitly allows retention for longer periods where necessary for archiving, legal claims, or compliance with legal obligations
  • The SRA's own guidance acknowledges that firms must balance their professional obligations with data protection requirements
  • "Archiving" under GDPR means the data must be genuinely inaccessible for routine operational use — archived matters should be isolated from live systems, with access requiring explicit justification

Your platform should support tiered retention policies: active matter, closed matter (accessible), and archived matter (isolated, accessible only by request). Automated deletion schedules at the end of the archived tier complete the lifecycle.
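The tier transitions above can be sketched as a pure classification function; the names and year thresholds here are illustrative, since real values depend on matter type and the firm's retention schedule:

```python
from datetime import date

def retention_tier(matter_closed_on, today,
                   archive_after_years=1, delete_after_years=15):
    """Classify a matter into the tier that should govern its storage."""
    if matter_closed_on is None:
        return "active"
    age_years = (today - matter_closed_on).days / 365.25
    if age_years >= delete_after_years:
        return "delete"    # end of lifecycle: automated erasure
    if age_years >= archive_after_years:
        return "archived"  # isolated storage, access by explicit request only
    return "closed"        # searchable but read-only
```

Running this classification on a schedule, and acting on the "delete" tier automatically (subject to litigation holds), is what turns a documented retention policy into an enforced one.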


International Data Transfers in Cross-Border Legal Matters

Legal tech platforms operating in the UK and EU regularly face cross-border data transfer issues:

  • US-headquartered platforms processing UK/EU client matter data must rely on UK-US Data Bridge or Standard Contractual Clauses (SCCs) for UK-originating transfers
  • Platforms with EU law firm customers and US infrastructure need Article 46 transfer mechanisms for EU-originating data
  • Cross-border legal matters involving parties in multiple jurisdictions do not in themselves create a legal basis to transfer personal data internationally — you still need a transfer mechanism

For law firms conducting US litigation (discovery involving UK individuals), there is specific guidance on navigating the conflict between US discovery orders and GDPR. Your platform documentation should acknowledge this tension and direct firms to appropriate legal advice.


Security Requirements for Legal Tech

Legal tech platforms face layered security expectations:

GDPR Article 32 requires appropriate technical and organisational security measures — encryption, access controls, pseudonymisation where appropriate, regular testing.

ISO 27001 certification is increasingly a prerequisite for enterprise law firm procurement. Larger firms run third-party supplier due diligence processes that include security questionnaires and sometimes full audits.

Cyber Essentials (or Cyber Essentials Plus) is the UK government's baseline standard and is commonly required of suppliers to UK public sector bodies, including the Crown Prosecution Service, HMRC, and the NHS.

SRA Cybersecurity Guidance (2023) sets expectations for law firms on vendor management. Firms are expected to assess the security posture of technology suppliers handling client data.

If you are not yet ISO 27001 certified, publish your security whitepaper, make your penetration test summary available under NDA, and articulate your progress toward certification. Procurement teams at law firms will ask.


Biometric Authentication and Secure Document Access

Biometric authentication for document access (fingerprint, Face ID) involves processing biometric data, which is special category data under Article 9(1) GDPR.

This means you need:

  • A condition under Article 9(2) — in practice most platforms rely on Article 9(2)(a) explicit consent; Article 9(2)(f) (legal claims) is sometimes cited but rarely fits routine authentication
  • Documented retention periods for biometric templates (should be as short as technically feasible)
  • A clear alternative authentication method for users who do not consent to biometric processing

Do not roll out biometric features without a Data Protection Impact Assessment (DPIA). The combination of biometric data and highly sensitive legal matter data is exactly the kind of high-risk processing that triggers a mandatory DPIA under Article 35.


10 Common GDPR Mistakes Legal Tech Companies Make

  1. No DPA with law firm customers. Every customer relationship must be covered. This is a legal requirement, not optional.

  2. Using client matter data to train AI models without explicit permission. This is processing for a new purpose. It requires a fresh lawful basis and almost certainly means you are acting as a controller, not a processor, for that processing activity.

  3. No sub-processor list. Firms cannot authorise what they do not know about. Maintain and publish a current list of all sub-processors.

  4. Breach notification gaps. Your obligation to notify the firm runs from when you become aware, not when you complete your investigation. Alert first, investigate in parallel.

  5. Retention schedules not enforced. Most platforms have retention policies in their documentation that do not actually run as automated deletion rules in the product. Regulators expect documentation to match practice.

  6. Conflict database retained indefinitely. Conflict data has a purpose and a proportionate retention period. "We keep it forever because we might need it" is not a position that survives regulatory scrutiny.

  7. DPIA skipped for high-risk features. New features processing special category data, biometrics, or data at large scale require a DPIA before launch. This is not a post-launch exercise.

  8. No documented international transfer mechanism. If your infrastructure is US-based and you process UK or EU client data, document your transfer mechanism explicitly in your DPA and privacy notice.

  9. Subject access request process not tested. Can you extract all data held about a specific individual across all your systems in a reasonable timeframe? Test this before a firm asks you to support a SAR.

  10. Treating LPP as a blanket exemption. Legal professional privilege does not exempt all processing about an individual — it exempts privileged communications. Overclaiming this exemption increases the risk of ICO enforcement.


Run a Compliance Baseline on Your Platform

Privacy compliance in legal tech starts with understanding what your platform and marketing properties are actually doing with visitor and user data. Before you approach law firm procurement teams or enterprise security reviewers, make sure your own front-end compliance is clean.

Scan your website free at https://app.custodia-privacy.com/scan — identify trackers, missing consent mechanisms, and third-party data flows in 60 seconds, no signup required.


Last updated: March 2026
