DEV Community

Custodia-Admin

Posted on • Originally published at app.custodia-privacy.com

GDPR and IoT Devices: Privacy Obligations for Connected Product Manufacturers

Smart home sensors that log when you leave for work. Fitness trackers that infer your menstrual cycle from heart rate data. Connected cars that build a map of everywhere you drive. Smart plugs that reveal home occupancy patterns precise enough to determine when a property is empty. Industrial sensors that monitor individual workers' movements and productivity.

The Internet of Things is, in many ways, the largest personal data collection apparatus ever built — operating continuously, invisibly, and in the most intimate spaces of people's lives. For manufacturers and product companies building these devices, GDPR creates a set of obligations that differ meaningfully from those facing web businesses. The stakes are higher, the technical constraints are different, and the regulatory expectations have evolved significantly.

This guide covers what GDPR actually requires from connected product manufacturers — from device design through data sharing, firmware updates, and the intersecting requirements of the EU Cyber Resilience Act.


What IoT Devices Collect — and Why It's Personal Data

The first compliance question for any IoT manufacturer is deceptively simple: what does our device collect, and does any of it constitute personal data under GDPR?

The answer is almost always yes — and often more extensively than initially apparent.

Location data is an obvious example. GPS-enabled devices — cars, fitness trackers, asset trackers — generate continuous location histories that are, by definition, personal data. But location can be inferred from devices that don't have GPS. Smart home devices that connect to a home Wi-Fi network reveal the home address. Smart meters reveal not just energy consumption but, when analysed at sufficient granularity, occupancy patterns, appliance usage, and daily routines.

Behavioural data emerges from almost any IoT device. A smart plug records when specific appliances are switched on and off. A smart lock records entry and exit times. A connected coffee machine records when someone wakes up. Individually, each data point might seem trivial. Aggregated over weeks and months, these streams build a detailed picture of an individual's daily life.

Health inferences are particularly sensitive. Smart scales don't just measure weight — they calculate BMI, track trends, and increasingly infer metabolic health indicators. Fitness trackers measure heart rate variability, sleep stages, stress levels, and activity patterns. These devices generate health data that qualifies as special category data under GDPR Article 9, triggering significantly stricter processing requirements.

Voice data and ambient audio collected by voice assistants represent some of the most intimate data an IoT device can capture. Even devices that are theoretically "always listening only for the wake word" create significant privacy exposure — both because of the volume of audio inevitably captured and because recordings are often processed remotely.

Industrial IoT raises workplace privacy considerations. Sensors tracking individual worker movements, productivity metrics, or biometric data on a manufacturing floor are processing personal data, and the power imbalance between employer and employee means consent is rarely a valid legal basis.


Who Is the Data Controller?

For web services, the data controller question is usually straightforward. For IoT products, it can be genuinely complex — and getting it wrong has direct compliance consequences.

Consider a typical smart home device: the device itself is manufactured by one company, the companion app is operated by another (or by the same company's software division), the cloud backend where data is processed and stored may be a third entity, and the device may integrate with third-party services like Google Home, Amazon Alexa, or Apple HomeKit.

Under GDPR, the data controller is the entity that determines the purposes and means of processing personal data. Where two or more parties jointly determine purposes and means, they are joint controllers under Article 26, with specific obligations around transparency and inter-controller arrangements.

In practice:

  • If the manufacturer controls both the device firmware and the cloud backend, they are likely the sole controller for core processing activities.
  • If the manufacturer's app sends data to a third-party analytics platform, that platform is a data processor — and the manufacturer needs a valid data processing agreement.
  • If the device integrates with a smart home ecosystem like Google Home, there may be joint controller arrangements to document and disclose.
  • If a white-label device is sold under another brand's name, the question of who controls the data requires careful legal analysis of the actual data flows.

Manufacturers often underestimate their controller responsibilities when they sell through retailers or license their technology to other brands. The data subject — the person using the device — will hold someone accountable when things go wrong. GDPR requires that accountability to be clearly allocated before that happens.


Purpose Limitation in AI-Enhanced IoT

One of the most significant compliance challenges for modern IoT products is purpose limitation. GDPR Article 5(1)(b) requires that personal data be collected for specified, explicit, and legitimate purposes and not further processed in a manner incompatible with those purposes.

The problem: IoT devices increasingly use machine learning to find secondary uses for the data they collect. A fitness tracker sold for activity monitoring might, in later firmware versions, add stress detection, sleep coaching, or fertility tracking. A smart home hub sold for energy management might add occupancy prediction. A connected car's telematics system might be repurposed for insurance scoring.

Each of these secondary uses potentially requires a new legal basis. If the original consent was for fitness tracking, it may not cover health inferences. If the original legitimate interest assessment covered smart home automation, it may not cover marketing analytics derived from occupancy data.

Before launching secondary processing use cases, manufacturers need to ask:

  1. Is the new purpose compatible with the original purpose (applying the Article 6(4) compatibility test)?
  2. If not, what is the legal basis for the new processing?
  3. If the new basis is consent, do users need to be re-asked — and what happens if they decline?
  4. Have we updated our privacy notice to disclose the new processing?

This has real product consequences. You cannot simply ship a firmware update that adds new data processing capabilities and treat existing users as having consented by virtue of not opting out.
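The four questions above amount to a launch gate. As an illustration only — the names, flags, and dataclass here are invented for this sketch, and the Article 6(4) compatibility test is a legal assessment that cannot actually be reduced to a boolean — the decision logic might look like this:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProcessingChange:
    """A proposed secondary use of already-collected device data (illustrative model)."""
    compatible: bool                 # outcome of the Article 6(4) compatibility test
    legal_basis: Optional[str]       # e.g. "consent" or "legitimate_interest" if incompatible
    consent_refreshed: bool = False
    notice_updated: bool = False

def may_launch(change: ProcessingChange) -> bool:
    """Gate the rollout of a new processing purpose on the four questions above."""
    if not change.notice_updated:        # question 4: the notice must disclose it
        return False
    if change.compatible:                # question 1: compatible with the original purpose
        return True
    if change.legal_basis is None:       # question 2: an incompatible purpose needs a new basis
        return False
    if change.legal_basis == "consent":  # question 3: consent must be re-asked, not assumed
        return change.consent_refreshed
    return True
```

The point of encoding the gate is organisational: a product team cannot ship the feature until every input to the function has been answered deliberately.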


Firmware Updates and Consent

This brings us to one of the most practically challenging questions in IoT privacy compliance: can you update processing purposes through a firmware update?

The short answer is no — not without a valid legal basis for the new processing, appropriate disclosure, and, where consent is required, fresh consent.

When a manufacturer ships a firmware update that changes what data is collected or how it is used, they are potentially changing the purpose of processing. Users who consented to the original processing have not consented to the new processing. Legitimate interest claims for the new processing require a fresh balancing test.

Best practice for manufacturers:

  • Version your privacy notices — each major release that changes data processing should be accompanied by an updated privacy notice, communicated to users via the app or email.
  • Distinguish between functional and data-processing changes — a firmware update that fixes a security vulnerability does not require fresh consent. A firmware update that adds a new analytics feature does.
  • Build consent infrastructure into the companion app — make it possible to present users with new consent requests when they open the app after a major update.
  • Respect opt-outs — if a user declines new processing, the device should continue to function for its original purpose without the new data collection.
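One way to build that consent infrastructure is to version consent per purpose, then diff the user's stored consents against what the current firmware requires when the app opens. The purpose names and version numbers below are hypothetical; this is a minimal sketch of the pattern, not a reference implementation:

```python
from typing import Dict, List

# Hypothetical mapping: processing purpose -> notice version required
# by the currently installed firmware release.
REQUIRED_CONSENTS = {
    "core_telemetry": 1,   # original purpose, covered at initial setup
    "sleep_coaching": 2,   # added in a later firmware release
}

def consents_to_request(user_consents: Dict[str, int]) -> List[str]:
    """Purposes the app must (re-)ask about after an update: anything the
    user never consented to, or consented to only at an older version."""
    return [
        purpose for purpose, version in REQUIRED_CONSENTS.items()
        if user_consents.get(purpose, 0) < version
    ]

def feature_enabled(purpose: str, user_consents: Dict[str, int]) -> bool:
    """Respect opt-outs: a declined purpose stays off, while the device
    keeps working for the purposes the user did agree to."""
    return user_consents.get(purpose, 0) >= REQUIRED_CONSENTS[purpose]
```

A user who declines `sleep_coaching` simply never has that feature activated; `core_telemetry` continues unaffected, which is exactly the "continue to function for its original purpose" behaviour described above.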

Voice Assistants and Ambient Data Collection

Voice-activated IoT devices — smart speakers, connected TVs, in-car voice systems — present unique GDPR challenges around ambient data collection.

The technical reality is that distinguishing "wake word detection" from "ambient listening" is difficult to explain to consumers and difficult to prove to regulators. Supervisory authorities across Europe have taken an increasingly sceptical view of claims that voice assistants only process audio after the wake word is detected, given the processing required to detect the wake word in the first place.

Key obligations for voice assistant manufacturers:

Transparency: Privacy notices must clearly explain what audio is captured, how long recordings are retained, whether recordings are reviewed by humans (even on an anonymised basis), and how users can request deletion.

Purpose limitation: Audio captured for wake word detection cannot be retained and used for advertising profiling without a separate legal basis. Multiple regulators have found this practice unlawful.

Access and deletion: Users have a right to access recordings made by their devices and to request deletion. Manufacturers need functional mechanisms for both.

Children: If a voice assistant is in a home with children (and they almost always are), the manufacturer should have considered the likelihood that the device captures children's voices and addressed this in their DPIA.


Smart Home Data and Article 9: Health Inferences

Article 9 of GDPR provides heightened protection for "special categories" of personal data, including health data, data revealing racial or ethnic origin, and data concerning sex life or sexual orientation.

IoT devices can generate health data — or data from which health information can be inferred — without ever explicitly being health devices. This is one of the most legally unsettled areas of IoT privacy.

A smart scale that measures weight is arguably not processing health data in the Article 9 sense — weight is not inherently a health indicator. But a smart scale that tracks weight trends over time, calculates BMI, and correlates readings with activity data from a paired fitness tracker starts to cross into health data territory. A fitness tracker that detects irregular heart rhythms is plainly processing health data.

Home occupancy data from smart plugs or smart meters may reveal health information indirectly — irregular patterns might indicate illness, disability, or care needs. Supervisory authorities have not definitively ruled on when occupancy data becomes health data, but the risk is real.

For manufacturers of devices that collect or can infer health data:

  • Conduct a DPIA before launch — Article 35 makes this mandatory for large-scale processing of special category data.
  • Identify the correct legal basis under Article 9(2) — explicit consent is the most common, but explicit consent has a higher bar than standard consent.
  • Implement data minimisation at the device level — if you do not need raw sensor data to deliver the feature, do not send it to the cloud.
  • Restrict access to health data within your organisation — Article 9 processing should be limited to those who need it to deliver the service.

Children's IoT Devices

Smart toys, educational tablets, children's fitness trackers, and connected learning devices occupy a particularly sensitive regulatory position.

GDPR Article 8 sets the age of digital consent at 16 (member states can lower this to 13), meaning that consent from a child below this threshold requires parental authorisation. But beyond consent mechanics, the broader principle is that children deserve enhanced protection — the ICO's Age Appropriate Design Code (Children's Code) in the UK, and equivalent frameworks emerging across the EU, place additional obligations on services directed at children.

For children's IoT manufacturers:

Verify age and parental consent. Simply asking users to confirm they are over 16 (or whatever the local threshold is) is not sufficient for services clearly directed at children. Robust age verification and parental consent mechanisms are expected.

Data minimisation is not optional. Collect the minimum data necessary to deliver the product's core features. Do not build analytics or profiling capabilities based on children's data.

Default to privacy-protective settings. All data processing beyond the core product function should be off by default for devices used by children.

No behavioural advertising. Do not use children's data for advertising profiling — this is explicitly problematic under the Children's Code and increasingly under GDPR interpretation.

Location data requires particular care. Smart toys and children's devices that track location create obvious safeguarding risks if that location data is compromised. Apply strict security controls and avoid retaining precise location histories beyond what is technically necessary.


Data Minimisation in Device Design

Privacy by design, required under GDPR Article 25, means integrating privacy protections into the technical design of a product — not bolting them on afterward.

For IoT manufacturers, this translates into concrete hardware and firmware decisions:

Can processing happen on-device? Voice recognition, activity classification, and many sensor processing tasks can be performed locally, on the device itself, without sending raw data to the cloud. Edge processing is both a performance advantage and a privacy advantage — data that never leaves the device cannot be breached in transit.

Can you collect less? A fitness tracker that needs to detect step counts does not need to retain raw accelerometer data — it can calculate steps locally and discard the underlying data. A smart meter that needs to enable time-of-use billing does not need to retain minute-by-minute consumption data forever.

Can you aggregate before transmitting? Instead of streaming raw sensor data to a cloud backend, aggregate into hourly or daily summaries. This delivers the user value while significantly reducing the personal data footprint.
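As a sketch of that aggregation step — assuming, hypothetically, a smart plug that samples power draw throughout the hour — raw readings can be collapsed into per-hour summaries on-device, so the fine-grained occupancy signal never leaves the hardware:

```python
from collections import defaultdict
from datetime import datetime
from statistics import mean
from typing import Dict, List, Tuple

def hourly_summaries(
    readings: List[Tuple[datetime, float]]
) -> Dict[datetime, dict]:
    """Collapse raw sensor readings (timestamp, value) into hourly
    aggregates suitable for transmission to the backend."""
    buckets: Dict[datetime, List[float]] = defaultdict(list)
    for ts, value in readings:
        # Truncate each timestamp to the top of its hour.
        hour = ts.replace(minute=0, second=0, microsecond=0)
        buckets[hour].append(value)
    return {
        hour: {"mean": mean(vals), "max": max(vals), "samples": len(vals)}
        for hour, vals in buckets.items()
    }
```

The backend still gets everything it needs for billing or energy dashboards, but the minute-by-minute pattern that reveals when someone is home is discarded at the source.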

Can you design out certain data categories? If your device could theoretically collect voice data but your product does not require voice functionality, do not include a microphone. This "privacy by hardware design" approach is increasingly advocated by data protection authorities.


Data Sharing with Third-Party Integrated Services

IoT devices rarely operate in isolation. Most companion apps integrate with cloud services — IFTTT, Amazon Alexa, Google Home, Apple HomeKit, Spotify, weather services, delivery notifications. Each integration is a potential data sharing relationship that requires GDPR compliance.

Processor vs. controller: Where your device sends data to a third-party service that processes it on your behalf (e.g., a cloud analytics provider), that third party is a data processor and requires a Data Processing Agreement under Article 28. Where the third party determines its own purposes for the data (e.g., an advertising platform), it becomes an independent controller — and you need to disclose this to users.

International transfers: Many IoT cloud services involve transferring data to the US or other third countries. Following Schrems II and the adoption of the EU-US Data Privacy Framework, this requires either an adequacy decision, Standard Contractual Clauses, or another valid transfer mechanism. For every third-party integration, manufacturers need to verify the transfer basis.

Disclosure in privacy notices: Every third party that receives data from your device — and the nature of that sharing — must be disclosed to users. Generic "we may share with partners" language does not satisfy GDPR's transparency requirements.


Security Requirements: Encryption, Secure Updates, and Beyond

GDPR Article 32 requires data controllers to implement appropriate technical and organisational measures to ensure security appropriate to the risk. For IoT devices, this has specific implications.

Encryption at rest: Personal data stored on the device or in the cloud backend should be encrypted. For devices that store data locally (smart cameras, wearables with onboard storage), this means hardware-level encryption.

Encryption in transit: All communication between the device, the companion app, and the cloud backend should use TLS 1.2 or higher. Unencrypted IoT communications are a well-documented attack vector.
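In Python-based device or gateway firmware, enforcing that floor is a few lines with the standard library's `ssl` module — a minimal sketch, assuming the cloud endpoint presents a certificate chaining to the system trust store:

```python
import socket
import ssl

def make_device_tls_context() -> ssl.SSLContext:
    """TLS context for device-to-cloud traffic: certificate verification
    and hostname checking on, with a hard TLS 1.2 minimum."""
    ctx = ssl.create_default_context()  # CERT_REQUIRED + check_hostname by default
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

def open_backend_connection(host: str, port: int = 443) -> ssl.SSLSocket:
    """Connect to the cloud backend over the hardened context; the
    handshake fails if the server offers anything below TLS 1.2."""
    raw = socket.create_connection((host, port), timeout=10)
    return make_device_tls_context().wrap_socket(raw, server_hostname=host)
```

The important detail is that the minimum version is set explicitly rather than relying on library defaults, which vary across runtime versions shipped on embedded platforms.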

Secure boot and firmware integrity: Devices should verify the integrity of firmware before executing it. Compromised firmware that exfiltrates user data is a GDPR breach — and a product liability issue.

Secure update mechanisms: Firmware updates should be cryptographically signed. Update channels should be protected against man-in-the-middle attacks. The update infrastructure itself needs security hardening.
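The shape of the pre-flash check is worth making concrete. Production devices verify an asymmetric signature (e.g. Ed25519 or RSA) against a public key baked into the bootloader; the stdlib-only sketch below stands in with an HMAC tag over the image using a device-provisioned key, purely to show where the verification sits in the update flow:

```python
import hashlib
import hmac

def firmware_update_valid(image: bytes, tag: bytes, device_key: bytes) -> bool:
    """Verify a downloaded firmware image before flashing it.
    Simplified: a real bootloader checks an asymmetric signature, not
    a symmetric HMAC, so the signing key never lives on the device."""
    expected = hmac.new(device_key, image, hashlib.sha256).digest()
    # Constant-time comparison so the check itself leaks no timing info.
    return hmac.compare_digest(expected, tag)
```

The invariant either way: an image that fails verification is never executed, and the device falls back to its current firmware.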

Access controls: Cloud backends processing personal data from IoT devices should implement role-based access controls, multi-factor authentication for privileged access, and audit logging.

Data breach response: Manufacturers need incident response procedures that can detect a device-level or backend compromise, notify users and supervisory authorities within the GDPR's 72-hour window, and isolate affected devices if possible.


The EU Cyber Resilience Act and Its Intersection with GDPR

The EU Cyber Resilience Act (CRA), which entered into force in late 2024 with compliance obligations phasing in through 2027, adds a layer of security regulation that intersects significantly with GDPR for IoT manufacturers.

The CRA imposes mandatory cybersecurity requirements on products with digital elements — which covers virtually every connected device. Key requirements include:

  • Products must be designed and manufactured with security by default.
  • Manufacturers must provide security updates for the expected product lifetime (minimum five years for most consumer IoT).
  • Known vulnerabilities must be addressed and updates made available without delay.
  • Manufacturers must report actively exploited vulnerabilities to ENISA within 24 hours.
  • A Software Bill of Materials (SBOM) must be maintained for products.

The overlap with GDPR is significant. A device that fails CRA security requirements — say, one that lacks encrypted storage or secure update mechanisms — is also likely failing GDPR Article 32. A vulnerability that is exploited and leads to personal data exfiltration triggers both CRA notification requirements (to ENISA) and GDPR breach notification requirements (to supervisory authorities and potentially to data subjects).
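Because one incident can start two statutory clocks, incident-response runbooks need to track both deadlines from the moment of awareness. A minimal sketch of that bookkeeping (the function and field names are invented for illustration):

```python
from datetime import datetime, timedelta, timezone

# Statutory windows: CRA early warning to ENISA within 24 hours of becoming
# aware of an actively exploited vulnerability; GDPR notification to the
# supervisory authority within 72 hours of becoming aware of the breach.
CRA_ENISA_WINDOW = timedelta(hours=24)
GDPR_SA_WINDOW = timedelta(hours=72)

def notification_deadlines(detected_at: datetime) -> dict:
    """Hard deadlines to surface in the incident tracker when a single
    exploited vulnerability triggers both regimes."""
    return {
        "cra_enisa_early_warning": detected_at + CRA_ENISA_WINDOW,
        "gdpr_supervisory_authority": detected_at + GDPR_SA_WINDOW,
    }
```

Timestamps should be recorded in UTC at detection time; computing the deadlines afterward from logs invites disputes about when the clock actually started.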

For IoT manufacturers, the practical consequence is that security compliance needs to be planned together — CRA and GDPR have complementary requirements that can be addressed through a unified technical programme rather than two separate compliance exercises.


A Practical Checklist for IoT Manufacturers

Before launch:

  • Complete a DPIA for all personal data processing (mandatory for special category data and large-scale processing)
  • Document every data flow: device to app, app to cloud, cloud to third parties
  • Establish controller/processor roles for all third-party relationships
  • Implement Data Processing Agreements with all processors
  • Design privacy notice to be accessible in the companion app — not just on a website
  • Implement privacy by design: edge processing where possible, data minimisation at source, encryption at rest and in transit
  • Age-gate consent properly for products that may be used by children

Ongoing obligations:

  • Maintain a Record of Processing Activities (Article 30)
  • Version privacy notices with each processing change
  • Obtain fresh consent or establish new legal basis before expanding data use
  • Respond to access, deletion, and portability requests within one month
  • Maintain 72-hour breach notification capability
  • Ship security updates for the product's supported lifetime

Third-party integrations:

  • Audit all third-party services for their role (processor or independent controller)
  • Verify transfer mechanisms for cross-border data flows
  • Disclose all third-party sharing in the privacy notice

Where to Start

If you are building or operating connected products and want to understand your current privacy exposure, the most useful first step is often a scan of your companion app's website and backend for visible compliance gaps.

Run a free scan at app.custodia-privacy.com/scan — it takes 60 seconds, requires no signup, and identifies trackers, consent issues, and privacy policy gaps on your public-facing web presence. It will not scan your device firmware or cloud backend, but it will give you a concrete starting point for the compliance work ahead.

The device-level and backend compliance work — DPIAs, data flow mapping, security architecture — requires deeper engagement. But understanding where your public-facing presence stands is a useful and quick first step.


Last updated: March 27, 2026. This post provides general information about GDPR obligations for IoT and connected product manufacturers. It does not constitute legal advice. Regulatory requirements vary by jurisdiction, product category, and specific data processing activities — consult a qualified privacy and data protection professional for advice tailored to your products and circumstances.
