EU Data Act 2026: The 'Data by Design' Obligations Every SaaS and IoT Developer Must Know
Post #667 in the sota.io EU Compliance Series
The EU Data Act (Regulation (EU) 2023/2854) has been legally binding since September 12, 2025. Most legal teams know the headline — users have a right to access their data from connected products. Fewer engineering teams have translated that into actual product and infrastructure requirements.
In 2026, that gap is closing. Germany's Bundesnetzagentur and France's CNIL have both signalled that Data Act compliance is a 2026 enforcement priority. If your product touches connected hardware, machine-generated data, or cloud service switching, enforcement actions are no longer hypothetical.
This guide covers what the Data Act actually requires developers to build, the technical implementation pattern, and why the infrastructure your data pipelines run on matters for compliance.
What the EU Data Act Is (and Is Not)
Before diving into Articles 3-6, a framing that prevents the most common misreading:
The EU Data Act is not GDPR. GDPR governs personal data. The Data Act governs data generated by connected products and related services — including machine data, sensor data, and operational data — whether or not it is personal. The two regimes overlap in the real world (most IoT data has a personal dimension) but they have different legal bases, different obligations, and different enforcement bodies.
The EU Data Act is not the Data Governance Act. The Data Governance Act (Regulation 2022/868) governs data intermediaries and public sector data reuse. The Data Act governs private sector data sharing between businesses and users.
The Data Act's Chapter II (Articles 3-6) is the core obligation for product and SaaS developers: mandatory data access by design for connected products and related services.
The September 12, 2025 Application Date
The Data Act was published in the Official Journal on December 22, 2023. It entered into force on January 11, 2024 (the twentieth day after publication) and, per Article 50, applies from September 12, 2025.
If your product:
- Is a connected product (hardware with a digital component that generates data)
- Or a related service (software that processes data generated by such hardware)
- Or a cloud service subject to the switching and interoperability obligations (Chapter VI)
...then these obligations applied to you from September 2025. Not September 2026. The enforcement ramp is what accelerates in 2026.
The distinction matters for two reasons. First, you are already in scope. Second, when an enforcement action starts, the relevant period for infringement assessment begins from September 12, 2025, not from when authorities publicly announce their focus.
Article 3: Data by Design — The Core Obligation
Article 3 establishes the foundational requirement: connected products must be designed to make data generated by their use accessible by default.
In substance, Art. 3(1) requires that:
Data holders make available to users any data generated by the use of a connected product or a related service: promptly, easily, securely, free of charge and, where applicable, continuously and in real-time.
Four dimensions to decompose:
"Promptly, easily, securely" — this is a design constraint, not just a legal requirement. It means the accessibility mechanism must be built in from the beginning. Bolting on a data export button as an afterthought fails the "easily" test. The obligation is comparable to GDPR's privacy-by-design (Art. 25) but applied to data accessibility.
"Free of charge" — the user's right to their own data cannot be monetised. You cannot charge a user to access the data their device generates. You can charge for value-added processing of that data, but the raw access right is free.
"Continuously and in real-time" — for products where this is technically feasible (most networked IoT), the data must be available not just in batch exports but via continuous real-time access. A thermostat that logs temperature every 60 seconds must make that stream available to the user, not just periodic snapshots.
"Any data generated" — this is broad. It covers operational data, sensor data, usage logs, and metadata generated during product use. It does not cover data the manufacturer independently creates about the product (e.g., internal diagnostics the user does not generate through use).
Article 4: The User's Right to Share Data with Third Parties
Article 4 extends the Article 3 right: users can instruct the data holder to share their data with a third party of the user's choice.
This creates a portability obligation with third-party forwarding. Unlike GDPR Article 20 (data portability), which allows users to receive a copy, Data Act Article 4 requires the data holder to forward data directly to the third-party recipient in a machine-readable, structured format.
The practical implication: your connected product or SaaS must implement not just a user-facing data download, but a transmission pipeline from data holder to third party. The user authorises, and your system forwards data to an API endpoint or service the user designates.
```python
# Article 4 data forwarding implementation pattern
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

import httpx


@dataclass
class DataForwardingAuthorisation:
    user_id: str
    third_party_endpoint: str
    data_categories: list[str]
    authorised_at: datetime
    expires_at: Optional[datetime]
    revoked: bool = False


class DataActForwarder:
    """
    Implements EU Data Act Art. 4 data forwarding obligations.

    Users can instruct data holders to forward their data to
    third parties. This class handles the forwarding pipeline.
    """

    def __init__(self, db_client, audit_log):
        self.db = db_client
        self.audit = audit_log

    async def create_forwarding_authorisation(
        self,
        user_id: str,
        third_party_endpoint: str,
        data_categories: list[str],
        expires_days: Optional[int] = None,
    ) -> DataForwardingAuthorisation:
        expires_at = None
        if expires_days:
            expires_at = datetime.now(timezone.utc) + timedelta(days=expires_days)
        auth = DataForwardingAuthorisation(
            user_id=user_id,
            third_party_endpoint=third_party_endpoint,
            data_categories=data_categories,
            authorised_at=datetime.now(timezone.utc),
            expires_at=expires_at,
        )
        await self.db.save_authorisation(auth)
        await self.audit.log("data_act_art4_authorisation_created", {
            "user_id": user_id,
            "endpoint": third_party_endpoint,
            "categories": data_categories,
        })
        return auth

    async def forward_data_batch(
        self,
        authorisation: DataForwardingAuthorisation,
        data_batch: list[dict],
    ) -> bool:
        if authorisation.revoked:
            return False
        if authorisation.expires_at and datetime.now(timezone.utc) > authorisation.expires_at:
            return False
        payload = {
            "data_act_version": "2023/2854",
            "article": "4",
            "user_reference": authorisation.user_id,
            "forwarded_at": datetime.now(timezone.utc).isoformat(),
            "data_categories": authorisation.data_categories,
            "records": data_batch,
        }
        async with httpx.AsyncClient(timeout=30.0) as client:
            response = await client.post(
                authorisation.third_party_endpoint,
                json=payload,
                headers={"Content-Type": "application/json"},
            )
        success = response.status_code in (200, 201, 202)
        await self.audit.log("data_act_art4_forwarding_attempt", {
            "user_id": authorisation.user_id,
            "endpoint": authorisation.third_party_endpoint,
            "record_count": len(data_batch),
            "success": success,
        })
        return success

    async def revoke_authorisation(self, user_id: str, authorisation_id: str) -> None:
        await self.db.revoke_authorisation(authorisation_id)
        await self.audit.log("data_act_art4_authorisation_revoked", {
            "user_id": user_id,
            "authorisation_id": authorisation_id,
        })
```
The third party receiving the forwarded data becomes a data recipient under the Data Act and takes on its own obligations under Article 6 (use restrictions).
Article 5: Contractual Data Sharing with Third Parties
Article 5 governs the contractual framework for B2B data sharing. If a business user (not a consumer) wants to access data generated by a connected product they own or operate, the data holder must provide access on fair, reasonable, and non-discriminatory (FRAND) terms.
The Article 5 obligation differs from Articles 3-4 in one critical way: it is subject to trade secret protection. Data holders can refuse access to data that would constitute trade secret disclosure — but they must demonstrate this, and the refusal must be proportionate.
For SaaS developers, Article 5 creates a contract template requirement: if enterprise customers operate your product and generate data through it, you need a data access agreement that meets the FRAND standard. Your standard SaaS terms likely do not cover this adequately.
Minimum Article 5 contract elements:
- Defined data categories accessible to the business user
- Access method (API, export format, streaming endpoint)
- Service level for data delivery
- Permitted use cases for the accessed data
- No exclusivity on the data (Art. 5(7))
- No barriers to switching (connection to Chapter VI interoperability)
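These contract elements can also be captured as a machine-readable descriptor, so that legal review and the engineering implementation work from the same artefact. The sketch below is illustrative only: the `DataAccessAgreement` class, its field names, and the `frand_violations` check are our own assumptions, not structures mandated by the Regulation.

```python
# Sketch: the Article 5 minimum contract elements as a machine-readable
# descriptor. Field names and the self-check are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class DataAccessAgreement:
    business_user_id: str
    data_categories: list[str]           # defined data categories
    access_method: str                   # "api" | "export" | "streaming"
    delivery_sla_hours: int              # service level for data delivery
    permitted_uses: list[str]            # permitted use cases
    exclusive: bool = False              # Art. 5(7): must remain False
    switching_barriers: list[str] = field(default_factory=list)  # must stay empty

    def frand_violations(self) -> list[str]:
        """Flag terms that would fail the FRAND / Art. 5 minimums listed above."""
        issues = []
        if self.exclusive:
            issues.append("exclusivity granted on user-generated data")
        if self.switching_barriers:
            issues.append("contract imposes switching barriers")
        if not self.data_categories:
            issues.append("no data categories defined")
        return issues
```

A descriptor like this can be validated in CI alongside the contract template, so a term that drifts out of FRAND compliance is caught before it reaches a customer.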
Article 6: Use Restriction for Third Parties
Article 6 closes the loop on what third parties receiving data via Article 4 can do with it. The restrictions are strict:
- Data can only be used for the purpose for which access was provided
- No profiling of natural persons beyond what is strictly necessary for the agreed purpose
- No selling or monetising the data
- No combining with data from other sources except as expressly permitted
Article 6 also contains a notable prohibition: third parties receiving data via Article 4 cannot use it to develop a competing product. If a user authorises their smart home hub manufacturer to share usage data with a third-party integration platform, that platform cannot use the data to build a competing hub.
This creates a data use governance requirement. If you are a third party receiving Data Act Article 4 forwarded data, your data engineering pipelines need technical controls preventing prohibited uses — not just policy controls.
```python
# Article 6 use restriction enforcement pattern
import functools
from enum import Enum
from typing import Callable


class DataActPurpose(Enum):
    USER_SERVICE = "user_service"
    INTEGRATION = "integration"
    ANALYTICS_AGGREGATED = "analytics_aggregated"
    # Prohibited purposes, kept for documentation/audit trail only
    COMPETING_PRODUCT = "competing_product"  # Art. 6(2)(e) prohibition
    PROFILING = "profiling"                  # Art. 6(2)(b) restriction
    RESALE = "resale"                        # Art. 6(2)(c) prohibition


PERMITTED_PURPOSES = {
    DataActPurpose.USER_SERVICE,
    DataActPurpose.INTEGRATION,
    DataActPurpose.ANALYTICS_AGGREGATED,
}


def data_act_purpose_check(purpose: DataActPurpose):
    """Decorator to enforce Article 6 use restrictions at the function level."""
    def decorator(func: Callable):
        @functools.wraps(func)
        async def wrapper(*args, **kwargs):
            if purpose not in PERMITTED_PURPOSES:
                raise PermissionError(
                    f"EU Data Act Art. 6: Purpose '{purpose.value}' is prohibited "
                    f"for data received via Article 4 forwarding."
                )
            return await func(*args, **kwargs)
        return wrapper
    return decorator


@data_act_purpose_check(DataActPurpose.USER_SERVICE)
async def process_forwarded_sensor_data(user_id: str, sensor_records: list[dict]):
    # Permitted: processing for direct user service delivery
    pass


@data_act_purpose_check(DataActPurpose.RESALE)  # Raises PermissionError when called
async def export_data_to_broker(data: list[dict]):
    # This function can never execute with Article 4 data
    pass
```
The Data by Design Principle: What It Means to Build
"Data by Design" as a concept parallels GDPR's Privacy by Design (Art. 25). Where Privacy by Design means building privacy protections into your product architecture from day one, Data by Design means building data accessibility into your product architecture from day one.
The failure mode for both is the same: treating the requirement as a compliance checkbox added late in development. An export button that generates a ZIP file is not Data by Design. A continuous API stream with granular access controls, user-managed authorisations, and audit logging is.
The four architectural requirements for Data by Design compliance:
1. Real-time Data Access API
For connected products generating continuous data (sensors, meters, trackers, monitoring devices), you need an API that serves that data stream to authorised users in real-time, not just batch exports.
```python
# Simplified real-time data access endpoint (FastAPI pattern)
import json

from fastapi import Depends, FastAPI, HTTPException
from fastapi.responses import StreamingResponse

app = FastAPI()


async def get_current_user() -> str:
    """Placeholder: resolve the authenticated user (OAuth2/JWT in practice)."""
    return "user-123"


async def verify_data_access_right(user_id: str, device_id: str) -> bool:
    """Verify the user owns or is the user of this connected product."""
    # Check device ownership, B2B data access authorisation, etc.
    return True  # Placeholder


async def device_data_stream(device_id: str):
    """Placeholder: yield live records from the device's data pipeline."""
    if False:  # replace with a real subscription (e.g. MQTT, Kafka)
        yield None


@app.get("/data-act/v1/device/{device_id}/stream")
async def stream_device_data(device_id: str, user_id: str = Depends(get_current_user)):
    """
    Art. 3(1): Continuous real-time data access for users.
    Data must be provided 'continuously and in real-time' where feasible.
    """
    if not await verify_data_access_right(user_id, device_id):
        raise HTTPException(403, "No data access right for this device")

    async def data_generator():
        async for record in device_data_stream(device_id):
            yield json.dumps({
                "device_id": device_id,
                "timestamp": record.ts.isoformat(),
                "data": record.payload,
                "data_act_basis": "art_3_user_right",
            }) + "\n"

    return StreamingResponse(data_generator(), media_type="application/x-ndjson")
```
2. Third-Party Forwarding Pipeline
Article 4 requires not just access but forwarding. Your product needs an authorisation management system where users can create, manage, and revoke permissions for third-party access to their data.
The user journey:
- User navigates to data sharing settings in your product
- User authorises a third-party service by providing its endpoint
- Your system validates the endpoint and creates a forwarding authorisation
- Data flows to the third party automatically per the authorisation
- User can revoke at any time, and forwarding stops immediately
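The validation step in this journey deserves a technical control of its own: a user-supplied endpoint is attacker-controllable input. The sketch below is our own suggestion, not something the Regulation prescribes; the function name and checks are illustrative. It rejects non-HTTPS endpoints (which would sit uneasily with the "securely" requirement) and applies a crude guard against forwarding into private address space, a classic SSRF vector.

```python
# Sketch: basic validation of a user-supplied Article 4 forwarding endpoint.
from urllib.parse import urlparse


def validate_forwarding_endpoint(endpoint: str) -> list[str]:
    """Return a list of problems with a forwarding endpoint; empty means OK."""
    problems = []
    parsed = urlparse(endpoint)
    host = parsed.hostname or ""
    if parsed.scheme != "https":
        problems.append("endpoint must use HTTPS ('securely', Art. 3(1))")
    if not parsed.netloc:
        problems.append("endpoint has no host")
    # Crude private-address guard against SSRF; a real implementation would
    # resolve the host and check the resulting IPs (including 172.16.0.0/12).
    if host in ("localhost", "127.0.0.1") or host.startswith(("10.", "192.168.")):
        problems.append("endpoint points at a private address (SSRF guard)")
    return problems
```

Running this check before creating the forwarding authorisation keeps obviously unsafe destinations out of the pipeline; a production system would add DNS resolution checks and an outbound allow-list.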
3. Data Format Standardisation
Article 3(1) requires data in "commonly used and machine-readable format." The Data Act does not mandate specific formats but the recitals point toward existing standards where applicable. For IoT data:
- JSON-LD with semantic context where possible
- CSV for tabular sensor data
- Protocol Buffers or CBOR for high-throughput streaming
- NGSI-LD (from ETSI) is emerging as a reference standard for smart device data
Your format choice needs to be documented and justified. "Proprietary binary format" will not survive enforcement scrutiny.
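As a concrete example of a documented, machine-readable format choice, here is a minimal NDJSON record with a lightweight JSON-LD context. The field names and the context mapping are illustrative assumptions, not a mandated schema.

```python
# Sketch: one NDJSON export record with a minimal JSON-LD context.
# Field names and the @context mapping are illustrative, not prescribed.
import json
from datetime import datetime, timezone


def export_record(device_id: str, metric: str, value: float, unit: str) -> str:
    """Serialise one sensor observation as a single NDJSON line."""
    record = {
        "@context": {
            "value": "https://schema.org/value",
            "unitCode": "https://schema.org/unitCode",
        },
        "device_id": device_id,
        "metric": metric,
        "value": value,
        "unitCode": unit,                    # e.g. UN/CEFACT "CEL" for Celsius
        "observed_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(record, sort_keys=True)
```

One record per line keeps the format streamable (the same shape works for the real-time endpoint above) while the context block gives recipients semantic hooks without forcing a full ontology.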
4. Audit Logging for Data Access Events
Every data access event — user accessing their data, third-party forwarding triggered, revocation processed — needs to be logged for enforcement accountability. This is not explicitly stated in Articles 3-6 but is implied by the FRAND and proportionality requirements, and by enforcement practicalities. When an authority asks "did you provide data access when requested?", you need audit records.
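One way to make those audit records defensible is an append-only log with hash chaining, so after-the-fact tampering is detectable. Below is a minimal in-memory sketch under our own assumptions (class and method names are illustrative; a production system would persist entries durably and anchor the chain externally).

```python
# Sketch: append-only, hash-chained audit log for data access events.
import hashlib
import json
from datetime import datetime, timezone


class DataAccessAuditLog:
    def __init__(self):
        self._entries: list[dict] = []
        self._last_hash = "0" * 64  # genesis value for the chain

    def record(self, event: str, detail: dict) -> dict:
        """Append one event, chaining it to the previous entry's hash."""
        entry = {
            "event": event,
            "detail": detail,
            "at": datetime.now(timezone.utc).isoformat(),
            "prev": self._last_hash,
        }
        self._last_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        entry["hash"] = self._last_hash
        self._entries.append(entry)
        return entry

    def verify_chain(self) -> bool:
        """Recompute every hash; any edited entry breaks the chain."""
        prev = "0" * 64
        for e in self._entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if body["prev"] != prev:
                return False
            prev = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if prev != e["hash"]:
                return False
        return True
```

When an authority asks whether access was provided, `verify_chain` gives you not just records but evidence the records were not rewritten afterwards.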
Chapter VI: Cloud Service Switching Obligations
Beyond Chapter II, Chapter VI (Articles 23-31) imposes switching and interoperability obligations on cloud service providers. This is where SaaS providers face a separate but related set of obligations.
The core obligation: cloud service providers must not create technical or contractual lock-in that prevents customers from switching to a competing provider. By January 12, 2027, this includes a prohibition on switching fees. But the technical interoperability requirements apply now.
Article 23 requires cloud service providers to take the technical and organisational measures necessary to enable customers to switch to a competing provider without undue friction.
Article 26 requires providers to make their service descriptions, data schemas, and interface specifications publicly available for interoperability purposes.
What this means in practice:
- If you run a managed SaaS with significant customer data, you need documented data export mechanisms that produce interoperable formats
- API compatibility documentation must be publicly available
- Switching workflows must be tested and not involve disproportionate technical friction
- You cannot design your data schemas to be intentionally non-interoperable
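One lightweight way to evidence these points is a published export manifest plus a self-check that the required fields are present. The manifest structure, field names, and paths below are hypothetical, not prescribed by Article 26.

```python
# Sketch: a publicly documented export manifest for a managed SaaS.
# Structure, keys, and paths are illustrative assumptions.
EXPORT_MANIFEST = {
    "service": "example-saas",
    "export_formats": ["ndjson", "csv", "parquet"],
    "schema_url": "/docs/schemas/v1",       # hypothetical path
    "api_spec_url": "/docs/openapi.json",   # hypothetical path
    "full_export_endpoint": "/v1/export",
    "max_export_latency_hours": 72,
}


def manifest_gaps(manifest: dict) -> list[str]:
    """Return required manifest keys that are missing or empty."""
    required = ["export_formats", "schema_url",
                "api_spec_url", "full_export_endpoint"]
    return [k for k in required if not manifest.get(k)]
```

Checking the manifest in CI means a switching-relevant gap (a removed export format, an undocumented schema) fails the build rather than surfacing during an enforcement inquiry.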
The GDPR Intersection
Connected product data is frequently personal data. When the EU Data Act and GDPR both apply, the interaction creates a compliance architecture challenge.
Key interaction points:
Data minimisation vs. data accessibility: GDPR Art. 5(1)(c) requires processing the minimum necessary personal data. Data Act Art. 3 requires making all generated data accessible. Where generated data is personal, you need to minimise what you collect (GDPR) while making accessible everything you do collect (Data Act). These are not contradictory — but they require careful data architecture.
Purpose limitation vs. third-party forwarding: GDPR Art. 5(1)(b) limits processing to specified purposes. Data Act Art. 4 creates a user-instructed forwarding mechanism. The legal basis for the forwarding is the user's explicit instruction — but the receiving third party takes on GDPR data controller responsibilities for any personal data in the forwarded set.
Data subject rights under GDPR vs. user rights under Data Act: Both regimes give users rights over data about them. The mechanisms differ. GDPR access (Art. 15) is an information right. Data Act access (Art. 3) is a data stream right. You need both — they cannot be collapsed into a single implementation.
Practical checklist for the intersection:
- Document which data fields generated by your product contain personal data
- Maintain separate legal bases (GDPR Art. 6/9 for processing, Data Act user instruction for forwarding)
- Implement Data Act forwarding with a GDPR transfer impact assessment for the recipient
- Ensure Article 6 Data Act use restrictions are contractually bound to any third-party recipients receiving personal data (they also need a GDPR data processing agreement)
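The first item on that checklist, documenting which generated fields contain personal data, can be kept as a simple field register that both the Art. 3 access API and the Art. 4 forwarding pipeline consult. The field names and legal-basis labels below are illustrative assumptions for a smart-building product.

```python
# Sketch: per-field register mapping generated data to personal-data status
# and GDPR legal basis. Names and bases are illustrative assumptions.
FIELD_REGISTER = {
    "temperature_c": {"personal": False, "gdpr_basis": None},
    "occupancy_detected": {"personal": True, "gdpr_basis": "art6_1_b_contract"},
    "device_location": {"personal": True, "gdpr_basis": "art6_1_b_contract"},
}


def personal_fields(register: dict) -> list[str]:
    """Fields that need GDPR treatment when accessed or forwarded."""
    return sorted(k for k, v in register.items() if v["personal"])


def fields_missing_basis(register: dict) -> list[str]:
    """Personal-data fields with no documented legal basis: a compliance gap."""
    return sorted(k for k, v in register.items()
                  if v["personal"] and not v["gdpr_basis"])
```

A register like this lets the forwarding pipeline decide per field whether a GDPR transfer assessment and DPA are needed for a given third-party recipient, instead of treating every export as all-or-nothing.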
EU Enforcement in 2026: Germany and France
Both Germany's Bundesnetzagentur (Federal Network Agency) and France's CNIL have the Data Act within the scope of their enforcement mandates.
Germany's Bundesnetzagentur has enforcement authority over the B2B data sharing provisions and has signalled that formal investigations will begin in 2026. The German digital sovereignty agenda has elevated the Data Act to a political priority.
France's CNIL has jurisdiction over the personal data aspects of Data Act compliance. Given France's track record on tech enforcement (Google fines, Meta fines, and early DSA enforcement actions), CNIL is a likely source of early Data Act enforcement activity.
For companies with German or French user bases operating IoT or SaaS products, the risk profile has materially increased in 2026 compared to the first months of application in 2025.
Penalty exposure: Article 40 provides for administrative penalties. Unlike the GDPR, which explicitly sets 2% / 4% of global turnover maxima, the Data Act leaves penalty levels to Member State law. German and French authorities can apply their general administrative penalty frameworks, which in practice means enforcement risk is jurisdiction-dependent but significant.
Why EU-Native Infrastructure Matters for Data Act Compliance
The Data Act creates a specific infrastructure risk that the CLOUD Act compounds.
When connected product data or machine data flows through infrastructure controlled by a US-parented company (AWS, Azure, GCP, their SaaS layers), that data is potentially subject to US government access requests under 18 U.S.C. §2703 (CLOUD Act). The Data Act's Article 4 third-party sharing creates an explicit data flow that can be intercepted at the infrastructure level.
More concretely: if you implement Article 4 forwarding on AWS infrastructure, the forwarding pipeline itself runs on servers subject to CLOUD Act jurisdiction. A US government subpoena to AWS could compel disclosure of data you are forwarding to a user's designated third party — data the user explicitly instructed you to send to a specific destination, not to US authorities.
EU-native infrastructure (running on operators without US parent companies, incorporated in EU jurisdiction) eliminates this exposure:
```
Data Act Art. 4 forwarding on US-parent infrastructure:

    User device → Your product (AWS) → Third-party endpoint
                        │
                        └── potential CLOUD Act intercept at the AWS layer

Data Act Art. 4 forwarding on EU-native infrastructure:

    User device → Your product (sota.io / EU-native) → Third-party endpoint

    No US jurisdiction over infrastructure
    No CLOUD Act exposure for the forwarding pipeline
```
The legal exposure is not theoretical. The Data Act creates explicit, logged data flows. Those flows, if running on US-parent infrastructure, are the clearest possible target for a CLOUD Act demand: a documented pipeline, known recipients, logged timestamps.
For connected product manufacturers and SaaS providers in regulated sectors (healthcare, energy, industrial), running Data Act compliance infrastructure on EU-native platforms is not just a best practice — it is a reasonable interpretation of the "appropriate technical and organisational measures" requirement under GDPR Art. 32, applied to the Data Act data flows.
Developer Compliance Checklist — EU Data Act Chapter II
Already legally required since September 12, 2025:
- Connected product generates data → User has real-time access API (Art. 3)
- User data access is free of charge, no paywalling (Art. 3(1))
- Data provided in machine-readable, commonly used format (Art. 3(1))
- Business users can request data access on FRAND terms (Art. 5)
- FRAND-compliant data access agreement template available (Art. 5)
- Third-party data forwarding pipeline implemented and documented (Art. 4)
- User authorisation and revocation workflow implemented (Art. 4)
- Art. 6 use restrictions contractually bound to third-party recipients
- Audit log for all data access and forwarding events
- Data Act compliance statement in your product documentation
Cloud Service Providers (Chapter VI):
- Customer data export mechanism documented and publicly described (Art. 26)
- API schema and interface specs publicly available (Art. 26)
- Switching workflow tested end-to-end
- No technical measures that disproportionately impede switching
- Switching fee phase-out plan (prohibited from January 12, 2027)
GDPR Intersection:
- Personal data fields in generated data documented
- GDPR Art. 6/9 legal bases mapped separately from Data Act user rights
- Transfer impact assessment for any cross-border Art. 4 forwarding
- Third-party recipients with personal data have GDPR DPA in place
- Data minimisation review: given that everything you collect becomes accessible under Art. 3, are you collecting more than you need?
See Also
- EU Data Act Chapter VI: Cloud Switching Rights and Zero-Egress Obligations — the companion guide covering Articles 23-31 interoperability and switching fee phase-out
- EU Data Act B2B Data Sharing, Smart Contracts, and AI Training Data — deeper coverage of Articles 4-6 B2B obligations, government access rights (Art.9-15), and AI training data intersection
- EU Region vs. EU Jurisdiction: Why Frankfurt Servers Don't Protect Against US Law — the CLOUD Act exposure explained for developers choosing infrastructure for Data Act compliance pipelines
- Europrivacy as a GDPR Article 46 Transfer Tool (EDPB 2026) — new EU transfer certification mechanism relevant where Data Act Art.4 forwarding involves cross-border personal data flows
- EU-US Data Privacy Framework: GDPR Chapter V Transfer Developer Guide — GDPR Chapter V transfers and DPF self-certification for the GDPR intersection with Data Act Article 4 forwarding
Summary
The EU Data Act has been in effect since September 12, 2025. Its Chapter II obligations — data accessibility by design, real-time user access, third-party forwarding — require architectural changes to connected products and SaaS platforms, not just legal documentation updates.
The enforcement ramp in Germany and France through 2026 converts a theoretical compliance risk into a practical one. The window for "we're working on it" has closed. The window for demonstrating technical compliance is open now.
The core implementation pattern: build a data access API (Art. 3), build a user-controlled third-party forwarding pipeline (Art. 4), bind use restrictions contractually (Art. 6), and document everything (enforcement readiness). Running that infrastructure on EU-native infrastructure without US parent exposure eliminates the CLOUD Act intersection risk on your Data Act compliance pipeline.
sota.io is an EU-incorporated managed PaaS — no US parent company, no CLOUD Act exposure. Deploy your Data Act compliance infrastructure where the Data Act itself applies. Start free →