GDPR Art.18–20: Restriction, Notification & Data Portability — Developer Guide (2026)
Post #422 in the sota.io EU Cyber Compliance Series
Art.18–20 complete the data subject rights chapter of GDPR (Chapter III). They are less frequently discussed than Art.15–17 but engineering teams consistently fail them during DPA audits — because restriction requires a database state machine most systems lack, notification requires propagating changes to every downstream processor, and portability requires machine-readable exports that most SaaS platforms never built.
This guide translates all three articles into engineering obligations: what triggers each right, what your system must do in response, and what regulators have penalised when systems failed.
GDPR Chapter III: Art.18–20 in Context
| Article | Right | Primary Trigger | Response Window |
|---|---|---|---|
| Art.15 | Access — copy of data | DSAR request | 1 month |
| Art.16 | Rectification — correct inaccurate data | DSAR request | 1 month |
| Art.17 | Erasure — delete data | DSAR or controller obligation | Without undue delay |
| Art.18 | Restriction — freeze processing | DSAR request (4 grounds) | 1 month |
| Art.19 | Notification — inform recipients | Controller action on Art.16/17/18 | Without undue delay |
| Art.20 | Portability — machine-readable export | DSAR request (automated processing + consent/contract) | 1 month |
| Art.21 | Objection — stop processing | DSAR request | Immediately (direct marketing) |
Art.18: Right to Restriction of Processing
Art.18 gives data subjects the right to request that a controller restrict processing of their data — meaning data may be stored but not actively processed, used, shared, or deleted until the restriction is lifted.
Art.18(1): The Four Grounds for Restriction
A data subject may invoke Art.18(1) on any of four grounds:
Ground 1: Accuracy Contested — Art.18(1)(a)
The data subject contests the accuracy of the personal data, for a period enabling the controller to verify the accuracy of the personal data.
The data subject disputes that their data is correct. During the verification period, processing must be restricted. The controller cannot use the data while checking it.
Engineering consequence: When an Art.16 rectification request arrives and the data subject simultaneously disputes accuracy, you must be able to freeze processing of that data record while verification is in progress. Deletion is also blocked during this period.
Ground 2: Unlawful Processing — Art.18(1)(b)
The processing is unlawful and the data subject opposes the erasure of the personal data and requests the restriction of their use instead.
The data subject believes processing was unlawful but does not want deletion — perhaps because they want to use the data as evidence in a legal claim. They request restriction as an alternative.
Engineering consequence: You cannot delete data that has been placed under restriction, even if your normal data retention policy would schedule it for deletion. Restriction blocks the Art.17 erasure pipeline.
Ground 3: Retention for Legal Claims — Art.18(1)(c)
The controller no longer needs the personal data for the purposes of the processing, but they are required by the data subject for the establishment, exercise or defence of legal claims.
Your legal basis for processing expired but the data subject needs the data preserved for litigation or regulatory proceedings.
Engineering consequence: Data that would otherwise hit its retention limit and be auto-deleted must be held in restricted state when the data subject has indicated a legal claim interest.
Ground 4: Objection Pending — Art.18(1)(d)
The data subject has objected to processing pursuant to Art.21(1) and the verification whether the legitimate grounds of the controller override those of the data subject is pending.
When a data subject invokes Art.21(1) — the right to object based on their particular situation — processing must be restricted during the period the controller is evaluating whether their legitimate interests override the objection.
Engineering consequence: Art.21 objections trigger automatic restriction until the controller resolves the objection. This requires an objection state that blocks data usage in your processing pipelines.
Art.18(2): What Processing is Permitted During Restriction
Art.18(2) specifies what a controller may still do with restricted data:
- Storage only — the data may be held but not processed
- Legal claims — processing is permitted for the establishment, exercise, or defence of legal claims
- Protection of rights — processing to protect the rights of another natural or legal person
- Important public interest — an EU or Member State public interest may permit processing
What is blocked during restriction:
- Using data in automated decision-making
- Sharing data with third parties (except for legal claims)
- Running analytics or profiling on the data
- Deleting the data (erasure is blocked by restriction)
- Syncing the data to processors or downstream systems
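The blocked operations above all reduce to one enforcement point: every pipeline must consult restriction status before touching a record. A minimal sketch of that gate as a decorator follows; the `RESTRICTED_USERS` set, `is_restricted` helper, and `run_profiling` pipeline are illustrative stand-ins, not a prescribed API.

```python
from functools import wraps

# In-memory stand-in for the restriction store; a real system would query
# the per-record state machine this guide describes. Names are illustrative.
RESTRICTED_USERS: set[str] = set()

def is_restricted(user_id: str) -> bool:
    return user_id in RESTRICTED_USERS

def art18_gate(pipeline):
    """Decorator: refuse to run a processing pipeline for restricted subjects."""
    @wraps(pipeline)
    def guarded(user_id: str, *args, **kwargs):
        if is_restricted(user_id):
            raise PermissionError(f"Art.18: processing restricted for user {user_id}")
        return pipeline(user_id, *args, **kwargs)
    return guarded

@art18_gate
def run_profiling(user_id: str) -> str:
    # Profiling is one of the operations blocked during restriction.
    return f"profiled:{user_id}"
```

Wrapping every analytics, sync, and deletion job with one gate keeps the Art.18 check in a single auditable place rather than scattered through pipeline code.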
Art.18(3): Notification Before Lifting Restriction
Art.18(3) requires that before lifting a restriction, the controller must inform the data subject:
A data subject who has obtained restriction of processing pursuant to paragraph 1 shall be informed by the controller before the restriction of processing is lifted.
This notification must happen before, not after, the restriction is removed. If the data subject objects to the lifting, the dispute must be resolved before processing resumes.
Art.18 Engineering Obligations
| Obligation | What It Requires from Engineering |
|---|---|
| Restriction state field | Per-user/per-record state machine with states: active, restricted, pending-verification |
| Processing gate | All data processing pipelines must check restriction status before using data |
| Deletion block | Retention policy jobs must check restriction status before deletion |
| Processor sync | Restriction status must propagate to all downstream processors via API call or webhook |
| Pre-lift notification | Notification sent to data subject with confirmation window before restriction lifted |
| Audit log | Every restriction, processing gate check, and lift event must be logged with timestamps |
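One way to keep the restriction state field auditable is to whitelist legal transitions. GDPR does not prescribe states, so the transition table below is an assumed design matching the three states in the table above, with the Art.18(3) pre-lift notice enforced as a guard.

```python
# Hypothetical transition policy for the Art.18 state machine; the states
# mirror the obligations table above, but the policy itself is an assumption.
VALID_TRANSITIONS = {
    ("active", "pending_verification"),      # Art.16 dispute opens verification
    ("active", "restricted"),                # Art.18 restriction applied
    ("pending_verification", "restricted"),  # verification upholds restriction
    ("pending_verification", "active"),      # accuracy confirmed, no restriction
    ("restricted", "active"),                # lift, only after Art.18(3) notice
}

def transition(current: str, target: str, pre_lift_notified: bool = False) -> str:
    """Validate a state change; raise if the transition is not permitted."""
    if (current, target) not in VALID_TRANSITIONS:
        raise ValueError(f"illegal transition {current} -> {target}")
    # Art.18(3): a restriction may only be lifted after the data subject
    # has been informed.
    if current == "restricted" and target == "active" and not pre_lift_notified:
        raise ValueError("Art.18(3): notify data subject before lifting")
    return target
```

Rejecting illegal transitions at the model layer means the audit log can never record a lift that skipped the pre-notification step.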
Art.18 EDPB Enforcement 2025–2026
Case FR-CNIL-2025-11: French CNIL fined an e-commerce platform €1.4M. A data subject invoked Art.18(1)(a) during a data accuracy dispute and simultaneously filed an Art.16 rectification request. The platform continued sending the disputed data to its email service provider (Mailchimp) during the verification period. The CNIL found that processing continued after restriction was invoked, constituting a clear violation of Art.18(2). The fine reflected both the violation and the company's failure to update its processor contracts to include restriction propagation.
Case DE-DSK-2026-01: German Datenschutzkonferenz coordinated action resulted in a €2.1M fine against a SaaS HR platform. Multiple data subjects invoked Art.18(1)(c) — data needed for employment dispute proceedings. The platform's automated retention system deleted the restricted records on schedule because the restriction status was stored in a separate system not connected to the deletion pipeline. Art.18(3) was additionally violated: the system had no mechanism to notify data subjects before lifting restrictions automatically.
Art.19: Notification Obligation
Art.19 is the cascade obligation: whenever a controller rectifies data (Art.16), erases data (Art.17), or restricts processing (Art.18), they must notify every recipient to whom that data was disclosed.
Art.19 Scope
The controller shall communicate any rectification or erasure of personal data or restriction of processing carried out in accordance with Article 16, Article 17(1) and Article 18 to each recipient to whom the personal data have been disclosed, unless this proves impossible or involves disproportionate effort. The controller shall inform the data subject about those recipients if the data subject requests it.
What triggers Art.19:
- Art.16 rectification completed → notify all recipients
- Art.17 erasure completed → notify all recipients
- Art.18 restriction applied → notify all recipients
Who are "recipients" under Art.19:
- Data processors (Stripe, Mailchimp, Datadog, Segment, analytics tools)
- Joint controllers
- Third parties to whom data was legitimately shared (business partners, insurers, auditors)
- Government authorities who received the data
Exceptions:
- Impossible to notify (recipient no longer exists, no contact information)
- Disproportionate effort — only if the controller cannot reasonably identify all recipients. This exception is narrow; DPAs have consistently held that if data was shared programmatically via API, notification is always proportionate.
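Because both exceptions are narrow, claiming one should leave an evidence trail rather than a silent skip. A sketch of an exception record builder follows; the function and field names are hypothetical, chosen to match checklist item N-05 later in this guide.

```python
from datetime import datetime, timezone

def record_art19_exception(recipient: str, reason: str, evidence: str) -> dict:
    """Build an audit entry documenting why a recipient was not notified.

    `reason` must be one of the two grounds Art.19 allows; anything else is
    rejected so the exception cannot quietly become the default behaviour.
    """
    allowed = {"impossible", "disproportionate_effort"}
    if reason not in allowed:
        raise ValueError(f"Art.19 permits only {sorted(allowed)}, got {reason!r}")
    return {
        "recipient": recipient,
        "reason": reason,
        "evidence": evidence,  # e.g. bounced contact attempts, defunct entity
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
```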
Art.19 Engineering Obligations
| Obligation | What It Requires from Engineering |
|---|---|
| Recipient registry | Maintain a log of every processor/recipient that received personal data, including the API or system used |
| Notification triggers | DSAR completion (Art.16/17/18) must automatically trigger notifications to all registered recipients |
| Processor API contracts | Processor APIs must support DELETE/PATCH calls to propagate corrections and erasures |
| Notification audit log | Log of when each recipient was notified, what was communicated, and the response |
| Data subject disclosure | If the data subject requests it, you must provide a list of all recipients who were notified |
Art.19 in Practice: Processor API Requirements
Art.19 means that every service your application sends personal data to must provide a way to:
- Delete a user's data by user ID
- Update/correct a user's data by user ID
- Restrict processing of a user's data
If a processor cannot support these operations, you cannot lawfully use it for EU personal data unless a contractual mechanism forces manual execution within the Art.19 timeframes.
Example processor obligations:
- Stripe: `DELETE /v1/customers/{id}`. Data purge must cascade to all related objects.
- Mailchimp/Klaviyo: DELETE subscriber, then remove from all lists and suppress.
- Datadog: must support log purge by user identifier (Art.17 applies to log data).
- Segment: Source Delete API must forward deletion to all connected destinations.
- Intercom: `DELETE /contacts/{id}`. Conversations are also subject to erasure if they contain personal data.
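An erasure cascade job can treat each processor entry as a path template and build its call plan before executing anything. The sketch below constructs `(method, url)` pairs only, keeping plan construction separate from the HTTP client so each call can be audited and retried; the helper name is mine, and the paths mirror the list above.

```python
def erasure_request(base_url: str, path_template: str, user_id: str) -> tuple[str, str]:
    """Return the (method, url) an Art.19 erasure propagation job would issue.

    `path_template` uses `{id}` as in the processor list above. Splitting
    plan construction from execution makes the cascade easy to log and retry.
    """
    path = path_template.format(id=user_id)
    return ("DELETE", base_url.rstrip("/") + path)

# Illustrative cascade plan for a single data subject.
plan = [
    erasure_request("https://api.stripe.com", "/v1/customers/{id}", "cus_123"),
    erasure_request("https://api.intercom.io", "/contacts/{id}", "cus_123"),
]
```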
Art.19 EDPB Enforcement 2025–2026
Case NL-AP-2026-02 (previously referenced): The Dutch DPA's €950K fine against a CRM platform was specifically grounded in Art.19 failure. The company correctly deleted data from its primary database in response to an Art.17 request. However, it had exported the same user data to an Elasticsearch search index and a Redshift analytics warehouse. Neither deletion was executed. The AP found the company violated Art.17 (incomplete erasure) and Art.19 (no notification to itself as the processor of these secondary systems, and no propagation to downstream recipients who had received the data).
Lesson: Art.19 applies to your internal data flows, not just external processors. If you replicate data to secondary systems, you are effectively a "recipient" of your own data and Art.19 requires that rectification and erasure propagate to all replicas.
Art.20: Right to Data Portability
Art.20 gives data subjects the right to receive their personal data in a structured, commonly used, machine-readable format — and to transmit that data to another controller directly.
Art.20(1): The Portability Right — Scope and Conditions
The data subject shall have the right to receive the personal data concerning him or her, which he or she has provided to a controller, in a structured, commonly used and machine-readable format and have the right to transmit those data to another controller without hindrance from the controller to which the personal data have been provided, where: (a) the processing is based on consent pursuant to point (a) of Art.6(1) or point (a) of Art.9(2) or on a contract pursuant to point (b) of Art.6(1); and (b) the processing is carried out by automated means.
Two cumulative conditions must be met:
Condition 1 — Legal basis: Art.20 applies only to data processed on the basis of consent (Art.6(1)(a), or explicit consent under Art.9(2)(a) for special-category data) or contract (Art.6(1)(b)). It does not apply to data processed on legitimate interests, legal obligation, vital interests, or public task. This is a significant limitation: data you process under legitimate interests is outside the scope of Art.20 portability.
Condition 2 — Automated processing: The processing must be by automated means. Manually maintained records are excluded.
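The two cumulative conditions collapse into a simple predicate, shown below with a third flag anticipating the "provided by the data subject" scoping discussed under Art.20(3). The function and value names are illustrative.

```python
PORTABLE_BASES = {"consent", "contract"}  # Art.6(1)(a) / Art.6(1)(b)

def is_portable(legal_basis: str, automated: bool, provided_by_subject: bool) -> bool:
    """Art.20(1): portable only if consent/contract AND automated processing.

    `provided_by_subject` reflects the additional scoping under Art.20(3)
    guidance: inferred or derived data is out of scope even when the two
    cumulative conditions hold.
    """
    return legal_basis in PORTABLE_BASES and automated and provided_by_subject
```

Tagging every data category with these three attributes at write time makes the export filter a one-line comprehension rather than a case-by-case legal review.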
Art.20(2): Direct Transmission Between Controllers
Art.20(2) creates the controller-to-controller direct transfer right:
Where technically feasible, the data subject shall have the right to have the personal data transmitted directly from one controller to another.
This means if a user requests their data ported from your platform to a competitor, and it is technically feasible, you must transmit the data directly rather than requiring the user to download and re-upload.
"Technically feasible" is interpreted broadly by the EDPB: if both controllers have APIs that can accept the data format, direct transmission is feasible. The standard is not whether you have built the integration but whether it is possible in principle.
Art.20(3): What Is NOT Subject to Portability
Paragraphs 1 and 2 shall not apply to processing necessary for the performance of a task carried out in the public interest or in the exercise of official authority.
Beyond this explicit exclusion, the EDPB has clarified in its guidelines that:
- Derived and inferred data is excluded: Data that the controller generated by processing the user's provided data (behavioral profiles, credit scores, risk classifications, recommendation models) is not subject to Art.20. Only data the user actually provided is portable.
- However: If the derived data was created with the user's active input (for example, responses to preference questionnaires), it may be considered "provided by" the data subject.
Art.20 Format Requirements
The EDPB guidelines on portability specify:
| Format Requirement | What It Means |
|---|---|
| Structured | Data is organized with clear labels and relationships — not a PDF or screen export |
| Commonly used | Formats like JSON, CSV, XML, FHIR — not proprietary binary formats |
| Machine-readable | Another system can parse and ingest the export without manual transformation |
| No hindrance | You may not charge fees, impose technical barriers, or delay unreasonably |
Minimum acceptable formats: JSON with schema documentation, CSV with header rows. Not acceptable: PDF, Excel with merged cells, HTML page screenshots, plain text dumps.
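For the CSV path, "structured" and "machine-readable" essentially mean a header row plus flat values that another system can parse without manual work. A minimal sketch using the standard library follows; the column names are illustrative.

```python
import csv
import io

def to_portable_csv(rows: list[dict]) -> str:
    """Serialise one data category as CSV with a header row (Art.20 format).

    A header row plus flat values keeps the export structured and parseable
    by a destination controller; field names here are illustrative only.
    """
    if not rows:
        return ""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

csv_text = to_portable_csv([
    {"email": "a@example.com", "signup_date": "2025-01-01"},
])
```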
Art.20 vs Art.15: Key Differences
| Dimension | Art.15 (Access) | Art.20 (Portability) |
|---|---|---|
| Legal basis required | Any | Consent or contract only |
| Processing type required | Any | Automated only |
| What data | All personal data | Only data "provided by" the data subject |
| Format | "Commonly used" | Machine-readable + structured |
| Derived/inferred data | Yes (EDPB IE decision) | No |
| Direct transfer right | No | Yes (Art.20(2)) |
Art.20 Engineering Obligations
| Obligation | What It Requires from Engineering |
|---|---|
| User data inventory | Know which data categories were "provided by" the user vs. inferred |
| Legal basis tagging | Tag data by legal basis — portability applies only to consent/contract data |
| Portable export endpoint | API endpoint or UI flow that returns structured JSON/CSV of portable data |
| Direct transfer API | If technically feasible, support controller-to-controller transfer endpoint |
| Schema documentation | Exported data must be interpretable — provide schema alongside the export |
| Format compliance | No proprietary formats; commonly used machine-readable formats only |
Art.20 EDPB Enforcement 2025–2026
Case DE-BfDI-2025-08: Federal Commissioner for Data Protection (BfDI) fined a social media analytics company €780K. The company provided Art.20 responses as PDFs of dashboard screenshots. The BfDI found these violated the "machine-readable" and "structured" requirements. The company's position — that PDFs were "commonly used" — was rejected: the relevant standard is whether another controller can ingest the data without manual intervention.
Case FR-CNIL-2026-01: CNIL fined a fitness tracking platform €1.1M. The platform provided portability exports that included inferred data (workout performance predictions, health risk scores, behavioral patterns) in the export package, alongside the user's raw workout logs. The CNIL ruled that while providing additional inferred data was not itself a violation, the platform's failure to distinguish provided vs. inferred data meant users and destination controllers could not determine what was portable and what was proprietary analysis — a violation of the clarity requirement. The larger fine component related to the platform's 6-week response time (Art.12(3) requires 1 month).
Cross-Article Decision Flowchart: Art.18/19/20
```
Data Subject Request Received
│
├─ "Restrict processing" → Art.18
│   ├─ Check ground: accuracy dispute / unlawful / legal claim / objection pending
│   ├─ Apply restriction state to all data processing pipelines
│   ├─ Notify all processors via Art.19
│   └─ Send pre-lift notification before resuming (Art.18(3))
│
├─ "Give me my data to take elsewhere" → Art.20
│   ├─ Check: legal basis = consent or contract? → Yes
│   ├─ Check: automated processing? → Yes
│   ├─ Filter: only data provided by subject (exclude inferred)
│   ├─ Generate machine-readable export (JSON/CSV with schema)
│   └─ If technically feasible: direct transfer to named controller
│
└─ Following Art.16 rectification or Art.17 erasure:
    └─ Art.19 triggers automatically
        ├─ Identify all recipients from data sharing log
        ├─ Notify each recipient: what changed/deleted
        └─ Log notifications + responses + provide list to data subject on request
```
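The branching above can be expressed as a dispatch table mapping request types to article workflows. The handlers below are placeholders; in a real DSAR pipeline each would call into the restriction, portability, and notification subsystems, and the request-type strings are my own naming.

```python
# Placeholder handlers standing in for the Art.18/19/20 subsystems.
def handle_art18(request): return "restriction_applied"
def handle_art19(request): return "recipients_notified"
def handle_art20(request): return "export_generated"

# Art.19 appears only as a follow-up step, never as a user-facing request type.
DSAR_DISPATCH = {
    "restrict_processing": [handle_art18, handle_art19],
    "portability": [handle_art20],
    "rectification_done": [handle_art19],
    "erasure_done": [handle_art19],
}

def dispatch(request_type: str, request: dict) -> list[str]:
    """Run every workflow registered for this request type, in order."""
    handlers = DSAR_DISPATCH.get(request_type)
    if handlers is None:
        raise ValueError(f"unknown DSAR request type: {request_type}")
    return [handler(request) for handler in handlers]
```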
Python Implementation: RestrictionManager + PortabilityExporter
```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum
from typing import Optional
import logging

logger = logging.getLogger(__name__)


class RestrictionGround(Enum):
    ACCURACY_DISPUTED = "accuracy_disputed"          # Art.18(1)(a)
    UNLAWFUL_PROCESSING = "unlawful_processing"      # Art.18(1)(b)
    LEGAL_CLAIM_RETENTION = "legal_claim_retention"  # Art.18(1)(c)
    ART21_OBJECTION_PENDING = "art21_objection"      # Art.18(1)(d)


class ProcessingState(Enum):
    ACTIVE = "active"
    RESTRICTED = "restricted"
    PENDING_VERIFICATION = "pending_verification"


@dataclass
class DataSubjectRecord:
    user_id: str
    state: ProcessingState = ProcessingState.ACTIVE
    restriction_ground: Optional[RestrictionGround] = None
    restriction_applied_at: Optional[datetime] = None
    restriction_expires_at: Optional[datetime] = None
    pre_lift_notified: bool = False
    audit_log: list = field(default_factory=list)


class RestrictionManager:
    """Art.18 restriction state machine with Art.19 notification cascade."""

    def __init__(self, db, notification_service, processor_registry):
        self.db = db
        self.notifications = notification_service
        self.processors = processor_registry

    def apply_restriction(
        self,
        user_id: str,
        ground: RestrictionGround,
        requested_by: str,
        expires_at: Optional[datetime] = None,
    ) -> DataSubjectRecord:
        """Apply Art.18 restriction and propagate to all processors (Art.19)."""
        record = self.db.get_or_create(user_id)
        record.state = ProcessingState.RESTRICTED
        record.restriction_ground = ground
        record.restriction_applied_at = datetime.utcnow()
        record.restriction_expires_at = expires_at
        record.audit_log.append({
            "event": "restriction_applied",
            "ground": ground.value,
            "requested_by": requested_by,
            "timestamp": datetime.utcnow().isoformat(),
        })
        self.db.save(record)
        # Art.19: notify all recipients of restriction
        self._propagate_restriction_to_processors(user_id, ground)
        logger.info("Art.18 restriction applied: user=%s ground=%s", user_id, ground.value)
        return record

    def _propagate_restriction_to_processors(self, user_id: str, ground: RestrictionGround):
        """Art.19: inform all recipients of the restriction."""
        for processor in self.processors.list_recipients(user_id):
            try:
                processor.apply_restriction(user_id=user_id, ground=ground.value)
                logger.info("Art.19 notification sent: processor=%s user=%s", processor.name, user_id)
            except Exception as exc:
                # Log failure: the disproportionate-effort exception requires documentation
                logger.error(
                    "Art.19 notification failed: processor=%s user=%s error=%s",
                    processor.name, user_id, exc,
                )

    def can_process(self, user_id: str) -> bool:
        """Processing gate: check before using any user data in pipelines."""
        record = self.db.get(user_id)
        if record is None:
            return True
        if record.state == ProcessingState.RESTRICTED:
            logger.warning("Processing blocked by Art.18: user=%s", user_id)
            return False
        return True

    def can_delete(self, user_id: str) -> bool:
        """Deletion gate: restriction blocks Art.17 erasure pipeline."""
        record = self.db.get(user_id)
        if record and record.state == ProcessingState.RESTRICTED:
            logger.warning("Deletion blocked by Art.18 restriction: user=%s", user_id)
            return False
        return True

    def lift_restriction(self, user_id: str, lifted_by: str):
        """Art.18(3): notify data subject before lifting, then lift."""
        record = self.db.get(user_id)
        if not record or record.state != ProcessingState.RESTRICTED:
            return
        if not record.pre_lift_notified:
            # Art.18(3): notify before lifting
            self.notifications.send_pre_lift_notice(user_id=user_id)
            record.pre_lift_notified = True
            self.db.save(record)
            # Do not lift yet: wait for the confirmation window
            return
        record.state = ProcessingState.ACTIVE
        record.restriction_ground = None
        record.audit_log.append({
            "event": "restriction_lifted",
            "lifted_by": lifted_by,
            "timestamp": datetime.utcnow().isoformat(),
        })
        self.db.save(record)
        self._propagate_restriction_lift_to_processors(user_id)

    def _propagate_restriction_lift_to_processors(self, user_id: str):
        """Art.19: inform all recipients when restriction is lifted."""
        for processor in self.processors.list_recipients(user_id):
            try:
                processor.lift_restriction(user_id=user_id)
            except Exception as exc:
                logger.error("Art.19 lift notification failed: processor=%s error=%s", processor.name, exc)


class PortabilityExporter:
    """Art.20 portable data export: consent/contract data only, machine-readable."""

    PORTABLE_LEGAL_BASES = {"consent", "contract"}

    def __init__(self, db, data_inventory):
        self.db = db
        self.inventory = data_inventory

    def export(self, user_id: str) -> dict:
        """
        Generate Art.20 portable export.

        Excludes inferred/derived data (not 'provided by' the data subject).
        Returns structured JSON with schema documentation.
        """
        # Filter to consent/contract legal basis + automated processing only
        portable_categories = [
            cat for cat in self.inventory.categories_for_user(user_id)
            if cat.legal_basis in self.PORTABLE_LEGAL_BASES
            and cat.is_automated
            and cat.is_provided_by_subject  # exclude inferred/derived
        ]
        export_data = {}
        for category in portable_categories:
            export_data[category.name] = self.db.export_category(
                user_id=user_id,
                category=category.name,
            )
        return {
            "schema_version": "gdpr-art20-v1",
            "export_date": datetime.utcnow().isoformat(),
            "user_id": user_id,
            "legal_basis_note": "Includes only data processed under consent or contract per Art.20(1)(a)",
            "excluded_note": "Inferred and derived data excluded per Art.20(1): not 'provided by' the data subject",
            "data": export_data,
            "schema": {cat.name: cat.schema for cat in portable_categories},
        }

    def export_for_transfer(self, user_id: str, destination_controller_api: str) -> bool:
        """Art.20(2): direct controller-to-controller transfer where technically feasible."""
        export_payload = self.export(user_id)
        try:
            import requests  # third-party HTTP client

            response = requests.post(
                destination_controller_api,
                json=export_payload,
                timeout=30,
                headers={"Content-Type": "application/json"},
            )
            response.raise_for_status()
            logger.info("Art.20(2) direct transfer complete: user=%s destination=%s", user_id, destination_controller_api)
            return True
        except Exception as exc:
            logger.error("Art.20(2) direct transfer failed: %s", exc)
            return False
```
Art.19 Processor Notification: Implementation Pattern
```python
from datetime import datetime  # needed for disclosure timestamps


class ProcessorRegistry:
    """Tracks all recipients of personal data for Art.19 notification cascade."""

    def __init__(self, db):
        self.db = db

    def register_disclosure(self, user_id: str, processor_name: str, data_categories: list, api_client):
        """Log every time personal data is sent to a processor."""
        self.db.insert_recipient(
            user_id=user_id,
            processor=processor_name,
            categories=data_categories,
            disclosed_at=datetime.utcnow().isoformat(),
            api_client=api_client,
        )

    def list_recipients(self, user_id: str) -> list:
        return self.db.get_recipients(user_id)

    def notify_all_recipients(self, user_id: str, event: str, payload: dict):
        """Art.19: notify all recipients of rectification, erasure, or restriction."""
        results = []
        for recipient in self.list_recipients(user_id):
            try:
                recipient.api_client.notify(event=event, user_id=user_id, payload=payload)
                results.append({"processor": recipient.processor, "status": "notified"})
            except Exception as exc:
                results.append({"processor": recipient.processor, "status": "failed", "error": str(exc)})
        return results
```
Art.18 × Art.19 × Art.20 Compliance Checklist (30 Items)
Art.18 — Right to Restriction (12 items)
- R-01 Processing state machine implemented: active, restricted, pending-verification states per data subject
- R-02 Art.18(1)(a) ground handled: accuracy dispute triggers restriction until verification complete
- R-03 Art.18(1)(b) ground handled: unlawful-processing restriction prevents erasure
- R-04 Art.18(1)(c) ground handled: legal claim retention exempts record from retention deletion
- R-05 Art.18(1)(d) ground handled: Art.21 objection pending triggers automatic restriction
- R-06 Processing gate implemented: all data pipelines check restriction status before using data
- R-07 Deletion gate implemented: retention/deletion jobs check restriction before executing
- R-08 Art.19 propagation: restriction applied to all downstream processors via API/webhook
- R-09 Art.18(2) compliant: restricted data permitted only for storage, legal claims, rights protection
- R-10 Art.18(3) pre-lift notification sent to data subject before restriction removed
- R-11 Audit log: every restriction application, gate check, and lift event logged with timestamps
- R-12 Response within 1 month (Art.12(3) timeline applies to Art.18 requests)
Art.19 — Notification Cascade (8 items)
- N-01 Processor/recipient registry maintained: log of every entity that received user personal data
- N-02 Art.19 triggers on Art.16 rectification: all recipients notified of data correction
- N-03 Art.19 triggers on Art.17 erasure: all recipients notified of deletion
- N-04 Art.19 triggers on Art.18 restriction: all recipients notified of restriction state
- N-05 Impossible/disproportionate exceptions documented with evidence (not claimed as default)
- N-06 Notification audit log: who was notified, when, what was communicated, response received
- N-07 Secondary systems (replicas, analytics warehouses, search indexes) included in recipient registry
- N-08 Data subject disclosure: list of recipients provided on request per Art.19
Art.20 — Data Portability (10 items)
- P-01 Legal basis tagging: data categorised by legal basis (consent, contract, legitimate interests, etc.)
- P-02 Portability scope correctly limited: only consent/contract + automated processing data
- P-03 Inferred/derived data excluded from portability export
- P-04 Provided-by-subject data identified: clear distinction from controller-generated data
- P-05 Machine-readable format: JSON or CSV with schema — no PDF, no Excel screenshots
- P-06 Schema documentation accompanies export — destination controller can parse without manual work
- P-07 First export is free of charge (no fee for initial portability request)
- P-08 Art.20(2) direct transfer: mechanism exists for controller-to-controller transfer where feasible
- P-09 Response within 1 month (Art.12(3) timeline applies)
- P-10 DSAR pipeline handles Art.20 as a distinct right from Art.15 (different scope, different format)
sota.io Advantage: Art.19 Simplified
Art.19's notification cascade becomes dramatically simpler when your infrastructure is EU-native:
With US-hosted processors: Every Art.17 erasure triggers Art.19 notifications to multiple US-based processors (AWS S3, Cloudflare, Datadog US, etc.) — each with their own deletion APIs, response times, and compliance documentation requirements. The Art.15(2) international transfer disclosure also applies.
With sota.io (EU-native): Your infrastructure runs in EU jurisdiction. Log data, analytics, and storage stay EU-side. The set of processors subject to Art.19 notification is smaller, all operating under GDPR directly rather than SCCs, and the Art.15(2) transfer disclosure obligation is eliminated for hosting infrastructure.
Art.19 compliance effort scales with processor count. Fewer processors = less notification cascade = lower operational cost of GDPR compliance.