2026-04-17 · 17 min read

GDPR Art.21–22: Right to Object & Automated Decision-Making — Developer Guide (2026)

Post #423 in the sota.io EU Cyber Compliance Series

Art.21 and Art.22 are the two data subject rights most commonly underestimated by SaaS engineering teams. Art.21 gives data subjects a right to stop processing based on legitimate interests or the public-task basis — including an absolute right to opt out of direct marketing that no business justification can override. Art.22 prohibits decisions based solely on automated processing that produce legal or similarly significant effects unless strict conditions are met.

Together they govern two of the highest-risk processing activities in modern SaaS: behavioural marketing and algorithmic decision-making. Both have generated substantial enforcement action by EU supervisory authorities. This guide translates both articles into engineering obligations.


GDPR Chapter III: Art.21–22 in Context

| Article | Right | Scope | Response Window |
| --- | --- | --- | --- |
| Art.15 | Access | Copy of personal data | 1 month |
| Art.16 | Rectification | Correct inaccurate data | 1 month |
| Art.17 | Erasure | Delete data | Without undue delay |
| Art.18 | Restriction | Freeze processing | 1 month |
| Art.19 | Notification | Inform recipients | Without undue delay |
| Art.20 | Portability | Machine-readable export | 1 month |
| Art.21 | Objection | Stop processing (legitimate interests / direct marketing) | Immediately (marketing) / overridable (other) |
| Art.22 | Automated decisions | No solely-automated significant-effect decisions | Human review on request |

Art.21 feeds directly into Art.22: when a data subject objects to profiling under Art.21(1), and that profiling feeds an automated decision, Art.22 independently restricts what the automated system may do.


Art.21: Right to Object

Art.21 gives data subjects the right to object to processing of their personal data in two distinct scenarios. The two scenarios have very different legal weight for the controller.

Art.21(1): Objection to Legitimate-Interest Processing

The data subject shall have the right to object, on grounds relating to his or her particular situation, at any time to processing of personal data concerning him or her which is based on point (e) or (f) of Art.6(1) [public task / legitimate interests], including profiling based on those provisions.

When a data subject invokes Art.21(1), the controller must stop processing unless it can demonstrate:

  1. Compelling legitimate grounds that override the data subject's interests, rights, and freedoms, or
  2. The processing is necessary for the establishment, exercise, or defence of legal claims

This is a rebuttable objection. The data subject does not have an absolute right — the controller may continue processing if it demonstrates its grounds are compelling. But the default is stop: you must actively rebut the objection to continue.

What "particular situation" means in practice:

The data subject does not need to give a detailed legal argument. "I don't want you using my browsing history for targeting" qualifies. EDPB guidelines specify that the burden of demonstrating compelling legitimate grounds falls entirely on the controller.

Engineering obligations for Art.21(1):

1. Accept objection via account settings, DSAR form, or email
2. Immediately suspend processing based on Art.6(1)(f) for that user
3. Trigger Art.18(1)(d): apply restriction status during evaluation period
4. Conduct LIA (Legitimate Interests Assessment) within the 1-month response window
5. If LIA rebuttal not possible: delete data / anonymise / cease processing permanently
6. If LIA rebuttal possible: document and notify data subject they may complain to DPA
7. If Art.21(1) is accompanied by Art.21(5) request (online means): handle via automated channel

Art.21(2): Absolute Objection to Direct Marketing

Where personal data are processed for the purposes of direct marketing, the data subject shall have the right to object at any time to processing of personal data concerning him or her for such marketing, including profiling to the extent that it is related to such direct marketing.

This is unconditional. Unlike Art.21(1), the controller has no override. There is no "compelling legitimate grounds" defence for direct marketing.

What triggers Art.21(2):

  - Promotional emails, SMS, and push notifications
  - Newsletters whose primary purpose is to promote offers, features, or upgrades
  - Retargeting and audience-matching based on the user's data
  - Profiling carried out to target or personalise any of the above

What does not trigger Art.21(2):

  - Transactional messages: receipts, invoices, password resets, delivery updates
  - Security and service-integrity notices
  - Legally required communications (e.g. changes to terms or privacy notices)

The line between "transactional" and "promotional" is a frequent enforcement issue. EDPB guidance: if the primary purpose of the message is to stimulate a purchase or drive engagement with a commercial offer, it is direct marketing regardless of whether it also contains useful information.
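As a rough illustration of that primary-purpose test, a send-path classifier might look like the sketch below. The marker keywords and message types are hypothetical — in practice, classification should happen at template-design time, not via runtime keyword matching:

```python
# Illustrative sketch: classify an outbound message under the EDPB
# "primary purpose" test. All markers/types here are assumptions.
PROMOTIONAL_MARKERS = {"upgrade", "discount", "offer", "trial", "upsell"}
TRANSACTIONAL_TYPES = {"receipt", "password_reset", "security_alert", "invoice"}

def is_direct_marketing(message_type: str, body: str) -> bool:
    if message_type in TRANSACTIONAL_TYPES:
        # A transactional wrapper does not launder promotional content:
        # if the body pushes a commercial offer, Art.21(2) still applies.
        return any(marker in body.lower() for marker in PROMOTIONAL_MARKERS)
    return True  # Default: treat non-transactional sends as marketing
```

The key design point is the asymmetric default: anything not positively known to be transactional is gated by the marketing opt-out.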

Art.21(3): Right Must Be Communicated at First Contact

At the latest at the time of the first communication with the data subject, the right referred to in paragraphs 1 and 2 shall be explicitly brought to the attention of the data subject and shall be presented clearly and separately from any other information.

This creates a display obligation — you cannot bury the opt-out in a privacy policy. The marketing objection right must be:

  - Explicitly brought to the data subject's attention
  - Presented clearly
  - Presented separately from any other information (not folded into terms acceptance or a privacy policy link)

Engineering pattern: the signup form must include a visible, unticked checkbox or toggle for direct marketing — not just an "I agree to the Privacy Policy" checkbox. Pre-ticked boxes and bundled consent do not satisfy this; the marketing choice must be presented as a separate control.
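A minimal server-side sketch of this pattern (field names hypothetical): the marketing preference is read as a distinct field and never defaults to opted-in:

```python
# Hypothetical signup normaliser: the marketing preference is a separate,
# explicit field that defaults to opted-out when absent or non-boolean.
def normalise_signup(payload: dict) -> dict:
    """Art.21(3): marketing opt-in must be a distinct, affirmative choice."""
    return {
        "email": payload["email"],
        # Missing, falsy, or non-boolean value => opted out.
        # Never default to True server-side.
        "marketing_opt_in": payload.get("marketing_opt_in") is True,
    }
```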

Art.21(5): Electronic Objection via Automated Means

In the context of the use of information society services, and notwithstanding Directive 2002/58/EC, the data subject may exercise his or her right to object by automated means using technical specifications.

Data subjects may object via automated channels — unsubscribe links, account settings, API calls. Your system must process these without requiring human approval or introducing friction (e.g., "click here to confirm you really want to unsubscribe" loops beyond a single confirmation).
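For email, one common way to implement Art.21(5) is RFC 8058 one-click unsubscribe headers, which let mail clients object on the user's behalf with a single POST. A sketch using the standard library (the URL is a placeholder):

```python
from email.message import EmailMessage

def build_marketing_email(to_addr: str, unsubscribe_token: str) -> EmailMessage:
    """Attach RFC 8058 one-click unsubscribe headers so mail clients can
    object on the user's behalf without a confirmation loop."""
    msg = EmailMessage()
    msg["To"] = to_addr
    msg["Subject"] = "Product news"
    # Placeholder URL: the endpoint must accept an unauthenticated POST
    # and apply the Art.21(2) opt-out immediately.
    msg["List-Unsubscribe"] = f"<https://example.com/unsubscribe/{unsubscribe_token}>"
    msg["List-Unsubscribe-Post"] = "List-Unsubscribe=One-Click"
    msg.set_content("...")
    return msg
```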

Art.21(6): Research and Statistics Exception

Where personal data are processed for scientific or historical research purposes or statistical purposes pursuant to Art.89(1), the data subject, on grounds relating to his or her particular situation, shall have the right to object to processing of personal data concerning him or her, unless the processing is necessary for the performance of a task carried out for reasons of public interest.

Research processing may override an Art.21(1) objection if anonymisation or aggregation is not possible and the research serves a public interest. This exception is narrow — it does not apply to commercial analytics or A/B testing.


Art.21 Engineering Implementation

Processing Gate Architecture

Every processing pipeline that runs on legitimate-interests basis needs an objection gate:

from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum
from typing import Optional
import logging

logger = logging.getLogger(__name__)


class ObjectionStatus(Enum):
    NONE = "none"                      # No objection filed
    PENDING = "pending"                # Objection received, LIA in progress
    UPHELD = "upheld"                  # Objection upheld, processing stopped
    REBUTTED = "rebutted"              # Controller rebutted with compelling grounds
    MARKETING_OPT_OUT = "marketing_opt_out"  # Art.21(2) absolute opt-out


class ProcessingBasis(Enum):
    LEGITIMATE_INTERESTS = "legitimate_interests"  # Art.6(1)(f) — subject to Art.21(1)
    DIRECT_MARKETING = "direct_marketing"          # Art.6(1)(f) marketing — subject to Art.21(2)
    CONTRACT = "contract"                          # Art.6(1)(b) — NOT subject to Art.21
    CONSENT = "consent"                            # Art.6(1)(a) — NOT subject to Art.21 (withdraw via Art.7(3))
    PUBLIC_TASK = "public_task"                    # Art.6(1)(e) — subject to Art.21(1)


@dataclass
class ObjectionRecord:
    user_id: str
    basis: ProcessingBasis
    status: ObjectionStatus
    filed_at: datetime
    resolved_at: Optional[datetime] = None
    resolution_note: Optional[str] = None
    lia_document_url: Optional[str] = None


class ObjectionHandler:
    """
    GDPR Art.21 objection management.
    
    Art.21(1): Objection to legitimate-interest/public-task processing.
    Art.21(2): Absolute objection to direct marketing — no override.
    """

    def __init__(self, db, notification_service, lia_service):
        self.db = db
        self.notify = notification_service
        self.lia = lia_service

    def file_objection(
        self,
        user_id: str,
        basis: ProcessingBasis,
        particular_situation: str = "",
    ) -> ObjectionRecord:
        """Accept a data subject objection under Art.21."""

        # Art.21(2): direct marketing objection is absolute — immediately upheld
        if basis == ProcessingBasis.DIRECT_MARKETING:
            record = ObjectionRecord(
                user_id=user_id,
                basis=basis,
                status=ObjectionStatus.MARKETING_OPT_OUT,
                filed_at=datetime.now(timezone.utc),
                resolved_at=datetime.now(timezone.utc),
                resolution_note="Art.21(2) absolute marketing opt-out — no override permitted.",
            )
            self._persist(record)
            self._block_marketing_processing(user_id)
            self._send_confirmation(user_id, upheld=True)
            logger.info("Art.21(2) marketing opt-out applied for user %s", user_id)
            return record

        # Art.21(1): legitimate interests — trigger restriction, start LIA
        record = ObjectionRecord(
            user_id=user_id,
            basis=basis,
            status=ObjectionStatus.PENDING,
            filed_at=datetime.now(timezone.utc),
        )
        self._persist(record)
        self._apply_art18_restriction(user_id)   # Art.18(1)(d): restriction during evaluation
        self._schedule_lia(user_id, record, particular_situation)
        self._send_confirmation(user_id, upheld=None)  # Acknowledge receipt
        logger.info(
            "Art.21(1) objection filed for user %s — restriction applied, LIA scheduled",
            user_id,
        )
        return record

    def resolve_objection(
        self,
        user_id: str,
        upheld: bool,
        resolution_note: str,
        lia_url: Optional[str] = None,
    ) -> None:
        """Resolve an Art.21(1) objection after LIA."""
        record = self._load_pending(user_id)
        record.status = ObjectionStatus.UPHELD if upheld else ObjectionStatus.REBUTTED
        record.resolved_at = datetime.now(timezone.utc)
        record.resolution_note = resolution_note
        record.lia_document_url = lia_url
        self._persist(record)

        if upheld:
            self._stop_legitimate_interest_processing(user_id)
            self._lift_restriction(user_id)  # Restriction no longer needed — processing stopped
        else:
            self._lift_restriction(user_id)  # LIA rebutted — lift restriction, resume processing

        self._send_resolution_notice(user_id, upheld=upheld, note=resolution_note)

    def may_process(
        self,
        user_id: str,
        basis: ProcessingBasis,
    ) -> bool:
        """Gate: returns False if processing is blocked by Art.21 objection."""
        record = self.db.get_latest_objection(user_id, basis)
        if record is None:
            return True
        if record.status in (
            ObjectionStatus.UPHELD,
            ObjectionStatus.MARKETING_OPT_OUT,
            ObjectionStatus.PENDING,  # Restriction during evaluation
        ):
            return False
        return True  # REBUTTED or NONE

    def _apply_art18_restriction(self, user_id: str) -> None:
        self.db.set_restriction_status(user_id, "art21_objection_pending")

    def _lift_restriction(self, user_id: str) -> None:
        self.db.clear_restriction_status(user_id, "art21_objection_pending")

    def _block_marketing_processing(self, user_id: str) -> None:
        self.db.set_marketing_opt_out(user_id, True)

    def _stop_legitimate_interest_processing(self, user_id: str) -> None:
        self.db.set_legitimate_interest_blocked(user_id, True)

    def _schedule_lia(self, user_id, record, situation):
        # 28 days leaves margin inside the 1-month Art.12(3) response window
        self.lia.schedule_assessment(user_id, record, situation, deadline_days=28)

    def _send_confirmation(self, user_id, upheld):
        self.notify.send_dsar_acknowledgment(user_id, right="Art.21", upheld=upheld)

    def _send_resolution_notice(self, user_id, upheld, note):
        self.notify.send_dsar_resolution(user_id, right="Art.21", upheld=upheld, note=note)

    def _persist(self, record: ObjectionRecord) -> None:
        self.db.upsert_objection(record)

    def _load_pending(self, user_id: str) -> ObjectionRecord:
        return self.db.get_pending_objection(user_id)
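
To see the gate pattern in use, here is a deliberately minimal in-memory version (all names illustrative): every send or analytics path consults the objection state before touching a user's data:

```python
# Minimal in-memory sketch of the Art.21 processing gate: every pipeline
# checks the objection state before processing a user's data.
class InMemoryObjectionStore:
    def __init__(self) -> None:
        self.marketing_opt_out: set[str] = set()  # Art.21(2): absolute
        self.restricted: set[str] = set()         # Art.18(1)(d) during LIA

    def may_send_marketing(self, user_id: str) -> bool:
        return user_id not in self.marketing_opt_out

    def may_run_legitimate_interest_job(self, user_id: str) -> bool:
        return user_id not in self.restricted

store = InMemoryObjectionStore()
store.marketing_opt_out.add("u1")   # Art.21(2) opt-out filed
store.restricted.add("u2")          # Art.21(1) objection pending LIA

campaign_audience = [u for u in ("u1", "u2", "u3") if store.may_send_marketing(u)]
analytics_audience = [u for u in ("u1", "u2", "u3") if store.may_run_legitimate_interest_job(u)]
```

The point is where the check lives: in the pipeline's read path, not in a downstream send step that can be bypassed.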

Art.21 DPA Enforcement Cases (2025–2026)

IE-DPC-2025-04 — €2.7M: Marketing Emails After Opt-Out

An Irish DPC investigation found that a SaaS company continued sending promotional emails to users who had clicked "unsubscribe" links. The unsubscribe process removed users from one marketing list but left them on a secondary "product update" list that was being used for promotional content. The DPC ruled the distinction was cosmetic — the secondary list served marketing purposes.

Engineering lesson: All lists that receive messages promoting features, new products, pricing changes, or upsells must be governed by the Art.21(2) opt-out. The controller cannot preserve a secondary marketing channel by relabelling it as "product information."

DE-DSK-2025-09 — €1.8M: No Art.21 Opt-Out at Signup

A German DPA found that a B2C platform's signup flow contained no separate marketing opt-out. Users who registered were automatically enrolled in marketing communications. The company's defence — that the privacy policy contained an unsubscribe instruction — was rejected. Art.21(3) requires the right be explicitly communicated at first contact, separately from other information.

Engineering lesson: Signup forms must present a distinct marketing opt-out. A privacy policy link does not satisfy Art.21(3).

FR-CNIL-2025-17 — €940K: Art.21(1) Objection Ignored for 8 Months

A French controller received an Art.21(1) objection via its DSAR form but failed to stop legitimate-interest processing while conducting its LIA. For 8 months, behavioural data from the objecting user continued to feed targeting pipelines. CNIL found the failure to apply Art.18(1)(d) restriction during the evaluation period was a separate violation from the Art.21(1) failure.

Engineering lesson: Filing an Art.21(1) objection must immediately trigger restriction. The LIA review period is not a grace period for continued processing.

NL-AP-2026-01 — €3.2M: Algorithmic Scoring Used After Art.21(2) Opt-Out

A Dutch analytics platform continued using customer engagement scores — derived from marketing interaction data — after users had opted out of marketing under Art.21(2). The controller argued the scoring served an internal business analytics purpose, not marketing. AP found the scores were used to determine marketing contact frequency and qualify users for upsell campaigns — therefore the profiling feeding those scores was direct marketing processing subject to Art.21(2).

Engineering lesson: Art.21(2) opt-out must propagate to all profiling pipelines that ultimately feed marketing decisions — not only to the email-send step.
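That propagation can be sketched as a filter applied inside the profiling job itself, not at the send step (function and data shapes hypothetical):

```python
# Sketch: the Art.21(2) opt-out excludes users from marketing-purpose
# profiling itself — the NL-AP-2026-01 failure mode was filtering only
# at the email-send step.
def compute_engagement_scores(
    events: dict[str, list[int]], opted_out: set[str]
) -> dict[str, float]:
    scores: dict[str, float] = {}
    for user_id, interactions in events.items():
        if user_id in opted_out:
            continue  # No marketing-purpose profiling for opted-out users
        scores[user_id] = sum(interactions) / max(len(interactions), 1)
    return scores
```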


Art.22: Automated Individual Decision-Making

Art.22 addresses a distinct but connected right: the right not to be subject to a decision based solely on automated processing that produces significant effects on the data subject.

Art.22(1): The Base Prohibition

The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal or similarly significant effects concerning him or her.

Three requirements must be satisfied simultaneously for Art.22(1) to apply:

  1. A decision concerning the data subject — including one based on profiling under Art.4(4)
  2. Based solely on automated processing — no meaningful human involvement in the decision
  3. Legal or similarly significant effects — the decision must have real-world consequences of that magnitude

What "significant effects" means:

The EDPB has clarified that "significant effects" are effects "that appreciably affect the circumstances, behaviour, or choices of the individuals concerned; that have a prolonged impact; or that cause physical, financial or reputational harm."

Examples that qualify:

  - Automated refusal of a loan or credit application
  - Algorithmic rejection of job candidates with no human review
  - Automated account suspension that cuts off a service the user depends on
  - Risk-based automated pricing of insurance premiums

Examples that do not typically qualify:

  - Ordinary targeted advertising (though intrusive profiling of vulnerable groups can cross the line)
  - Content recommendations and search ranking
  - A/B test bucket assignment

Art.22(2): The Three Exceptions

Art.22(1) does not apply when the decision is:

(a) Necessary for entering into, or the performance of, a contract between the data subject and a data controller

Example: automated fraud detection that stops a payment to protect the user's account from unauthorised charges. The decision is necessary to perform the payment service contract.

(b) Authorised by Union or Member State law with suitable safeguards

Example: regulatory requirements that mandate automated AML screening of transactions.

(c) Based on the data subject's explicit consent

Note: this requires explicit consent (Art.4(11) + Art.7) — not merely consent to terms of service. A checkbox for "I agree to the Terms" is insufficient.

Art.22(3): Mandatory Safeguards When Exceptions Apply

When Art.22(2)(a) or (c) applies, the controller must implement all three of the following safeguards:

  1. The right to obtain human intervention — the data subject may request a human review of the automated decision
  2. The right to express their point of view — the data subject may submit context the automated system did not consider
  3. The right to contest the decision — the data subject may challenge the decision outcome

These safeguards are not optional add-ons. They are conditions of the Art.22(2) exception. A controller relying on "contract necessity" to run automated decisions without providing a human review mechanism is in violation of Art.22.

Art.22(4): Special Category Data

Decisions referred to in paragraph 2 shall not be based on special categories of personal data referred to in Art.9(1), unless point (a) or (g) of Art.9(2) applies and suitable measures to safeguard the data subject's rights and freedoms and legitimate interests are in place.

Special category data — health data, biometric data, racial or ethnic origin, political opinions, religious beliefs, trade union membership, genetic data, sexual orientation — may only be used in automated decisions if:

  1. Art.9(2)(a) applies (the data subject gave explicit consent to processing that special category data) or Art.9(2)(g) applies (substantial public interest under Union or Member State law), and
  2. Suitable measures to safeguard the data subject's rights, freedoms, and legitimate interests are in place
This is a double lock: you need both a legal basis under Art.9 and compliance with Art.22. Using health data in an automated insurance risk assessment, for example, requires explicit consent for both the health data processing (Art.9(2)(a)) and for the automated decision-making (Art.22(2)(c)).
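A sketch of that dual check, assuming consent records are simple flags (the field names are hypothetical, not a standard schema):

```python
# Sketch of the Art.22(4) double lock: two independent explicit consents,
# verified separately before an automated health-based pricing run.
def may_run_health_pricing(consents: dict[str, bool]) -> tuple[bool, str]:
    if not consents.get("art9_2a_health_data"):
        return False, "Missing Art.9(2)(a) explicit consent for health data"
    if not consents.get("art22_2c_automated_decision"):
        return False, "Missing Art.22(2)(c) explicit consent for automated decision"
    return True, "Both locks satisfied"
```

Keeping the two consents as separate records mirrors the legal requirement that they be collected as separate consent actions.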


Art.22 Engineering Implementation

Automated Decision Gate

from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum
from typing import Optional, Any
import uuid
import logging

logger = logging.getLogger(__name__)


class DecisionException(Enum):
    CONTRACT_NECESSITY = "contract_necessity"     # Art.22(2)(a)
    LEGAL_AUTHORISATION = "legal_authorisation"   # Art.22(2)(b)
    EXPLICIT_CONSENT = "explicit_consent"         # Art.22(2)(c)
    NOT_SOLELY_AUTOMATED = "not_solely_automated" # Art.22(1) threshold not met
    NOT_SIGNIFICANT = "not_significant"           # Art.22(1) significance threshold not met


class SpecialCategoryInvolvement(Enum):
    NONE = "none"
    PRESENT = "present"  # Art.9 data involved — requires dual lock


@dataclass
class AutomatedDecision:
    decision_id: str
    user_id: str
    decision_type: str
    outcome: Any
    model_version: str
    input_features: dict
    exception_basis: DecisionException
    involves_special_category: SpecialCategoryInvolvement
    created_at: datetime
    human_review_requested: bool = False
    human_review_completed_at: Optional[datetime] = None
    human_reviewer_id: Optional[str] = None
    human_outcome: Optional[Any] = None
    contested: bool = False


class AutomatedDecisionGate:
    """
    GDPR Art.22 gate for automated individual decision-making.
    
    Enforces:
    - Art.22(1): Prohibits solely automated decisions with significant effects
      unless an Art.22(2) exception applies.
    - Art.22(3): Ensures human review / expression / contest mechanisms exist
      when Art.22(2)(a) or (c) applies.
    - Art.22(4): Special category double-lock check.
    """

    # Decision types classified as "significant effects" under Art.22(1)
    SIGNIFICANT_DECISION_TYPES = {
        "fraud_block",
        "credit_assessment",
        "account_suspension",
        "loan_approval",
        "insurance_pricing",
        "recruitment_screening",
        "identity_verification_fail",
        "compliance_block",
    }

    def __init__(self, db, consent_service, human_review_queue, notification_service):
        self.db = db
        self.consent = consent_service
        self.review_queue = human_review_queue
        self.notify = notification_service

    def may_automate(
        self,
        user_id: str,
        decision_type: str,
        exception_basis: DecisionException,
        involves_special_category: SpecialCategoryInvolvement = SpecialCategoryInvolvement.NONE,
    ) -> tuple[bool, str]:
        """
        Gate check: may this automated decision proceed?
        Returns (allowed, reason).
        """

        # Not a significant decision — Art.22(1) threshold not met
        if decision_type not in self.SIGNIFICANT_DECISION_TYPES:
            return True, "Art.22(1) significance threshold not met"

        # Art.22(4): special category double-lock
        if involves_special_category == SpecialCategoryInvolvement.PRESENT:
            if exception_basis not in (
                DecisionException.CONTRACT_NECESSITY,
                DecisionException.EXPLICIT_CONSENT,
            ):
                return False, "Art.22(4): special category requires contract or explicit consent basis"
            if not self.consent.has_explicit_art9_consent(user_id):
                return False, "Art.22(4): explicit Art.9(2)(a) consent required for special category"

        # Art.22(1) threshold: with meaningful human involvement the decision
        # is not "solely automated", so Art.22 does not apply at all
        if exception_basis == DecisionException.NOT_SOLELY_AUTOMATED:
            return True, "Art.22(1): not solely automated (meaningful human involvement)"

        # Art.22(2) exception must apply
        if exception_basis == DecisionException.LEGAL_AUTHORISATION:
            return True, "Art.22(2)(b): legal authorisation"

        if exception_basis == DecisionException.CONTRACT_NECESSITY:
            # Art.22(3): human review safeguards must be in place (checked structurally)
            return True, "Art.22(2)(a): contract necessity — human review mechanism required"

        if exception_basis == DecisionException.EXPLICIT_CONSENT:
            if not self.consent.has_explicit_art22_consent(user_id):
                return False, "Art.22(2)(c): explicit consent required but not found"
            return True, "Art.22(2)(c): explicit consent confirmed"

        return False, "Art.22(1): no exception applies — automated decision not permitted"

    def record_decision(
        self,
        user_id: str,
        decision_type: str,
        outcome: Any,
        model_version: str,
        input_features: dict,
        exception_basis: DecisionException,
        involves_special_category: SpecialCategoryInvolvement = SpecialCategoryInvolvement.NONE,
    ) -> AutomatedDecision:
        """Record an automated decision with full audit trail."""
        decision = AutomatedDecision(
            decision_id=str(uuid.uuid4()),
            user_id=user_id,
            decision_type=decision_type,
            outcome=outcome,
            model_version=model_version,
            input_features=input_features,
            exception_basis=exception_basis,
            involves_special_category=involves_special_category,
            created_at=datetime.now(timezone.utc),
        )
        self.db.store_decision(decision)
        self._notify_data_subject(user_id, decision)
        logger.info(
            "Automated decision recorded: %s for user %s outcome=%s",
            decision_type, user_id, outcome,
        )
        return decision

    def request_human_review(self, decision_id: str, user_id: str, context: str = "") -> None:
        """Art.22(3): data subject requests human review of automated decision."""
        decision = self.db.get_decision(decision_id)
        if decision.user_id != user_id:
            raise PermissionError("Decision does not belong to this user")
        decision.human_review_requested = True
        self.db.update_decision(decision)
        self.review_queue.enqueue(
            decision_id=decision_id,
            user_id=user_id,
            context=context,
            deadline_days=30,
        )
        self.notify.send_review_acknowledgment(user_id, decision_id)
        logger.info("Human review requested for decision %s by user %s", decision_id, user_id)

    def complete_human_review(
        self,
        decision_id: str,
        reviewer_id: str,
        human_outcome: Any,
        rationale: str,
    ) -> None:
        """Record human review outcome (Art.22(3) safeguard)."""
        decision = self.db.get_decision(decision_id)
        decision.human_review_completed_at = datetime.now(timezone.utc)
        decision.human_reviewer_id = reviewer_id
        decision.human_outcome = human_outcome
        self.db.update_decision(decision)
        self.notify.send_review_outcome(
            user_id=decision.user_id,
            decision_id=decision_id,
            outcome=human_outcome,
            rationale=rationale,
        )
        logger.info(
            "Human review completed for decision %s by reviewer %s outcome=%s",
            decision_id, reviewer_id, human_outcome,
        )

    def _notify_data_subject(self, user_id: str, decision: AutomatedDecision) -> None:
        """Notify data subject of automated decision and their Art.22(3) rights."""
        if decision.exception_basis in (
            DecisionException.CONTRACT_NECESSITY,
            DecisionException.EXPLICIT_CONSENT,
        ):
            self.notify.send_automated_decision_notice(
                user_id=user_id,
                decision_id=decision.decision_id,
                decision_type=decision.decision_type,
                outcome=decision.outcome,
                rights_notice="You have the right to request human review, express your point of view, and contest this decision.",
            )

Art.21 × Art.22 Interaction

Art.21 and Art.22 interact in important ways:

Scenario 1: Objection to profiling that feeds automated decisions

When a data subject objects under Art.21(1) to profiling based on legitimate interests, and that profiling feeds an automated decision pipeline:

  1. Art.21(1) objection → immediate Art.18(1)(d) restriction
  2. Restriction suspends the profiling pipeline
  3. Automated decision pipeline that depends on the suspended profiling must also be suspended
  4. Even if the automated decision itself has an Art.22(2) exception, it cannot proceed on stale or suspended profile data
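One way to enforce step 3 is an explicit dependency map from decision pipelines to their profiling inputs, so restricting an input automatically pauses every consumer (names illustrative):

```python
# Sketch: when an Art.21(1) objection restricts a profiling input, every
# downstream automated-decision pipeline consuming it must pause too.
PIPELINE_DEPENDENCIES: dict[str, list[str]] = {
    # downstream decision pipeline -> upstream profiling inputs
    "churn_decision": ["behaviour_profile"],
    "fraud_block": ["transaction_features"],
}

def suspended_pipelines(restricted_inputs: set[str]) -> set[str]:
    return {
        downstream
        for downstream, inputs in PIPELINE_DEPENDENCIES.items()
        if any(i in restricted_inputs for i in inputs)
    }
```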

Scenario 2: Direct marketing profiling

When a data subject objects under Art.21(2) to direct marketing:

  1. The opt-out is absolute and applies immediately: no LIA, no rebuttal
  2. Profiling carried out for marketing purposes must also stop — Art.21(2) covers "profiling to the extent that it is related to such direct marketing"
  3. Any pipeline consuming marketing-derived profiles (contact-frequency scoring, upsell qualification) must stop consuming that user's data: the NL-AP-2026-01 failure mode

Scenario 3: Art.22(1) prohibition and consent

When relying on Art.22(2)(c) explicit consent for automated decisions, and the data subject later withdraws consent under Art.7(3):

  1. The Art.22(2)(c) exception falls away at withdrawal: further solely-automated significant decisions for that user are prohibited
  2. Withdrawal does not retroactively invalidate decisions already made, but going forward the pipeline must route the user to a human-involved path or stop
  3. Per Art.7(3), withdrawal must be as easy as giving consent, so the automated channel that collected the consent must also accept its withdrawal


Art.22 DPA Enforcement Cases (2025–2026)

DE-BfDI-2025-10 — €4.1M: Automated Credit Decisions Without Human Review

A German fintech used fully automated credit scoring to approve or reject loan applications. The controller cited Art.22(2)(a) (contract necessity) as the exception basis. BfDI found the system lacked any mechanism for applicants to request human review of rejected decisions. Art.22(3) requires all three safeguards when relying on Art.22(2)(a): human review mechanism, right to express point of view, and right to contest.

Engineering lesson: Art.22(2)(a) is not a free pass for automation. All three Art.22(3) safeguards are mandatory when using the contract-necessity exception.

IE-DPC-2026-02 — €6.3M: Automated Content Moderation Suspensions

An Irish DPC investigation found that a platform's automated content moderation system suspended accounts without any human review mechanism. The platform argued account suspension was not a "legal or similarly significant effect." The DPC disagreed: account suspension that cuts off access to a commercial service on which the user depends for income constitutes a similarly significant effect under Art.22(1).

Engineering lesson: "Significant effects" is not limited to financial products. Account suspensions, access revocations, and exclusions from services can qualify depending on the impact on the data subject.

FR-CNIL-2025-22 — €2.4M: Recruitment Screening Without Art.22 Disclosure

A French HR SaaS platform used algorithmic screening to eliminate CV applications before any human recruiter review. The controller had not disclosed the use of automated decision-making in its privacy notice and had no mechanism for candidates to learn they had been eliminated by an algorithm, request human review, or contest the outcome. CNIL found violations of Art.22(1) (no exception applied without consent or contract necessity) and Art.13(2)(f) (failure to disclose automated decision-making in privacy notice).

Engineering lesson: Art.22 compliance requires both technical safeguards and privacy notice disclosure. Art.13(2)(f) requires the privacy notice to mention "the existence of automated decision-making, including profiling" and provide "meaningful information about the logic involved."

NL-AP-2025-09 — €1.7M: Health Data in Automated Insurance Pricing

A Dutch insurer used health questionnaire data to feed an automated premium pricing model. The controller argued the processing was based on consent for the insurance contract. AP found Art.22(4) required both an Art.9(2)(a) explicit consent for health data processing and an Art.22(2)(c) explicit consent for the automated decision — presented as two separate consent actions. A single "I agree to the terms" checkbox satisfied neither.

Engineering lesson: Art.22(4) is a double lock. When special category data is involved, you need two separate explicit consents: one under Art.9(2)(a) for the data processing and one under Art.22(2)(c) for the automated decision.


Art.13(2)(f): Privacy Notice Obligation

Art.13 and Art.14 require that the privacy notice disclosed at data collection includes:

Information about the existence of automated decision-making, including profiling, referred to in Art.22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.

If you operate automated decision-making that meets the Art.22(1) threshold, your privacy notice must:

  1. State that automated decision-making exists
  2. Describe the logic (not necessarily the full model — but the general approach)
  3. Describe the significance of the decision and its consequences for the data subject
  4. State the data subject's Art.22(3) rights (human review, expression, contest)

Failure to include Art.22 disclosure in the privacy notice is independently actionable — CNIL-2025-22 above imposed a fine for Art.13(2)(f) violation separate from the Art.22 violation.
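A lightweight way to keep this auditable is a machine-checkable disclosure record per automated decision type; the field names below are illustrative, not a standard schema:

```python
# Sketch: validate that a privacy-notice record covers the four Art.13(2)(f)
# elements listed above. Field names are illustrative assumptions.
REQUIRED_FIELDS = {"exists", "logic_summary", "significance", "art22_3_rights"}

def validate_adm_disclosure(disclosure: dict) -> list[str]:
    """Return the Art.13(2)(f) fields missing or empty in a notice record."""
    present = {k for k, v in disclosure.items() if v}
    return sorted(REQUIRED_FIELDS - present)
```

A check like this can run in CI against the privacy-notice source whenever a new significant decision type is added to the gate.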


30-Item GDPR Art.21–22 Compliance Checklist

Art.21: Right to Object

Identification and Communication (Art.21(1–3)):

Processing Gates:

Documentation:

Art.22: Automated Decision-Making

Scope and Exception Basis:

Safeguards (Art.22(3)):

Disclosure and Audit:


Art.21–22 and sota.io

European-hosted infrastructure simplifies Art.21–22 compliance in one specific way: when processing is restricted under Art.18(1)(d) due to a pending Art.21(1) objection, there are no Art.15(2) international transfer disclosures to update. All processing remains within EU jurisdiction — restriction means restriction, without the added complexity of instructing non-EU sub-processors to also suspend processing.

For Art.22 automated decisions involving personal data, EU-native hosting means the data feeding the decision model never leaves EU jurisdiction. This simplifies the Art.32 security obligation and removes the need for Standard Contractual Clauses or Transfer Impact Assessments for the decision pipeline's data inputs.


Key Takeaways for Engineers

Art.21:

  1. The Art.21(2) marketing opt-out is absolute: propagate it to every list, score, and profiling pipeline that serves marketing, not just the email-send step
  2. An Art.21(1) objection triggers Art.18(1)(d) restriction immediately; the LIA window is not a grace period for continued processing
  3. Surface the objection right separately at first contact (Art.21(3)) and accept objections through automated channels (Art.21(5))

Art.22:

  1. A solely automated, significant decision needs an Art.22(2) exception, and exceptions (a) and (c) require all three Art.22(3) safeguards: human review, expression of point of view, contest
  2. Special category inputs add the Art.22(4) double lock: a separate Art.9(2) basis on top of the Art.22(2) exception
  3. Disclose automated decision-making in the privacy notice (Art.13(2)(f)); failure to disclose is independently actionable
