2026-04-18 · 18 min read

GDPR Art.9: Special Categories of Personal Data — Prohibition, Ten Exceptions & Explicit Consent (2026)

Post #440 in the sota.io EU Cyber Compliance Series

Art.9 operates as a second-layer gate that sits above Art.6. Where Art.6 asks "is there a lawful basis for this processing?", Art.9 asks a prior question: "is this data in a special category — and if so, does one of the ten narrow exceptions apply?" If the data falls within Art.9(1) and no Art.9(2) exception applies, the processing is unlawful regardless of any Art.6 basis. A controller cannot cure an Art.9 violation by pointing to legitimate interests under Art.6(1)(f) — the prohibition in Art.9(1) is categorical.

This guide is part of the GDPR Chapter I series: Art.1-4 Scope → Art.5 Principles → Art.6 Lawful Bases → Art.7 Consent Conditions → Art.9 Special Categories (this guide).


The Eight Special Categories

Art.9(1) lists eight categories that receive heightened protection:

| Category | Examples | Developer Trigger |
| --- | --- | --- |
| Racial or ethnic origin | Photo, name-inferred ethnicity, nationality combinations | User photos, facial analysis, diversity surveys |
| Political opinions | Party membership, voting intent, political donations | Forums, petition apps, polling tools |
| Religious or philosophical beliefs | Church membership, dietary preferences (halal/kosher), prayer apps | Scheduling, community apps, dietary tracking |
| Trade union membership | Union sign-up, dues records, strike participation | HR systems, payroll, workforce management |
| Genetic data | DNA sequences, hereditary disease markers, ancestry | Health apps, genomics platforms, family tree apps |
| Biometric data (for unique identification) | Fingerprint, facial recognition template, voice print, iris scan | Auth systems, attendance, access control |
| Health data | Medical records, prescriptions, disability status, fitness data, insurance claims | Health apps, HR sick leave, wearables, URLs with /cancer/ or /hiv/ |
| Sex life or sexual orientation | Relationship type, dating app profile, inferred orientation | Dating apps, household data, inferred from browsing |

Critical nuance on biometric data: Art.4(14) defines biometric data as data resulting from specific technical processing relating to physical, physiological or behavioural characteristics which allow or confirm the unique identification of a person. A photo is not automatically biometric data — but a facial recognition template derived from the photo is. Storing raw photos without running identification algorithms does not trigger Art.9; running facial recognition on those photos does.
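The photo-versus-template distinction can be reduced to a two-part test. The sketch below is a hypothetical decision helper (the class and field names are illustrative, not from any statute or library) showing that both conditions of Art.4(14) must hold before Art.9 is triggered:

```python
from dataclasses import dataclass

@dataclass
class ImageProcessing:
    # Hypothetical descriptor of a photo-handling operation
    derives_template: bool          # "specific technical processing" per Art.4(14)
    used_for_identification: bool   # purpose of uniquely identifying a person

def triggers_art9_biometric(op: ImageProcessing) -> bool:
    # Both conditions must hold: a template is derived AND the purpose
    # is the unique identification of a natural person.
    return op.derives_template and op.used_for_identification

# Raw photo storage, no recognition run: outside Art.9's biometric category
photo_archive = ImageProcessing(derives_template=False, used_for_identification=False)
# Facial recognition run on the same photos: Art.9 applies
face_search = ImageProcessing(derives_template=True, used_for_identification=True)
```

Note that the photos may still be ordinary personal data (or reveal ethnicity or health) on other grounds; this check covers only the biometric category.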


Art.9(1) — The Absolute Prohibition

Art.9(1) states the prohibition in absolute terms: processing of special category data "shall be prohibited." There is no proportionality balancing, no weighing of interests, no residual legitimate-interest pathway. If you process special category data without an Art.9(2) exception, the processing is unlawful — full stop.

The prohibition covers all Art.4(2) processing operations: collection, recording, organisation, storage, retrieval, consultation, use, disclosure, combination, restriction, and erasure.

Why the Prohibition Exists

Recital 51 explains that special categories warrant stronger protection because processing them may create significant risks to fundamental rights and freedoms. Health data can be used to discriminate in employment, insurance, and housing. Political opinion data enables surveillance and persecution. Genetic data reveals information not just about the data subject but about their entire biological family. Biometric data, once compromised, cannot be reset.

The CJEU's Expansive Interpretation

The CJEU consistently interprets Art.9(1) broadly:

C-184/20 Vyriausioji (2022): Lithuanian anti-corruption law required officials to publish declarations of private interests that named their spouse, cohabitant or partner. The Court held that this name-specific data fell within Art.9(1) because it was liable to reveal, indirectly, the data subject's sexual orientation. The key principle: data capable of revealing special category information is itself special category data, even if it does not directly state the protected characteristic.

C-252/21 Meta Platforms (2023): Facebook's social login feature allowed third-party websites to send data about user activity back to Facebook. This activity data, in combination, could reveal users' sexual orientation. The Court held this constituted processing of Art.9 data because the combination made the sexual orientation inference possible. Meta could not use legitimate interests (Art.6(1)(f)) to overcome the Art.9 prohibition.

The same "capable of revealing" logic extends to trade union membership: the scope is not limited to formal union membership registers — an email mentioning membership, a payroll deduction for union dues, or strike participation records all fall within Art.9(1).


Art.9(2) — The Ten Exceptions

Art.9(2) provides an exhaustive list of situations where the prohibition does not apply. These exceptions are not a general derogation — each has specific conditions that must be met:

| Art.9(2) | Exception | Key Condition | Most Common Use |
| --- | --- | --- | --- |
| (a) | Explicit consent | Explicit (not just unambiguous); specific; can be prohibited by Member State law | User-facing health apps, genetic testing |
| (b) | Employment / social security | Authorised by Union or Member State law; appropriate safeguards | HR systems, payroll, sick leave |
| (c) | Vital interests | Data subject physically/legally incapable of giving consent | Emergency medical systems |
| (d) | Foundation/association activities | Legitimate activities; only members/former members; no disclosure outside | Religious orgs, political parties, trade unions |
| (e) | Manifestly public data | Data subject has manifestly made it public themselves | Public political statements, published memoirs |
| (f) | Legal claims | Establishment, exercise, or defence of legal claims; judicial proceedings | Litigation support, fraud investigation |
| (g) | Substantial public interest | Union or Member State law; proportionate; essential safeguards | Government, fraud prevention, journalism |
| (h) | Preventive/occupational medicine | Bound by professional secrecy; Art.9(3) conditions | Occupational health, clinical research |
| (i) | Public health | Union or Member State law; Art.9(3) conditions; professional secrecy | Epidemiology, cross-border health threats |
| (j) | Research / statistics / archiving | Art.89(1) safeguards; proportionate; anonymisation where possible | Academic research, clinical trials, archives |

The most commonly invoked Art.9(2) exception for commercial applications is explicit consent. But "explicit" consent under Art.9(2)(a) is a higher threshold than the "unambiguous" consent under Art.6(1)(a) and Art.7.

What "Explicit" Means

The EDPB Guidelines 05/2020 on Consent clarify that explicit consent must involve:

  1. An express statement — a mere opt-in tick box for general terms is insufficient; the data subject must make an active, specific statement covering the special category processing
  2. Specificity to the special category — generic consent to "process my data" does not cover health data; the consent must name the category or describe it clearly
  3. Separate from other consents — the Art.9 explicit consent should be presented in a distinct interface element from other consent requests
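The three criteria above can be checked mechanically against a stored consent record. This is a minimal validator sketch — the dictionary keys are illustrative, not a prescribed schema:

```python
def is_valid_explicit_consent(record: dict) -> tuple[bool, list[str]]:
    """Check a consent record against the three EDPB criteria.
    The dict keys are illustrative, not a prescribed schema."""
    problems = []
    # 1. Express statement — an affirmative declaration, not a bare boolean
    if not record.get("explicit_statement", "").strip():
        problems.append("no express statement covering the processing")
    # 2. Specificity — the consent must name the special category
    if not record.get("special_category"):
        problems.append("consent does not name the Art.9 category")
    # 3. Separation — collected via a dedicated Art.9 consent element
    if not record.get("separate_art9_element", False):
        problems.append("not collected separately from other consents")
    return (len(problems) == 0, problems)

ok, issues = is_valid_explicit_consent({
    "explicit_statement": "I explicitly consent to processing of my health data",
    "special_category": "health_data",
    "separate_art9_element": True,
})
```

Running such a check at consent-capture time (and again during periodic audits) turns the EDPB criteria from a policy document into a CI-enforceable invariant.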

Member State Override

Art.9(2)(a) includes a significant caveat: explicit consent does not apply if Union or Member State law provides that the Art.9(1) prohibition cannot be lifted by the data subject's consent. Several Member States have used this power.

For B2C applications targeting consumers across all EU Member States, you need a legal assessment per country before relying on Art.9(2)(a).

Practical Implementation

When collecting health data (e.g., a fitness app, a dietary tracking SaaS, an occupational health portal):

from dataclasses import dataclass
from typing import Optional

@dataclass
class ExplicitConsentRecord:
    user_id: str
    timestamp_utc: str
    ip_address: str
    special_category: str               # e.g., "health_data"
    specific_purpose: str               # e.g., "personalise_workout_plans"
    consent_text_shown: str             # exact text shown to user
    consent_text_version: str           # version ID for the audit trail
    mechanism: str                      # e.g., "explicit_checkbox_art9"
    explicit_statement: str             # e.g., "I explicitly consent to health data processing"
    withdrawal_timestamp: Optional[str] = None   # None until consent is withdrawn

Note the explicit_statement field — this records the user's actual declaration, not just a boolean. The EDPB requires that explicit consent involve an affirmative statement that is specific to the special category processing.


Art.9(2)(b) — Employment Law: The Employer Exception

For HR systems and occupational tools, Art.9(2)(b) is the primary exception. It requires:

  1. The processing must be authorised by Union or Member State law — a general employment contract is not sufficient
  2. The authorising law must provide appropriate safeguards for the data subject's fundamental rights
  3. The purpose must relate to employment, social security, or social protection law

Member State implementations:

| Country | Key Law | Permitted HR Processing |
| --- | --- | --- |
| Germany | §26(3) BDSG | Sick leave records, occupational health (with occupational physician) |
| France | Labour Code L1222-2 | Health questionnaires for job suitability (strict conditions) |
| Netherlands | UAVG (GDPR Implementation Act) | Absenteeism management (employer cannot access diagnosis) |
| UK (post-Brexit) | DPA 2018 Schedule 1 Part 1 | Employment, pensions, social protection |

Critical limitation: Art.9(2)(b) permits processing the fact of illness for payroll/absence management, but not the diagnosis or medical details. An employer can record "employee absent from 2026-04-14 to 2026-04-18" under Art.9(2)(b), but cannot record "employee absent due to depression" unless there is specific statutory authority.
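The fact-not-diagnosis limitation is easiest to enforce in the data model itself. The sketch below (field names are illustrative) simply has no place to put a diagnosis, and adds a defensive check so medical detail cannot be smuggled into the free-text type label:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class AbsenceRecord:
    """Stores the fact and duration of an absence only.
    The point is what is deliberately absent: there is no
    diagnosis or medical-detail field at all."""
    employee_id: str
    start: date
    end: date
    absence_type: str  # e.g. "sickness" — the fact, never the cause

    def __post_init__(self):
        # Defensive check: reject medical detail smuggled into the type label
        forbidden = ("diagnos", "depress", "cancer", "icd-")
        if any(tok in self.absence_type.lower() for tok in forbidden):
            raise ValueError("medical details may not be stored under Art.9(2)(b)")

rec = AbsenceRecord("emp-17", date(2026, 4, 14), date(2026, 4, 18), "sickness")
```

A schema that cannot hold a diagnosis is a stronger safeguard than an access policy that merely forbids reading one.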


Art.9(2)(f) — Legal Claims

Art.9(2)(f) permits processing of special category data when necessary for "the establishment, exercise or defence of legal claims or whenever courts are acting in their judicial capacity."

For a SaaS or B2B product, the main uses are litigation support and fraud investigation — for example, preserving records for an employment dispute that happen to contain health or trade union data.

The key limitation is necessity: the processing must be necessary for the specific legal claim, not merely useful or convenient. If you could establish the same legal position without processing special category data, Art.9(2)(f) does not apply.


Health Data in SaaS: The URL Problem

One of the most overlooked Art.9 compliance traps for SaaS developers is health inference from URL paths and navigation data.

How URLs Become Health Data

Consider a general-purpose platform that hosts medical clinics as customers. A user visiting /clinic/dr-mayer-oncology/book-appointment has effectively disclosed that they are or may be a cancer patient. The URL itself is health data because it reveals a visit to an oncology service.

The same applies to on-site search queries, appointment-booking forms, and referrer URLs that name a condition or medical specialty.

Google Analytics enforcement (2022-2023): The Austrian DSB, French CNIL, and Italian Garante all found that using Google Analytics on healthcare-related websites constituted unlawful processing under both Art.9 and Art.44-49 (transfers to the US), because:

  1. The page URLs were health data (Art.9 violation)
  2. The data was transferred to Google/US without an adequate transfer mechanism (Art.44 violation)
  3. No Art.9(2) exception applied because Google Analytics processes health-inferred data for its own advertising purposes

Practical Safeguards for Platforms Serving Healthcare

If your platform serves healthcare clients or hosts any health-related content:

import re

SENSITIVE_PATH_PATTERNS = [
    r"/oncol", r"/cancer", r"/hiv", r"/aids", r"/mental.health",
    r"/psychiatr", r"/fertility", r"/ivf", r"/diabetes",
    r"/addiction", r"/rehab", r"/abortion", r"/contraception",
    r"/std", r"/sti", r"/therapy", r"/counsell",
]

def is_health_sensitive_url(url_path: str) -> bool:
    return any(re.search(p, url_path, re.IGNORECASE) for p in SENSITIVE_PATH_PATTERNS)

def should_exclude_from_analytics(request_path: str, referrer: str = "") -> bool:
    return is_health_sensitive_url(request_path) or is_health_sensitive_url(referrer)

Apply this filter before sending any data to third-party analytics tools. Health-sensitive URLs should be hashed, truncated, or excluded entirely from analytics pipelines.
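Where you want the page view counted but the path hidden, hashing is an alternative to outright exclusion. This sketch inlines a minimal stand-in for the pattern check above so it runs standalone; the salt value is a placeholder for a managed, rotated secret:

```python
import hashlib
import re

# Minimal stand-in for the SENSITIVE_PATH_PATTERNS check above, inlined so
# this snippet runs standalone; reuse is_health_sensitive_url in practice.
_SENSITIVE = re.compile(r"/oncol|/cancer|/hiv|/mental.health|/fertility", re.I)

def sanitize_path_for_analytics(url_path: str,
                                salt: bytes = b"replace-with-managed-secret") -> str:
    """Hash health-sensitive paths so the page view can still be counted
    without the path revealing a medical specialty; other paths pass through."""
    if _SENSITIVE.search(url_path):
        digest = hashlib.sha256(salt + url_path.encode()).hexdigest()[:16]
        return f"/redacted/{digest}"
    return url_path
```

Note that a salted hash here is pseudonymisation, not anonymisation — the analytics record is still personal data, but the health inference is removed from the third-party pipeline.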


Biometric Data: Facial Recognition and Fingerprint Auth

When Biometric Data Is "For Unique Identification"

Art.9 only applies to biometric data "processed for the purpose of uniquely identifying a natural person." The purpose test is decisive: the same technical data falls inside Art.9 when used to single out one individual, and outside it when no unique identification is attempted (see the photo/template distinction above).

The EDPB Guidelines 3/2019 on processing of personal data through video devices note that facial recognition in public spaces is high-risk processing that typically requires a DPIA under Art.35 and can only be lawful under limited exceptions (law enforcement-specific Member State law, Art.9(2)(g) for some public interest uses).

Biometric Auth in B2B SaaS

Fingerprint and Face ID authentication (e.g., Apple FaceID, Android Biometric) used in a SaaS mobile app is typically not Art.9 processing by the SaaS operator if:

  1. The biometric template is stored only on the user's device (in the Secure Enclave)
  2. The SaaS operator never receives or processes the biometric template
  3. The operator only receives a boolean "biometric match" confirmation from the OS

In this architecture, the biometric template never leaves the device's secure hardware and is handled under the OS vendor's (Apple/Google) own framework. The SaaS operator processes only a session token, not biometric data. Document this clearly in your ROPA and DPIA.
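That architectural claim is worth enforcing at the API edge: if the server rejects any payload containing biometric-looking fields, the "template stays on device" position documented in the ROPA cannot silently erode. This is a defensive sketch with an illustrative, non-exhaustive marker list:

```python
BIOMETRIC_FIELD_MARKERS = ("template", "embedding", "fingerprint",
                           "face_", "iris", "voice")

def assert_no_biometric_payload(payload: dict) -> None:
    """Reject any auth payload containing biometric-looking field names,
    enforcing the 'template never leaves the device' architecture at the
    API edge. Marker list is illustrative, not exhaustive."""
    for key in payload:
        k = key.lower()
        if any(marker in k for marker in BIOMETRIC_FIELD_MARKERS):
            raise ValueError(f"biometric-looking field rejected: {key}")

# Expected shape: only the OS-level match result plus a device reference
assert_no_biometric_payload({"biometric_match": True, "device_id": "d-42"})
```

A name-based filter is a tripwire, not a guarantee — pair it with schema validation that allow-lists exactly the fields the auth endpoint accepts.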

If your system stores biometric templates server-side for identification (facial recognition at access control, voice authentication over telephone), Art.9 applies and you need an Art.9(2) exception — typically Art.9(2)(a) explicit consent or Art.9(2)(b) employment law for workforce scenarios.


Art.9(2)(g) — Substantial Public Interest: Wide but Controlled

Art.9(2)(g) permits processing special category data for reasons of substantial public interest, but requires:

  1. Union or Member State law that specifically provides for this
  2. The processing must be proportionate to the aim pursued
  3. Essential safeguards must be in place
  4. The data subject's fundamental rights and interests must be respected

Member States have implemented Art.9(2)(g) broadly. UK Schedule 1 DPA 2018 lists over 20 specific public interest conditions. Germany's §22 BDSG lists specific permitted purposes including archiving of public interest, scientific research, and statistical purposes.

For developers building tools for public sector clients or regulated industries, always check the specific Member State implementing law — Art.9(2)(g) is not self-executing.


Art.9(2)(j) — Research, Statistics and Archiving

For research, statistical, and archiving purposes, Art.9(2)(j) operates in conjunction with Art.89. The Art.89 conditions require:

  1. Pseudonymisation as a default measure (where the research purpose can be fulfilled with pseudonymised data)
  2. Data minimisation — only the minimum special category data needed for the research purpose
  3. Technical and organisational measures that ensure the data is not used for decisions about individuals without their consent
  4. Anonymisation at the earliest possible point in the research lifecycle

Key safeguard: Art.9(2)(j) does not permit bypassing Art.6 — you still need an Art.6 basis for the general processing, typically Art.6(1)(e) (public task) or Art.6(1)(f) (legitimate interests) for private research entities. Art.9(2)(j) only addresses the special category layer.
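A common way to implement the pseudonymisation safeguard is a keyed hash of the subject identifier, with the key held by a data custodian outside the research environment. This is a minimal sketch; the key value and record fields are placeholders:

```python
import hashlib
import hmac

def pseudonymise_id(subject_id: str, key: bytes) -> str:
    """Keyed, deterministic pseudonym: the same subject always maps to the
    same token (so records stay linkable within the study), while
    re-identification requires the key, which is held by a custodian
    outside the research environment."""
    return hmac.new(key, subject_id.encode(), hashlib.sha256).hexdigest()

custodian_key = b"placeholder-secret-held-by-data-custodian"
record = {"subject": pseudonymise_id("patient-001", custodian_key), "hba1c": 6.1}
```

An HMAC rather than a plain hash matters: without the key, an attacker cannot rebuild the mapping by hashing a list of known identifiers.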


Python SpecialCategoryChecker

A practical tool for developers to audit whether their data models contain special category data fields:

import re
from dataclasses import dataclass, field

SPECIAL_CATEGORY_PATTERNS = {
    "health": [
        r"health", r"medical", r"diagnosis", r"disease", r"illness",
        r"prescription", r"medication", r"disability", r"sick", r"bmi",
        r"weight", r"blood_type", r"allerg", r"symptom", r"condition",
        r"mental", r"therapy", r"psycholog", r"psychiatr",
    ],
    "genetic": [
        r"dna", r"genetic", r"genome", r"hereditar", r"ancestry",
        r"chromosome", r"allele", r"snp",
    ],
    "biometric": [
        r"fingerprint", r"face_id", r"facial", r"biometric", r"iris",
        r"voice_print", r"retina", r"palm_print", r"gait",
        r"face_template", r"embedding", r"feature_vector",
    ],
    "racial_ethnic": [
        r"race", r"ethnic", r"nationality_inferred", r"skin_color",
        r"country_of_origin",
    ],
    "political": [
        r"political", r"party_member", r"voting", r"political_opinion",
        r"ideology",
    ],
    "religious": [
        r"religion", r"faith", r"belief", r"church", r"mosque",
        r"synagogue", r"denomination", r"dietary_halal", r"dietary_kosher",
    ],
    "trade_union": [
        r"union_member", r"trade_union", r"union_dues", r"union_id",
        r"collective_bargaining",
    ],
    "sex_life": [
        r"sexual_orientation", r"sex_life", r"partner_gender",
        r"relationship_type", r"dating", r"sexuality",
    ],
}

@dataclass
class FieldAuditResult:
    field_name: str
    category_matches: list[str] = field(default_factory=list)
    risk_level: str = "none"  # none | low | medium | high
    recommendation: str = ""

def check_field_name(field_name: str) -> FieldAuditResult:
    result = FieldAuditResult(field_name=field_name)
    lower = field_name.lower()
    for category, patterns in SPECIAL_CATEGORY_PATTERNS.items():
        if any(re.search(p, lower) for p in patterns):
            result.category_matches.append(category)
    if result.category_matches:
        result.risk_level = "high"
        cats = ", ".join(result.category_matches)
        result.recommendation = (
            f"Field '{field_name}' matches Art.9 category: {cats}. "
            f"Verify Art.9(2) exception applies. Document in ROPA. "
            f"Consider pseudonymisation or separation into high-security store."
        )
    return result

def audit_schema(field_names: list[str]) -> list[FieldAuditResult]:
    return [r for r in (check_field_name(f) for f in field_names) if r.category_matches]

# Example usage
schema_fields = [
    "user_id", "email", "created_at",
    "health_condition", "blood_type", "disability_status",
    "political_party", "union_membership",
    "face_embedding", "fingerprint_template",
    "sexual_orientation", "religion",
]

flagged = audit_schema(schema_fields)
for r in flagged:
    print(f"⚠ {r.field_name}: {r.category_matches}")
    print(f"  → {r.recommendation}\n")

Run this against your database schema, ORM models, and API request/response types at design time to surface Art.9 exposure early in the development cycle.


Art.9 Compliance Checklist

Data Mapping — Identify Special Categories
  ☐ All database tables audited for Art.9 fields (use SpecialCategoryChecker)
  ☐ API endpoints reviewed: does any request/response contain Art.9 data?
  ☐ Analytics pipelines: do URLs or events reveal health/biometric/political data?
  ☐ Logs and monitoring: are special category fields excluded from log aggregators?
  ☐ Third-party integrations: do any processors receive Art.9 data?

Art.9(2) Exception Documentation
  ☐ For each Art.9 data type: which Art.9(2) exception applies? Documented in ROPA.
  ☐ If Art.9(2)(a) explicit consent: separate, specific, Art.9-labelled consent mechanism
  ☐ If Art.9(2)(b) employment: specific Member State law identified; safeguards documented
  ☐ If Art.9(2)(g) public interest: specific Union/Member State law identified; proportionality assessed
  ☐ If Art.9(2)(j) research: Art.89 safeguards (pseudonymisation, minimisation) in place
  ☐ No use of Art.6(1)(f) legitimate interests to override Art.9 — not permitted

Biometric Data
  ☐ If using facial recognition/fingerprint: template stored server-side? → Art.9(2) exception required
  ☐ If using device biometrics (FaceID/Android Biometric): biometric template stays on device? → Art.9 does not apply to SaaS operator
  ☐ Biometric data stored in isolated, access-controlled store
  ☐ Biometric templates not stored longer than necessary for identification purpose

Health Data and SaaS Platforms
  ☐ URL path analytics: health-sensitive paths excluded from third-party analytics
  ☐ Search queries: health-related searches not sent to external analytics
  ☐ Referrer data: health-sensitive referrers stripped before analytics transmission
  ☐ Multi-tenant platforms: tenant namespace isolation prevents cross-tenant inference

Technical Safeguards
  ☐ Special category fields in separate, more-restricted database schema/store
  ☐ Access controls: only personnel with explicit need can access Art.9 fields
  ☐ Encryption at rest and in transit for all Art.9 data (stronger key management)
  ☐ Pseudonymisation applied where research/analytics use case allows
  ☐ Data retention policy stricter for Art.9 data than general personal data

Documentation
  ☐ ROPA includes Art.9 processing operations with Art.9(2) exception citation
  ☐ DPIA conducted for high-risk Art.9 processing (biometric at scale, health analytics, facial recognition)
  ☐ Privacy notice discloses Art.9 processing categories to data subjects (Art.13/14)
  ☐ DPO consulted for novel Art.9 use cases (Art.37-39)

Member State Additions to Art.9

Art.9(4) permits Member States to maintain or introduce further conditions, including limitations, with regard to the processing of genetic, biometric, or health data. Notable examples:

| Country | Art.9(4) Implementation | Effect |
| --- | --- | --- |
| Germany | §26(3) BDSG | Employment Art.9 processing requires collective agreement or individual consent where the statute is silent |
| Austria | §39 DSG | Sensitive data processing for direct marketing: prohibited |
| Poland | Ustawa o ochronie danych | Stricter safeguards for genetic data in insurance |
| France | Art.9 RGPD + CNIL guidance | Workplace biometric authentication must comply with CNIL's model regulation ("règlement type") |
| Italy | Provvedimento generale biometrico | Biometric workplace authentication requires Garante authorisation |

For any deployment involving Art.9 data across EU Member States, a country-by-country legal review is required to identify Art.9(4) restrictions on top of the base Art.9 framework.


What Comes Next in the GDPR Series

Art.9 completes the lawful processing foundation established by Art.6 and Art.7. The downstream articles build on it:

The full series index is at gdpr-compliance-developer-guide-sota-io.


Art.9 defines the boundary between ordinary personal data and data so sensitive that its misuse can determine whether someone is hired, insured, or prosecuted. The prohibition is absolute — the ten exceptions are the entire legal universe in which this data may move. Host with sota.io to process EU user data on EU infrastructure under a single, auditable jurisdiction from day one.