2026-04-16 · 12 min read

EDPB-EDPS Joint Opinion 1/2026 on the EU AI Act Digital Omnibus: What Data Protection Authorities Demand — Developer Guide (2026)

The European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) do not often publish joint opinions. When they do, it signals that both bodies consider the legislative changes under review to carry serious implications for fundamental rights — implications significant enough to require a unified institutional response.

In February 2026, the EDPB and EDPS published Joint Opinion 1/2026 specifically addressing the proposed AI Act amendments within the Digital Omnibus package. Their message: they broadly support the simplification agenda, but three specific AI Act modifications raise fundamental rights concerns that must be addressed before the amendments are finalised.

For developers, this Joint Opinion matters in two ways. First, it identifies which AI Act compliance obligations the data protection authorities consider non-negotiable — regardless of what Omnibus simplification proposes. Second, it signals which areas of your compliance programme are most likely to be strengthened, not weakened, in the final Trilogue outcome.

If the EDPB and EDPS recommendations are incorporated into the final Digital Omnibus text, the amendments that initially looked like compliance relief may arrive with new conditions attached.


What the Digital Omnibus Proposes for the AI Act

The Digital Omnibus is the European Commission's regulatory simplification package announced in early 2025. For AI Act purposes, the relevant proposals are:

  1. Extending the Annex III high-risk compliance deadline from 2 August 2026 to December 2027
  2. Reducing documentation obligations for SMEs
  3. Modifying the scope and qualifying conditions of the Art.10(5) bias detection exception
  4. Reducing the Art.51 registration obligation, potentially removing it for certain Annex III categories
  5. Adjusting the Art.57 AI regulatory sandbox provisions

The first two proposals (deadline extension, SME reduction) are primarily administrative changes. The Joint Opinion does not oppose them. The final three proposals are the focus of Joint Opinion 1/2026.


The Three Core EDPB-EDPS Demands

Demand 1: Art.10(5) — Restrict Sensitive Data for Bias Detection

What Art.10(5) currently says:

Article 10(5) creates a narrow exception to the general prohibition on processing special categories of personal data (racial or ethnic origin, health data, political opinions, biometric data, etc.) under GDPR. Specifically, it allows providers of high-risk AI systems to process these categories to the extent strictly necessary for monitoring, detecting, preventing or addressing biases in the AI system — provided that appropriate safeguards are in place, including technical limitations on re-use, access controls, and notification to supervisory authorities.

This provision recognises a practical reality: you cannot detect demographic bias in a hiring AI system without knowing the demographic attributes of the people it evaluated. Art.10(5) permits that processing — but only for bias detection purposes.
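To make the practical reality concrete: the sketch below computes per-group selection rates and a disparate-impact ratio for a hiring system's outputs. The group labels are exactly the special-category data Art.10(5) permits processing for this purpose; without them, the disparity is invisible. All names and the 0.8 threshold (a common rule-of-thumb from US employment practice) are illustrative, not drawn from the Joint Opinion.

```python
from collections import defaultdict

def selection_rates(outcomes: list[tuple[str, bool]]) -> dict[str, float]:
    """Selection rate per demographic group: selected / evaluated."""
    evaluated: dict[str, int] = defaultdict(int)
    selected: dict[str, int] = defaultdict(int)
    for group, was_selected in outcomes:
        evaluated[group] += 1
        if was_selected:
            selected[group] += 1
    return {g: selected[g] / evaluated[g] for g in evaluated}

def disparate_impact_ratio(rates: dict[str, float]) -> float:
    """Lowest group rate divided by highest; below 0.8 is a common red flag."""
    return min(rates.values()) / max(rates.values())

# Illustrative outcomes: (group, was_selected)
outcomes = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]
rates = selection_rates(outcomes)
print(rates)                          # group_a: 0.75, group_b: 0.25
print(disparate_impact_ratio(rates))  # 0.333..., well below the 0.8 rule-of-thumb
```

Note that removing the group column makes both functions impossible to run: this is the dependency Art.10(5) exists to accommodate, and only for this purpose.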

What the Digital Omnibus proposes:

The Omnibus amendments proposed modifications to Art.10(5)'s scope, including adjustments to the qualifying conditions and potentially broadening which entities can invoke the exception. The Commission's intent was to reduce compliance friction for operators deploying bias testing pipelines.

What the EDPB-EDPS demand:

The Joint Opinion opposes any broadening of Art.10(5). Their position: the bias detection exception should be explicitly restricted to circumscribed situations with a strict necessity test that prevents scope creep. Processing special categories for bias detection is appropriate only when:

  1. The bias targeted is specific and documented (not general "fairness" audits)
  2. The processing is limited to what is necessary for that specific bias detection objective
  3. Appropriate safeguards under GDPR Art.9(2)(g) are maintained
  4. The results are not used for any purpose other than bias remediation

What this means for developers:

If the EDPB-EDPS position prevails in Trilogue, Art.10(5) will not be broadened — it may be narrowed. Developers building bias testing pipelines must:

Specific bias targeting: document which protected attribute and which AI system output you are testing before processing begins.
Necessity scoping: process only the minimum data volume required, with no bulk demographic enrichment.
Purpose limitation: bias detection data cannot become training data, model monitoring data, or product analytics.
Supervisory notification: maintain records suitable for DPA notification or audit under GDPR Art.30.
Safeguard documentation: pseudonymisation, access controls, and retention limits documented and enforceable.

The compliance risk is using Art.10(5) as a general licence to process protected attributes. It is not. It is a narrow exception for documented bias testing with documented safeguards.
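In pipeline terms, necessity scoping and purpose limitation mean dropping every field except the attribute and outcome under test, and severing the link to identifiable records before processing starts. A minimal sketch, with illustrative field names (`candidate_id`, `ranked_top`) rather than any real schema:

```python
import hashlib

def scope_for_bias_test(records: list[dict], protected_attribute: str,
                        outcome_field: str, salt: str) -> list[dict]:
    """Keep only the fields strictly necessary for one documented bias test,
    replacing direct identifiers with salted pseudonyms as a GDPR-style safeguard."""
    scoped = []
    for rec in records:
        scoped.append({
            # Salted hash so the bias dataset cannot be trivially re-linked
            "pseudonym": hashlib.sha256(
                (salt + rec["candidate_id"]).encode()).hexdigest()[:16],
            protected_attribute: rec[protected_attribute],
            outcome_field: rec[outcome_field],
            # Everything else (names, CVs, contact data) is deliberately dropped
        })
    return scoped

raw = [{"candidate_id": "c-1", "name": "redacted",
        "ethnic_origin": "group_a", "ranked_top": True}]
print(scope_for_bias_test(raw, "ethnic_origin", "ranked_top", salt="rotate-me"))
```

The salt should be held by the bias team alone and rotated per test, so the scoped dataset cannot serve any purpose beyond the documented one.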


Demand 2: Art.51 — Preserve the High-Risk AI System Registration Database

What Art.51 currently requires:

Article 51 establishes an EU-level database of high-risk AI systems — operated by the Commission — where providers of Annex III high-risk AI systems must register their systems before placing them on the market. The registration creates a public record linking the AI system to its conformity assessment, technical documentation, and responsible provider.

The database serves two functions: market surveillance (authorities can identify which high-risk AI systems are in circulation and who is responsible) and transparency (affected persons and researchers can identify which AI systems are deployed in high-risk contexts).

What the Digital Omnibus proposes:

The Omnibus proposed reducing the registration obligation, including the possibility of removing the obligation for certain Annex III categories on the grounds that registration creates disproportionate administrative burden with limited enforcement benefit.

What the EDPB-EDPS demand:

The Joint Opinion explicitly opposes deletion of the registration obligation. Their position: the Art.51 database is a foundational accountability mechanism. Without it:

  1. Market surveillance authorities cannot identify which high-risk AI systems are in circulation, or who is responsible for them
  2. Affected persons and researchers cannot identify which AI systems are deployed in high-risk contexts

The EDPB and EDPS go further: they recommend that the database should include information about the DPA notified under Art.10(5) bias detection processing, creating a link between the conformity assessment record and the supervisory authority tracking bias-related data processing.

What this means for developers:

If the EDPB-EDPS position prevails, registration remains mandatory. Developers who were anticipating relief from the registration obligation in the Omnibus should plan for registration to remain in place.

More practically: the registration obligation applies to Annex III high-risk AI systems. If the Omnibus deadline extension passes (moving Annex III compliance from August 2026 to December 2027), registration obligations shift with it. But the obligation itself does not disappear.

Registration requires:

Provider name and contact information (Art.51(1)(a))
AI system name, version, and purpose (Art.51(1)(b))
Intended purpose and limitations (Art.51(1)(c))
Conformity assessment procedure used (Art.51(1)(d))
Notified body identification, if applicable (Art.51(1)(e))
Member States where the system is placed on the market (Art.51(1)(f))
Declaration of Conformity reference (Art.51(1)(g))
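
A draft registration payload can be checked against those elements before submission. A sketch, where the field keys are my own naming convention, not the EU database's actual API schema:

```python
# Art.51(1)(a)-(g) mandatory elements, keyed by illustrative field names
REQUIRED_ELEMENTS = {
    "provider_name_and_contact": "Art.51(1)(a)",
    "system_name_version_purpose": "Art.51(1)(b)",
    "intended_purpose_and_limitations": "Art.51(1)(c)",
    "conformity_assessment_procedure": "Art.51(1)(d)",
    "member_states": "Art.51(1)(f)",
    "declaration_of_conformity_reference": "Art.51(1)(g)",
}
# Art.51(1)(e) notified body ID is conditional: required only if one was involved

def missing_registration_elements(payload: dict) -> list[str]:
    """Return the article references for any absent or empty mandatory element."""
    return [ref for key, ref in REQUIRED_ELEMENTS.items() if not payload.get(key)]

draft = {
    "provider_name_and_contact": "Acme AI GmbH <compliance@acme.ai>",
    "system_name_version_purpose": "HireScreen Pro 2.0.1, CV screening",
}
print(missing_registration_elements(draft))
# lists the Art.51(1)(c), (d), (f), (g) elements still to fill
```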

Demand 3: Art.57 — Require DPA Involvement in AI Regulatory Sandboxes

What Art.57 currently says:

Article 57 requires every Member State to establish at least one AI regulatory sandbox by 2 August 2026. Sandboxes allow providers to develop, train, test, and validate AI systems in a controlled environment under direct supervision of competent authorities — with reduced regulatory friction in exchange for transparency and authority oversight.

The current text leaves DPA involvement in sandbox supervision to national discretion. National competent authorities run the sandbox; DPAs may be involved when the sandbox includes personal data processing, but this is not mandatory.

What the Digital Omnibus proposes:

The Omnibus did not substantially change the DPA involvement provisions, but neither did it mandate DPA involvement. The EDPB and EDPS consider this optional status insufficient.

What the EDPB-EDPS demand:

The Joint Opinion recommends making DPA involvement mandatory when AI system development in the sandbox involves processing personal data. Their reasoning: sandbox participants receive regulatory flexibility — including potential deviation from standard GDPR requirements for testing purposes under Art.60. If DPAs are not involved in supervision from the outset, the legal basis for that flexibility is not properly established, and any personal data processing in the sandbox carries uncertain GDPR status.

The specific EDPB-EDPS recommendation: when a sandbox participant's AI system involves personal data processing, the relevant DPA must be formally included in the sandbox supervisory structure — not consulted, but included as a supervisory co-authority.

What this means for developers:

If you are planning to use an AI regulatory sandbox — or are building in a Member State likely to establish one before August 2026 — factor in DPA involvement from the application stage. This means:

  1. DPIA before sandbox entry: Conduct a GDPR Art.35 Data Protection Impact Assessment before submitting your sandbox application when personal data will be processed
  2. DPA notification in the application: Identify the relevant DPA and describe how they will be kept informed or involved
  3. EDPB guidance on sandbox data processing: The EDPB published separate guidance on AI sandbox personal data processing in 2025 — align your sandbox protocol with that guidance
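
Those steps can be wired into a pre-submission gate, so an application never goes out with personal data processing but no DPIA and no named DPA. A sketch under the assumption that the EDPB-EDPS recommendation is adopted; the parameter names are illustrative:

```python
def sandbox_application_ready(processes_personal_data: bool,
                              dpia_completed: bool,
                              dpa_identified: bool) -> tuple[bool, list[str]]:
    """Gate an Art.57 sandbox application on the DPA-involvement preconditions.

    Returns (ready, blocking_issues). If no personal data is processed,
    the DPA preconditions do not apply.
    """
    issues = []
    if processes_personal_data:
        if not dpia_completed:
            issues.append("Complete a GDPR Art.35 DPIA before applying")
        if not dpa_identified:
            issues.append("Identify the competent DPA in the application")
    return (not issues, issues)

ready, issues = sandbox_application_ready(
    processes_personal_data=True, dpia_completed=True, dpa_identified=False)
print(ready, issues)  # not ready: the DPA is still unidentified
```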

Only Spain's AESIA sandbox is currently operational. Other Member States are building capacity ahead of the August 2026 mandatory deadline. The EDPB recommendation, if adopted, shapes the governance structure every new sandbox must implement.


Joint Opinion 2/2026: The Broader Digital Omnibus Package

Alongside Joint Opinion 1/2026 (AI Act-specific), the EDPB and EDPS also published Joint Opinion 2/2026 addressing the Digital Omnibus provisions affecting the Digital Services Act (DSA) and Digital Markets Act (DMA).

For AI developers, the relevant Joint Opinion 2/2026 elements are:

DSA Art.40 Researcher Data Access: The Digital Omnibus proposed simplifications to the mechanism allowing vetted researchers to access platform data under Art.40 DSA. The EDPB-EDPS position: simplification should not weaken the privacy safeguards around researcher data access, particularly for data sets that include personal data about minors or sensitive categories.

DMA Data Portability: The DMA's data interoperability and portability provisions (Art.6(9), Art.6(10)) interact with GDPR data portability rights. The Joint Opinion 2/2026 recommends that DMA data sharing requirements explicitly incorporate GDPR compatibility as a condition — meaning data sharing mandated under DMA must still comply with GDPR Art.6 lawfulness requirements.

For AI developers using platform data for training or inference: if you receive data through DMA portability mandates, that data carries GDPR obligations — the DMA does not create a separate lawful basis for AI training.
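One way to make that constraint operational is to attach provenance and lawful-basis metadata to any DMA-ported dataset at ingestion, and refuse downstream uses that lack a basis of their own. A sketch; the metadata fields are my own convention, not a DMA or GDPR schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class PortedDataset:
    source_gatekeeper: str
    received_under: str                 # e.g. "DMA Art.6(9) portability"
    gdpr_lawful_basis: Optional[str]    # basis for the intended processing, or None

def may_use_for(dataset: PortedDataset, purpose: str) -> bool:
    """DMA portability is a transfer mechanism, not a lawful basis:
    every downstream purpose still needs its own GDPR Art.6 basis."""
    # No purpose is basis-free; the purpose argument exists so callers
    # must name the processing they are about to perform.
    return dataset.gdpr_lawful_basis is not None

ported = PortedDataset("ExamplePlatform", "DMA Art.6(9) portability",
                       gdpr_lawful_basis=None)
print(may_use_for(ported, "model_training"))  # False: no Art.6 basis attached
```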


Trilogue Status and What to Plan For

As of April 2026, the Digital Omnibus is in Trilogue — the three-way negotiation between the European Parliament, Council, and Commission. The deadline pressure is real: the current Annex III compliance date is 2 August 2026, approximately 16 weeks away. If Trilogue does not conclude before then, the existing text applies.

Three scenarios for your compliance planning:

  1. Omnibus passes with the EDPB recommendations (most likely outcome if Trilogue succeeds): Art.10(5) narrowed, registration retained, sandbox DPA involvement required.
  2. Omnibus passes without the EDPB recommendations (possible if political pressure overrides the DPA positions): deadline extension passes, bias testing exception broadened.
  3. Omnibus does not pass before August 2026 (possible given the Trilogue timeline): the current text applies and the August 2026 deadline stands.

The EDPB-EDPS Joint Opinion creates political cover for Parliament to demand stronger safeguards in exchange for accepting the Commission's simplification proposals. This is the standard Trilogue dynamic: one institution accepts something it dislikes if another institution strengthens something it cares about.

The practical result: even if the Omnibus passes with deadline extensions, it is likely to come with at least some of the EDPB-EDPS conditions attached. Plan for a deadline-extended but safeguard-intact outcome as the most probable scenario.


Python Example: DPAComplianceTracker

The three demands above can be tracked as structured records. The sketch below models Art.10(5) bias detection processing, Art.51 registrations, and Art.57 sandbox applications against the Joint Opinion's tests.

from dataclasses import dataclass, field
from enum import Enum
from typing import Optional
import datetime


class BiasDetectionScope(Enum):
    """Art.10(5) bias detection processing scope classification."""
    SPECIFIC_PROTECTED_ATTRIBUTE = "specific_protected_attribute"  # PERMITTED: documented attribute
    GENERAL_FAIRNESS_AUDIT = "general_fairness_audit"              # RISKY: too broad
    DEMOGRAPHIC_ENRICHMENT = "demographic_enrichment"              # PROHIBITED: not bias detection


class SandboxDPAStatus(Enum):
    """DPA involvement status in AI regulatory sandbox."""
    NOT_APPLICABLE = "not_applicable"           # No personal data processing
    DPA_INCLUDED = "dpa_included"               # DPA formally in supervisory structure
    DPA_NOTIFIED = "dpa_notified"               # DPA notified but not supervisor
    DPA_NOT_ENGAGED = "dpa_not_engaged"         # RISKY under EDPB Joint Opinion 1/2026


@dataclass
class BiasDetectionRecord:
    """
    Art.10(5) bias detection processing record.
    Required under EDPB Joint Opinion 1/2026 circumscribed-situation requirement.
    """
    ai_system_id: str
    protected_attribute: str            # e.g., "age", "ethnic_origin", "gender"
    bias_targeted: str                  # specific bias being tested
    legal_basis: str                    # Art.9(2) GDPR exception invoked
    processing_scope: BiasDetectionScope
    necessity_justification: str        # why this attribute is necessary
    safeguards: list[str]               # pseudonymisation, access controls, retention
    purpose_limitation_confirmed: bool  # data not reused for training/analytics
    dpa_notification_reference: Optional[str] = None  # Art.30 record reference
    created_date: str = field(default_factory=lambda: datetime.date.today().isoformat())

    def is_art10_5_compliant(self) -> bool:
        """Check if processing meets EDPB Joint Opinion 1/2026 circumscribed-situation test."""
        return (
            self.processing_scope == BiasDetectionScope.SPECIFIC_PROTECTED_ATTRIBUTE
            and self.purpose_limitation_confirmed
            and len(self.safeguards) >= 3
            and bool(self.necessity_justification)
        )

    def compliance_gap(self) -> list[str]:
        gaps = []
        if self.processing_scope != BiasDetectionScope.SPECIFIC_PROTECTED_ATTRIBUTE:
            gaps.append(f"Scope too broad: {self.processing_scope.value}. Must target specific attribute.")
        if not self.purpose_limitation_confirmed:
            gaps.append("Purpose limitation not confirmed: bias data may not be reused.")
        if len(self.safeguards) < 3:
            gaps.append(f"Insufficient safeguards: {len(self.safeguards)} documented, minimum 3 required.")
        if not self.necessity_justification:
            gaps.append("Necessity justification missing: required for circumscribed-situation test.")
        return gaps


@dataclass
class HighRiskRegistrationRecord:
    """
    Art.51 EU database registration record.
    Preserve under EDPB Joint Opinion 1/2026 — registration obligation must be retained.
    """
    provider_name: str
    provider_contact: str
    system_name: str
    system_version: str
    intended_purpose: str
    annex_iii_category: str             # e.g., "4(a) employment", "5(b) education"
    conformity_assessment_procedure: str
    notified_body_id: Optional[str]
    member_states: list[str]
    doc_of_conformity_reference: str
    bias_detection_dpa_notified: Optional[str] = None  # EDPB recommendation: link to DPA

    def is_registration_complete(self) -> bool:
        """Check all Art.51 mandatory fields are populated."""
        required = [
            self.provider_name, self.provider_contact, self.system_name,
            self.system_version, self.intended_purpose, self.annex_iii_category,
            self.conformity_assessment_procedure, self.doc_of_conformity_reference
        ]
        return all(bool(f) for f in required) and bool(self.member_states)


@dataclass
class SandboxApplication:
    """
    Art.57 AI regulatory sandbox application.
    DPA involvement required under EDPB Joint Opinion 1/2026 when personal data processed.
    """
    applicant: str
    system_description: str
    processing_personal_data: bool
    dpa_status: SandboxDPAStatus
    dpia_completed: bool
    dpa_reference: Optional[str] = None  # DPA supervisory inclusion reference

    def is_edpb_compliant(self) -> bool:
        """Check if sandbox application meets Joint Opinion 1/2026 DPA requirement."""
        if not self.processing_personal_data:
            return True  # No personal data = DPA not required
        return (
            self.dpa_status == SandboxDPAStatus.DPA_INCLUDED
            and self.dpia_completed
        )

    def compliance_gap(self) -> list[str]:
        gaps = []
        if self.processing_personal_data:
            if self.dpa_status != SandboxDPAStatus.DPA_INCLUDED:
                gaps.append(
                    f"DPA not included in supervisory structure (current: {self.dpa_status.value}). "
                    "EDPB Joint Opinion 1/2026 requires DPA inclusion when personal data processed."
                )
            if not self.dpia_completed:
                gaps.append("DPIA not completed before sandbox entry. Required under GDPR Art.35.")
        return gaps


class DPAComplianceTracker:
    """
    Tracks EDPB Joint Opinion 1/2026 alignment across three dimensions:
    Art.10(5) bias detection, Art.51 registration, Art.57 sandbox.
    """

    def __init__(self):
        self.bias_records: list[BiasDetectionRecord] = []
        self.registrations: list[HighRiskRegistrationRecord] = []
        self.sandbox_applications: list[SandboxApplication] = []

    def add_bias_detection_record(self, record: BiasDetectionRecord) -> None:
        self.bias_records.append(record)

    def add_registration(self, record: HighRiskRegistrationRecord) -> None:
        self.registrations.append(record)

    def add_sandbox_application(self, app: SandboxApplication) -> None:
        self.sandbox_applications.append(app)

    def assess_compliance(self) -> dict:
        """Generate EDPB Joint Opinion 1/2026 compliance assessment."""
        bias_gaps = []
        for r in self.bias_records:
            if not r.is_art10_5_compliant():
                bias_gaps.extend([f"[{r.ai_system_id}] {g}" for g in r.compliance_gap()])

        reg_gaps = []
        for r in self.registrations:
            if not r.is_registration_complete():
                reg_gaps.append(f"[{r.system_name}] Registration incomplete")

        sandbox_gaps = []
        for a in self.sandbox_applications:
            if not a.is_edpb_compliant():
                sandbox_gaps.extend([f"[{a.applicant}] {g}" for g in a.compliance_gap()])

        total_gaps = len(bias_gaps) + len(reg_gaps) + len(sandbox_gaps)
        return {
            "overall_status": "COMPLIANT" if total_gaps == 0 else "GAPS_FOUND",
            "total_gaps": total_gaps,
            "art10_5_bias_gaps": bias_gaps,
            "art51_registration_gaps": reg_gaps,
            "art57_sandbox_gaps": sandbox_gaps,
            "assessed_date": datetime.date.today().isoformat(),
        }


# Example usage
if __name__ == "__main__":
    tracker = DPAComplianceTracker()

    # Art.10(5): bias detection in hiring AI
    tracker.add_bias_detection_record(BiasDetectionRecord(
        ai_system_id="hiring-screen-v2",
        protected_attribute="ethnic_origin",
        bias_targeted="proxy discrimination in CV ranking via name origin",
        legal_basis="GDPR Art.9(2)(g) substantial public interest + AI Act Art.10(5)",
        processing_scope=BiasDetectionScope.SPECIFIC_PROTECTED_ATTRIBUTE,
        necessity_justification="Name-origin proxy bias cannot be detected without ethnic_origin ground truth",
        safeguards=["pseudonymisation before processing", "access restricted to bias team", "30-day retention limit", "no export to training data"],
        purpose_limitation_confirmed=True,
        dpa_notification_reference="ROPA-2026-BIAS-001",
    ))

    # Art.51: registration record
    tracker.add_registration(HighRiskRegistrationRecord(
        provider_name="Acme AI GmbH",
        provider_contact="compliance@acme.ai",
        system_name="HireScreen Pro",
        system_version="2.0.1",
        intended_purpose="Automated CV screening for employment decisions",
        annex_iii_category="4(a) employment — recruitment and selection",
        conformity_assessment_procedure="Art.43(2) internal control",
        notified_body_id=None,
        member_states=["DE", "AT", "NL"],
        doc_of_conformity_reference="DoC-2026-HSP-001",
    ))

    # Art.57: sandbox application with personal data
    tracker.add_sandbox_application(SandboxApplication(
        applicant="Acme AI GmbH",
        system_description="Testing adversarial robustness in hiring context",
        processing_personal_data=True,
        dpa_status=SandboxDPAStatus.DPA_INCLUDED,
        dpia_completed=True,
        dpa_reference="BfDI-SANDBOX-2026-047",
    ))

    result = tracker.assess_compliance()
    print(f"Status: {result['overall_status']}")
    print(f"Total gaps: {result['total_gaps']}")
    for gap in result['art10_5_bias_gaps']:
        print(f"  Bias gap: {gap}")

EDPB vs Developer: The Jurisdictional Dimension

The Joint Opinion has a specific relevance for developers using US-hosted infrastructure.

Art.10(5) and CLOUD Act exposure: When bias detection processing involves special categories of personal data, storing that data on US-provider infrastructure creates CLOUD Act jurisdiction exposure. A US authority could compel the infrastructure provider to disclose the protected attribute data used for bias testing — potentially without the developer's knowledge or consent. The EDPB's recommendation to restrict Art.10(5) to circumscribed situations with documented safeguards implicitly raises the bar for what counts as adequate protection — and US-provider storage is difficult to reconcile with the "appropriate safeguards" requirement under an EDPB-tightened Art.10(5).

Art.51 database and data minimisation: The EU high-risk AI registration database is operated by the European Commission on EU infrastructure. Registration data — provider names, conformity assessment references, intended purposes — must be publicly accessible. This is not sensitive personal data, but it creates a permanent public record linking your company to the high-risk AI systems you develop. That record is EU-jurisdiction, not subject to CLOUD Act compelled disclosure.

Sandbox and DPA involvement: If your sandbox processes personal data on infrastructure subject to non-EU jurisdiction, DPA involvement in sandbox supervision means the DPA will see your infrastructure choices. The EDPB has been consistently critical of sandbox personal data processing on non-EU infrastructure when adequate alternatives exist.
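
A deployment review can make this visible by mapping each storage target for Art.10(5) data to its controlling jurisdiction. A sketch; the provider-to-jurisdiction mapping is illustrative and must be verified against the provider's actual corporate structure and contracts, not just the region label on the bucket:

```python
# Illustrative mapping only: a US-controlled entity operating an EU region
# remains within CLOUD Act reach.
PROVIDER_JURISDICTION = {
    "eu-sovereign-host": "EU",
    "us-hyperscaler-eu-region": "US",
}

def art10_5_storage_flags(stores: dict[str, str]) -> list[str]:
    """Flag bias-detection datasets stored under non-EU compelled-disclosure regimes.

    `stores` maps dataset name -> storage provider.
    """
    flags = []
    for dataset, provider in stores.items():
        jurisdiction = PROVIDER_JURISDICTION.get(provider, "UNKNOWN")
        if jurisdiction != "EU":
            flags.append(f"{dataset}: {provider} is {jurisdiction}-jurisdiction; "
                         "hard to square with Art.10(5) 'appropriate safeguards'")
    return flags

print(art10_5_storage_flags({"bias-test-2026Q2": "us-hyperscaler-eu-region"}))
```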


25-Item EDPB Joint Opinion 1/2026 Alignment Checklist

Art.10(5) Bias Detection (items 1-10)

Art.51 Registration (items 11-18)

Art.57 Sandbox (items 19-25)


What to Watch: Trilogue Outcome Timeline

The Digital Omnibus Trilogue timeline is compressed. Key dates:

February 2026: EDPB-EDPS Joint Opinion 1/2026 published
April 2026: Trilogue negotiations ongoing
2 August 2026: current Annex III compliance deadline
Q2–Q3 2026: expected Trilogue conclusion (before or after the August deadline)
December 2027: proposed extended Annex III deadline (if the Omnibus passes)

If Trilogue concludes before August 2026 and the EDPB recommendations are incorporated, you will have the deadline extension but with:

  1. A narrowed Art.10(5) bias detection exception, restricted to circumscribed situations
  2. The Art.51 registration obligation retained in full
  3. Mandatory DPA involvement in any Art.57 sandbox that processes personal data

If Trilogue concludes after August 2026, the August deadline applies in the interim — then the extension kicks in retroactively for systems not yet compliant.

The prudent approach: continue compliance work on the August 2026 timeline. If the extension passes, you will be ahead. If it does not pass, you will not be caught short. And in either case, plan for the EDPB conditions as the likely outcome.


Summary

The EDPB-EDPS Joint Opinion 1/2026 on the Digital Omnibus is a fundamental rights intervention in the simplification debate. The data protection authorities are not opposing simplification — they are drawing three lines:

  1. Art.10(5) bias detection data must remain restricted to specific, documented, safeguarded situations. Not a general licence to process protected attributes.
  2. Art.51 high-risk AI registration must be preserved. Accountability requires a public record of high-risk AI systems in circulation.
  3. Art.57 sandbox DPA involvement must be mandatory when personal data is processed. Not optional.

These three positions are likely to appear in the final Trilogue text. Build your compliance programme around them, not around a maximally simplified reading of the Omnibus proposals.