2026-04-28·14 min read·sota.io team

EU AI Act Omnibus: New Deadlines, Same CLOUD Act Problem

Political agreement on the EU AI Act Digital Omnibus is expected April 28, 2026. Both the European Parliament and the Council of the EU have converged on their positions, and the trilogue negotiations — the final institutional negotiation before a consolidated text — are heading toward a deal that would push back the most significant compliance obligations by 16 to 24 months.

That sounds like relief. For many developers and compliance teams who have been watching the August 2, 2026 deadline approach with mounting anxiety, a one-year or two-year extension feels like a window to breathe. It is — but the relief is narrower than the headlines suggest.

The Omnibus moves compliance dates. It does not move your infrastructure jurisdiction. And for developers who have been making architecture decisions under the assumption that they can "sort out the EU law stuff later," later is still coming — just on a revised schedule.

This guide covers what the Omnibus actually changes, what stays fixed, and why the infrastructure decision you make now matters regardless of which deadline applies to your system.

What the Digital Omnibus Package Contains

The AI Omnibus is part of the Commission's broader Digital Omnibus Package, introduced November 19, 2025, alongside proposed amendments to the Data Act, the CRA, DORA, and NIS2. For the AI Act specifically, the Omnibus proposes targeted changes in three areas: timeline extensions, compliance burden reductions for certain providers, and one new prohibition.

New Hard Deadlines

The original AI Act applied obligations for high-risk AI systems from August 2, 2026. Both the Council and the Parliament have converged on replacing this date with two fixed postponements: December 2, 2027 for Annex III standalone systems, and August 2, 2028 for AI embedded in Annex I regulated products.

The Commission's original proposal used a conditional mechanism tied to delegated acts — compliance would be required once the Commission issued implementing regulations under specific articles. Both the Parliament and the Council rejected this approach as too uncertain, substituting fixed hard dates instead. This convergence is one of the cleaner areas of agreement in the trilogue.

For Art.50 transparency and watermarking obligations — requirements for AI systems to disclose AI-generated content — the institutions diverge slightly: the Council proposes extending the compliance deadline to February 2, 2027; the Parliament to November 2, 2026. The final text will land somewhere in this range.

What Annex III Actually Covers

Annex III defines eight categories of high-risk AI systems. Whether your system falls under Annex III (standalone) or Annex I (embedded in a regulated product) determines whether the December 2027 or the August 2028 date applies.

Annex III applies to standalone systems — software that is itself the AI system, sold or deployed separately from a regulated product. This covers:

  1. Biometric identification and categorisation systems
  2. Critical infrastructure management and operation (water, gas, electricity, traffic)
  3. Education and vocational training (access, assessment, evaluation decisions)
  4. Employment and recruitment (CV screening, interview video analysis, promotion decisions)
  5. Access to essential private services (creditworthiness assessment, insurance risk pricing)
  6. Law enforcement (crime analytics, polygraph testing, deepfake detection for evidence)
  7. Migration and border management (risk assessment, asylum case processing)
  8. Administration of justice and democratic processes (dispute resolution, election influence)

Annex I applies to AI embedded in regulated products already subject to EU product safety legislation — medical devices (MDR), machinery (Machinery Directive 2006/42/EC), toys (Toy Safety Directive), radio equipment (RED 2014/53/EU), civil aviation (EASA regulations), marine equipment (MED 2014/90/EU), and others. The two-year extension (to August 2028) for this category reflects the additional complexity of coordinating AI Act obligations with existing product certification regimes.

If your system is Annex III standalone, your new target date is December 2, 2027. If it is Annex I embedded, August 2, 2028.
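The classification-to-deadline mapping above can be sketched as a small lookup. The dates come from the text; the function and dictionary names are illustrative, not part of any official tooling:

```python
from datetime import date

# Omnibus compliance dates for high-risk AI systems, as described above.
OMNIBUS_DEADLINES = {
    "annex_iii": date(2027, 12, 2),  # standalone high-risk systems
    "annex_i": date(2028, 8, 2),     # AI embedded in regulated products
}

def compliance_deadline(classification: str) -> date:
    """Return the post-Omnibus compliance date for a high-risk classification."""
    try:
        return OMNIBUS_DEADLINES[classification]
    except KeyError:
        raise ValueError(f"Unknown classification: {classification!r}")

print(compliance_deadline("annex_iii"))  # 2027-12-02
```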

The New Art.5 Prohibition

Beyond timeline adjustments, both the Parliament and Council introduce a new prohibition under Article 5 of the AI Act. The Commission's original Omnibus proposal did not include this. The addition — proposed by MEPs — bans AI systems capable of generating or manipulating realistic sexually explicit or intimate images depicting identifiable real persons without their consent. These systems, sometimes called "nudifiers" or "deepfake generators," would be prohibited AI systems subject to Art.99 fines: up to €35 million or 7% of global annual turnover, whichever is higher.

This prohibition applies as soon as the amended Act enters into force; it is not subject to the December 2027 or August 2028 extensions. If you operate or deploy any system that could fall under this prohibition, the immediate applicability is relevant regardless of the broader Annex III extensions.
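The Art.99 penalty ceiling mentioned above — the higher of €35 million or 7% of global annual turnover — works out as follows. A trivial illustration; `art99_max_fine` is a hypothetical helper, not legal advice:

```python
def art99_max_fine(global_annual_turnover_eur: float) -> float:
    """Art.99 ceiling for prohibited-practice violations:
    the higher of EUR 35M or 7% of global annual turnover."""
    return max(35_000_000.0, 0.07 * global_annual_turnover_eur)

# For a company with EUR 2B turnover, the 7% figure exceeds the flat cap:
print(art99_max_fine(2_000_000_000))  # 140000000.0
```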

What the Omnibus Does Not Change

The Omnibus extension applies specifically to the obligations triggered for high-risk AI systems under Art.6 (Annex III and Annex I classifications). Several other parts of the AI Act are already in force and remain in force:

  1. Art.5 prohibited practices — in force since February 2, 2025
  2. GPAI obligations (Art.51-56) — applicable August 2, 2026, not extended by the Omnibus

The Omnibus is a relief valve for Annex III and Annex I high-risk AI providers. It is not a general extension of the AI Act.

What Hasn't Changed: Infrastructure Jurisdiction

Here is the part that most Omnibus coverage will miss.

The AI Act compliance dates moved. The CLOUD Act did not.

The Clarifying Lawful Overseas Use of Data Act (codified at 18 U.S.C. § 2713) allows the US government to compel cloud providers that are US legal entities to produce stored data regardless of where the data is physically located. This has been true since 2018. It will remain true after April 28, 2026. It will be true in December 2027 when Annex III obligations kick in, and in August 2028 when Annex I obligations follow.

For developers building high-risk AI systems on AWS, Google Cloud, Azure, Railway, Render, or Fly.io — providers with US parent entities — the following is structurally true regardless of the Omnibus:

Your training data, your logs, your model weights, your inference outputs, and your configuration data are subject to US government compelled access without notification. Disclosure orders under 18 U.S.C. § 2703 can be accompanied by non-disclosure orders under § 2705(b) — the cloud provider cannot tell you that your data has been disclosed.

This matters under the AI Act for three reasons that no timeline extension addresses.

Reason 1: Art.12 Logging Obligations

Art.12 requires providers of high-risk AI systems to build automatic event recording into the system. Logs must be accessible to EU Market Surveillance Authorities (MSAs) for enforcement purposes under Art.12(4). The Art.12 obligation takes effect December 2027 for Annex III systems.

But your logging infrastructure choice — EU-only or CLOUD Act-exposed — is a decision you are making now. If you build on AWS Frankfurt today, you will still be on AWS Frankfurt in December 2027 when the obligation kicks in. At that point, your Art.12 logs — records of every AI system session, every human oversight override, every input dataset reference — are simultaneously accessible to EU MSAs (as required) and subject to US § 2703 compelled disclosure (as a structural consequence of your infrastructure choice).

That is a compliance conflict that no extension resolves.
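What an Art.12-style automatic event record might capture can be sketched as follows. This is a hedged illustration, not the Act's prescribed schema — the field names and helper are assumptions:

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class Art12Event:
    """Illustrative Art.12-style log record (field names are assumptions,
    not the regulation's prescribed schema)."""
    timestamp: str
    session_id: str
    event_type: str          # e.g. "inference", "oversight_override"
    input_dataset_ref: str   # reference to the input data used
    operator_id: str         # the human exercising oversight, if any

def record_event(event_type: str, session_id: str,
                 input_dataset_ref: str, operator_id: str = "") -> str:
    """Serialize one event as a JSON line for an append-only log."""
    event = Art12Event(
        timestamp=datetime.now(timezone.utc).isoformat(),
        session_id=session_id,
        event_type=event_type,
        input_dataset_ref=input_dataset_ref,
        operator_id=operator_id,
    )
    return json.dumps(asdict(event))

print(record_event("oversight_override", "sess-42", "dataset-v3", "op-7"))
```

Where the resulting log file physically lives — and which legal entity has custody of it — is exactly the jurisdiction question this section raises.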

Reason 2: Art.10 Training Data Governance

Art.10 requires providers to establish data governance practices for training datasets used in high-risk AI systems, including understanding and managing bias risks, data sources, and data collection procedures. Art.10 applies to providers from the moment they begin building systems that will be classified as high-risk.

Training data stored on CLOUD Act-exposed infrastructure is subject to compelled disclosure. If your training data includes personal data of EU residents — which it does if you trained on user behavior, employment records, medical data, or credit information — you have a GDPR Art.44 cross-border transfer problem from the moment the data was transferred to US-controlled infrastructure, and a CLOUD Act compelled disclosure risk from the day you stored it there.

The Omnibus does not extend Art.10's requirements. It does not resolve the training data jurisdiction question. It simply gives you more time to deploy — but the data governance problem is present in your development pipeline now.
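The Art.10-style provenance check implied above can be sketched as a filter over a dataset inventory — flagging training data with EU personal data stored on US-parent infrastructure. The provider list, dataset records, and function name are illustrative assumptions:

```python
US_PARENT_PROVIDERS = {"AWS", "GCP", "Azure", "Railway", "Render", "Fly.io"}

def flag_jurisdiction_risks(datasets: list[dict]) -> list[str]:
    """Return names of datasets containing EU personal data that are
    stored with CLOUD Act-exposed (US-parent) providers."""
    return [
        d["name"]
        for d in datasets
        if d["contains_eu_personal_data"] and d["provider"] in US_PARENT_PROVIDERS
    ]

datasets = [
    {"name": "credit-history-v2", "provider": "AWS", "contains_eu_personal_data": True},
    {"name": "synthetic-benchmarks", "provider": "AWS", "contains_eu_personal_data": False},
    {"name": "user-behavior-v1", "provider": "Hetzner", "contains_eu_personal_data": True},
]
print(flag_jurisdiction_risks(datasets))  # ['credit-history-v2']
```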

Reason 3: GDPR Remains Unaffected

The AI Act Omnibus is an AI Act instrument. It has no effect on the GDPR, and the GDPR does not have a "high-risk AI" grace period. EU AI Act high-risk AI systems processing personal data are simultaneously GDPR-regulated systems. The GDPR Art.44 prohibition on transfers of personal data to third countries without adequate safeguards applies today, not in December 2027.

Processing personal data of EU residents on AWS, GCP, or Azure — providers subject to the CLOUD Act — creates a GDPR Art.44 cross-border transfer that requires either Standard Contractual Clauses supplemented by transfer impact assessments or reliance on the EU-US Data Privacy Framework (for US entities that have self-certified). Schrems II invalidated the DPF's predecessor and made transfer impact assessments mandatory for SCCs, and the DPF itself faces ongoing legal challenges. The structural tension between US surveillance law and EU data protection law has not been resolved by any legislative act.
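The transfer-mechanism choice described here reduces to a simple branch. A simplification for orientation only; `choose_transfer_mechanism` is a hypothetical helper and not legal advice:

```python
def choose_transfer_mechanism(us_entity_dpf_certified: bool) -> str:
    """GDPR Art.44/46 transfer basis for EU personal data sent to a US provider.
    Both routes remain legally contested post-Schrems II (see text)."""
    if us_entity_dpf_certified:
        return "EU-US Data Privacy Framework (self-certified importer)"
    return "Standard Contractual Clauses + transfer impact assessment"

print(choose_transfer_mechanism(False))
# Standard Contractual Clauses + transfer impact assessment
```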

The Architecture Decision Window You Actually Have

Here is what the Omnibus extension genuinely gives you: more time to make the right infrastructure choice before obligations are formally enforced.

If you are building a high-risk AI system today — credit scoring, HR automation, medical diagnosis support, law enforcement analytics — and you are currently running development and staging on AWS or a managed PaaS built on US infrastructure, the December 2027 Annex III deadline gives you roughly 18 months from now to:

  1. Complete your development cycle on current infrastructure
  2. Conduct a formal infrastructure jurisdiction assessment under Art.10 data governance requirements
  3. Plan and execute a migration to EU-native infrastructure before December 2027

That is a realistic window — if you start the assessment now. Infrastructure migrations for AI systems are not weekend projects. Training data pipelines, vector databases, model serving infrastructure, logging backends, and CI/CD systems all carry infrastructure jurisdiction dependencies. Identifying and replacing each layer takes months of planning and testing even for teams with dedicated infrastructure engineers.

Teams that treat the December 2027 deadline as an occasion to start thinking about this problem in late 2027 will discover that an Art.12-compliant, GDPR-sound, CLOUD Act-free infrastructure is a six-to-twelve-month migration project.

What Changes in 2027 That Justifies the Urgency

The December 2, 2027 date triggers obligations that are not just technical checkboxes. Under the AI Act, Annex III high-risk AI providers face:

Art.9 Risk Management: A documented risk management system throughout the system's lifecycle, including identification and analysis of known and foreseeable risks, adoption of risk management measures, and residual risk evaluation. This must be set up and operational at deployment, not assembled retroactively.

Art.10 Data Governance: Training data documentation, bias monitoring, data source assessment. These require auditable provenance from training through deployment.

Art.11 Technical Documentation: A complete technical file (described in Annex IV) that must exist before the system is placed on the market. Annex IV requires, among other items, identification of infrastructure and computing resources used, description of data storage and security measures, and evidence of Art.12 logging capability.

Art.12 Automatic Event Recording: Built into the system at deployment. Logs must be retained and accessible to MSAs. Infrastructure jurisdiction determines whether these logs can be protected from third-sovereign access.

Art.13-14 Transparency and Human Oversight: Information provision to deployers and users; human oversight mechanisms built into the system.

Art.15 Accuracy, Robustness, Cybersecurity: Performance metrics documented, adversarial robustness addressed, cybersecurity measures aligned with state of the art.

Art.43 Conformity Assessment: For most Annex III systems, self-assessment under the internal control procedure (Annex VI). For biometric systems under Annex III point 1, third-party assessment by a Notified Body is required where harmonised standards have not been applied in full.

EU DoC and CE Marking: Declaration of Conformity signed by the provider before market placement.

These are not obligations you assemble in the weeks before your December 2, 2027 compliance date. They require months of systematic preparation, internal documentation, and infrastructure verification.

A Developer's Decision Framework

class AIActOmnibusDecisionFramework:
    """
    AI Omnibus compliance decision framework for high-risk AI developers.
    Inputs: system type, infrastructure, personal data scope.
    Outputs: relevant obligations, urgency level, infrastructure recommendation.
    """

    ANNEX_III_CATEGORIES = {
        "biometric": "Annex III no.1 — standalone biometric ID/categorisation",
        "critical_infra": "Annex III no.2 — critical infrastructure management",
        "education": "Annex III no.3 — education/vocational training access decisions",
        "employment": "Annex III no.4 — employment/recruitment/promotion AI",
        "essential_services": "Annex III no.5 — creditworthiness/insurance risk assessment",
        "law_enforcement": "Annex III no.6 — crime analytics/polygraph/deepfake detection",
        "migration": "Annex III no.7 — risk assessment, asylum processing",
        "justice": "Annex III no.8 — dispute resolution/election influence AI",
    }

    ANNEX_I_SECTORS = [
        "medical_device", "machinery", "toys", "radio_equipment",
        "civil_aviation", "marine_equipment", "railway", "motor_vehicle",
    ]

    US_PARENT_PROVIDERS = [
        "AWS", "GCP", "Azure", "Railway", "Render", "Fly.io",
        "Vercel", "Heroku", "Netlify", "Cloudflare Workers",
    ]

    def assess_system(
        self,
        system_type: str,       # "annex_iii" | "annex_i" | "gpai" | "general"
        category: str,          # key from ANNEX_III_CATEGORIES or ANNEX_I_SECTORS
        infrastructure: str,    # provider name
        processes_personal_data: bool,
        training_data_eu_residents: bool,
    ) -> dict:
        result = {
            "system_type": system_type,
            "category": category,
            "infrastructure_jurisdiction": self._assess_jurisdiction(infrastructure),
            "obligations": [],
            "deadlines": {},
            "cloud_act_risk": infrastructure in self.US_PARENT_PROVIDERS,
            "recommended_action": "",
        }

        # Immediately applicable regardless of Omnibus
        if processes_personal_data:
            result["obligations"].append("GDPR Art.44 — data transfer mechanism required NOW")
            result["deadlines"]["gdpr_art44"] = "IN FORCE — no extension"

        # Art.5 prohibited practices (since Feb 2025)
        result["obligations"].append(
            "Art.5 prohibited practices check required — in force since 2025-02-02"
        )

        if system_type == "annex_iii":
            result["deadlines"]["high_risk_obligations"] = "2027-12-02"
            result["obligations"].extend([
                "Art.9 Risk Management System — operational by 2027-12-02",
                "Art.10 Data Governance — training data documentation required",
                "Art.11 Technical Documentation (Annex IV) — complete before market placement",
                "Art.12 Automatic Event Logging — built into system at deployment",
                "Art.13-14 Transparency + Human Oversight",
                "Art.43 Conformity Assessment — self-assessment or Notified Body",
                "EU DoC + CE Marking — before market placement",
            ])
            result["preparation_start_recommendation"] = "2026-Q3 — 18 months to December 2027"

        elif system_type == "annex_i":
            result["deadlines"]["high_risk_obligations"] = "2028-08-02"
            result["obligations"].append("Coordinated conformity assessment with sector regulation")
            result["preparation_start_recommendation"] = "2026-Q4 — 24 months to August 2028"

        elif system_type == "gpai":
            result["deadlines"]["gpai_obligations"] = "2026-08-02 — NOT extended by Omnibus"
            result["obligations"].extend([
                "Art.51-56 GPAI systemic risk provisions",
                "Art.53 GPAI transparency and documentation",
            ])

        # Infrastructure recommendation
        if result["cloud_act_risk"]:
            result["cloud_act_exposure"] = {
                "provider": infrastructure,
                "cloud_act_section": "18 U.S.C. § 2703",
                "risk": "Compelled disclosure of stored data regardless of server location",
                "non_disclosure_risk": "~60% of CLOUD Act orders include gag provisions (EFF 2024)",
                "art_12_conflict": "Logs accessible to EU MSAs AND subject to US § 2703 orders",
                "gdpr_art44_conflict": "Transfer mechanism required; Schrems II legal uncertainty",
            }
            result["recommended_action"] = (
                f"High CLOUD Act exposure on {infrastructure}. "
                "Infrastructure migration to EU-native provider recommended before "
                f"{result['deadlines'].get('high_risk_obligations', 'August 2026')}. "
                "Migration planning should begin now — typical timeline: 6-12 months."
            )
        else:
            result["recommended_action"] = (
                "EU-native infrastructure — no CLOUD Act exposure. "
                "Focus on Art.9/10/11/12/43 compliance documentation."
            )

        return result

    def _assess_jurisdiction(self, provider: str) -> str:
        if provider in self.US_PARENT_PROVIDERS:
            return "US — CLOUD Act exposed (18 U.S.C. § 2703)"
        eu_native = ["Hetzner", "OVHcloud", "Scaleway", "sota.io", "Clever Cloud", "Scalingo"]
        if provider in eu_native:
            return "EU — no US parent, CLOUD Act-free"
        return "Unknown — jurisdiction assessment required"


# Example assessment: credit scoring AI on Railway
framework = AIActOmnibusDecisionFramework()
result = framework.assess_system(
    system_type="annex_iii",
    category="essential_services",    # Annex III no.5(b) creditworthiness assessment
    infrastructure="Railway",
    processes_personal_data=True,
    training_data_eu_residents=True,
)

print(f"Compliance deadline: {result['deadlines']['high_risk_obligations']}")
# 2027-12-02

print(f"CLOUD Act risk: {result['cloud_act_risk']}")
# True

print(f"Infrastructure jurisdiction: {result['infrastructure_jurisdiction']}")
# US — CLOUD Act exposed (18 U.S.C. § 2703)

print(f"Recommended action: {result['recommended_action']}")
# High CLOUD Act exposure on Railway. Infrastructure migration recommended before 2027-12-02.

The Timeline That Actually Matters

The AI Omnibus gives Annex III providers a 16-month extension, from August 2, 2026 to December 2, 2027. Here is how to use it:

Now through Q2 2026 — Classify and assess: determine whether your system is Annex III standalone, Annex I embedded, or GPAI, and inventory where training data, logs, model artifacts, and CI/CD currently live.

Q3 2026 — Infrastructure decision: commit to EU-native infrastructure or accept CLOUD Act exposure with documented transfer mechanisms; if migrating, start planning now — typical timelines run 6-12 months.

2026-2027 — Build and document: implement Art.9 risk management, Art.10 data governance, and Art.12 automatic event logging; assemble the Annex IV technical file as the system is built, not retroactively.

Q3 2027 — Conformity assessment: complete the Art.43 assessment, sign the EU Declaration of Conformity, and apply CE marking before market placement.

December 2, 2027 — Obligations applicable: Annex III high-risk obligations become enforceable; Market Surveillance Authorities can request your documentation and logs.

Teams that start in Q3 2027 will not make this timeline.
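The remaining runway can be computed directly from the dates above (a trivial sketch; `months_remaining` is an illustrative helper):

```python
from datetime import date

ANNEX_III_DEADLINE = date(2027, 12, 2)

def months_remaining(today: date, deadline: date = ANNEX_III_DEADLINE) -> int:
    """Calendar-month difference to the deadline, ignoring day-of-month."""
    return (deadline.year - today.year) * 12 + (deadline.month - today.month)

# From the expected Omnibus agreement date:
print(months_remaining(date(2026, 4, 28)))  # 20
```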

What the Omnibus Does Not Change for Infrastructure Choices

The question developers frequently ask after reading Omnibus coverage is: "If the deadline moved to 2027, can I stay on AWS/Railway/Render for now and migrate later?"

The answer depends on what "for now" means:

GDPR personal data processing: You cannot wait until 2027 to address GDPR Art.44 compliance if you are processing EU resident personal data on CLOUD Act-exposed infrastructure today. The GDPR has no AI Act extension.

Training data governance: If your training pipeline stores EU personal data on US-parent infrastructure, Art.10 documentation requirements will need to account for this — and your transfer mechanism documentation needs to be in place before the data was processed, not after the AI Act deadline.

Infrastructure migration lead time: The December 2027 deadline is 18 months away. Infrastructure migrations for AI systems — especially those with distributed training, model artifact storage, vector databases, and logging pipelines — routinely take 9-18 months from decision to completion. Waiting until Q2 2027 to start a migration is waiting until there is no time left.

Competitive positioning: Organizations that deploy on EU-native infrastructure from the start of their development cycle arrive at the 2027 compliance date with no infrastructure migration debt. Organizations that make the AWS-now-migrate-later bet arrive needing to simultaneously complete compliance documentation and execute a major infrastructure change under deadline pressure.

Practical Checklist: What to Do Today

The AI Omnibus gives you 18 months. Use the first 90 days to establish your compliance posture.

Classification (30 days): determine whether your system falls under Annex III, Annex I, GPAI, or none of these, and document the reasoning.

Jurisdiction (30 days): map every provider in your stack — compute, storage, vector databases, logging, CI/CD — and identify which have US parent entities.

Governance (60 days): document training data sources, establish GDPR Art.44 transfer mechanisms for any EU personal data on US-parent infrastructure, and start the Art.10 data governance records.

Infrastructure decision (90 days): decide between moving to EU-native infrastructure now or committing to a migration plan with a firm start date, scoped as a 6-12 month project.

What Changes at the Infrastructure Layer

Resolving the CLOUD Act conflict requires infrastructure where no US legal entity holds custody of your data. "EU servers" is a necessary but insufficient condition — the parent company's legal domicile determines CLOUD Act applicability, not the server rack's location.

For Annex III high-risk AI systems, the infrastructure stack that is Art.12-compliant by December 2027 typically includes: model serving and inference compute, training data storage, vector databases, Art.12 logging backends, and CI/CD pipelines — each operated under EU jurisdiction with no US parent entity in the custody chain.

Managed PaaS that satisfies this stack — where you deploy your containerized AI system and all the above layers are handled under EU jurisdiction — removes the infrastructure migration problem entirely. You deploy once to EU-native infrastructure, and the Art.12, GDPR Art.44, and CLOUD Act questions are resolved structurally rather than managed as ongoing compliance risks.
