2026-04-26 · 14 min read

August 2, 2026 is 98 days away. On that date, the EU AI Act applies in full to high-risk AI systems — and the transitional period ends for most categories. Article 103 governs who gets extra time, who doesn't, and what "full application" actually means for developers shipping AI products in the EU.

The core question is not whether your AI system is high-risk. It's whether you placed it on the market before Aug 2026 — and whether you've made a substantial modification since then. Get these two answers wrong and you'll face conformity assessment, QMS, technical documentation, and post-market monitoring obligations with no transition window left.

What Art.103 Actually Says

Article 103 — Transitional Provisions (paraphrased from Regulation 2024/1689)

High-risk AI systems referred to in Annex III that have been placed on the market or put into service before the date of application of this Regulation shall comply with the requirements of this Regulation by [24-month period], unless they undergo a substantial modification. High-risk AI systems that are components of large-scale IT systems referred to in Annex I shall comply within [36-month extended period]. Providers and deployers shall take the necessary steps to ensure compliance with the applicable requirements and obligations.

Three practical implications for developers:

  1. 24-month transition: Most Annex III high-risk AI systems placed on the market before Aug 2, 2026 have until that date to achieve full compliance — a window that closes in 98 days.
  2. Substantial modification = immediate obligation: If you update your AI system in a way that constitutes a "substantial modification," transitional protection disappears and Art.6-15 apply immediately.
  3. Annex I exception: AI components in EU large-scale IT systems (Schengen, VIS, EES, ETIAS, Eurodac, etc.) get a longer runway — 36 months from entry into force.

The EU AI Act Application Timeline

The Regulation entered into force August 1, 2024 (20 days after publication in the Official Journal on July 12, 2024). The staggered application schedule:

| Date | What applies | Key obligations |
| --- | --- | --- |
| Feb 2, 2025 | Chapter II (Prohibited AI Practices) | AI systems on the Art.5 prohibited list must be withdrawn immediately. No transition. |
| Aug 2, 2025 | Chapter V (GPAI Models), Chapter VII (Governance), Chapter XII (GPAI penalties) | GPAI providers: technical documentation, Art.53 obligations, AI Office registration. |
| Aug 2, 2026 | Full application — all remaining provisions | High-risk AI: QMS, technical docs, conformity assessments, EU database registration, CE marking, post-market monitoring, serious incident reporting. |
| Aug 2, 2027 | Annex I large-scale IT systems | Extended transition for Schengen/VIS/EES/ETIAS/Eurodac components. |

Where we are now (April 2026): We are inside the 24-month transition period. 98 days remain before the Aug 2026 full-application deadline hits most high-risk AI categories.
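The arithmetic is easy to check. A quick sketch — the dates come from the schedule above, and the reference date is this article's publication date:

```python
from datetime import date

# Application dates from the staggered schedule above
DEADLINES = {
    "prohibited_practices": date(2025, 2, 2),
    "gpai_obligations": date(2025, 8, 2),
    "full_application": date(2026, 8, 2),
    "annex_i_extended": date(2027, 8, 2),
}

def days_remaining(deadline: date, today: date) -> int:
    """Days until a deadline, floored at 0 once it has passed."""
    return max(0, (deadline - today).days)

today = date(2026, 4, 26)  # this article's publication date
for name, deadline in DEADLINES.items():
    print(name, days_remaining(deadline, today))
# full_application → 98; the Feb 2025 and Aug 2025 deadlines → 0
```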

Who Gets the Extended Window — and Who Doesn't

Systems That Keep the 36-Month Extension (Until Aug 2027)

Annex I of the EU AI Act lists specific large-scale EU IT systems:

  - Schengen Information System (SIS)
  - Visa Information System (VIS)
  - Entry/Exit System (EES)
  - European Travel Information and Authorisation System (ETIAS)
  - Eurodac
  - ECRIS-TCN (criminal records of third-country nationals)

If your AI component is integrated into one of these systems, Art.103's 36-month window applies. For most commercial developers, this exception is irrelevant — these are government-operated systems.

Systems Subject to the Aug 2026 Deadline (Annex III)

Annex III defines eight categories of high-risk AI:

  1. Biometric identification and categorisation (remote biometric identification, emotion recognition in sensitive contexts)
  2. Critical infrastructure management (traffic, utilities, digital infrastructure)
  3. Education and vocational training (admission, assessment, exam monitoring)
  4. Employment and worker management (recruitment screening, performance monitoring, task allocation)
  5. Access to essential private/public services (creditworthiness, insurance risk, social scoring)
  6. Law enforcement (polygraph use, AI in criminal investigations, recidivism risk)
  7. Migration and asylum (border screening, asylum application assessment)
  8. Administration of justice (AI in court proceedings, dispute resolution)

If your AI system falls into any of these categories and was placed on the EU market before Aug 2, 2026, the Art.103 transition protection expires on that date — full compliance is required by Aug 2, 2026.

Prohibited Practices — No Transition At All

AI systems engaged in Art.5 prohibited practices — social scoring, exploitative manipulation of vulnerable groups, untargeted scraping of facial images, emotion recognition in workplaces and schools, and most real-time remote biometric identification in public spaces — had zero transition time: Art.5 applied from Feb 2, 2025.

If you're running any of these, the prohibited practices chapter applied 15 months ago. There is no Art.103 transition for Art.5 violations.

The Substantial Modification Trap

Art.103's transition benefit has a critical exception: substantial modification resets the clock.

What Counts as Substantial Modification

The EU AI Act (Art.3(23)) defines substantial modification as a "change to the AI system after its placing on the market or putting into service which affects the compliance of the AI system or results in a change to the intended purpose for which the AI system has been assessed."

Indicators for whether a change is substantial:

| Change type | Substantial? | Reasoning |
| --- | --- | --- |
| New training data that changes model behaviour | ✅ Likely | Affects model performance and risk profile |
| Model architecture upgrade (v1 → v2) | ✅ Likely | New system placed on market |
| Expansion to a new user group | ✅ Likely | Changes intended purpose |
| New deployment context (e.g., adding law enforcement use) | ✅ Yes | Changes intended purpose materially |
| Bug fixes and security patches | ❌ Not substantial | Maintain existing performance |
| UI/UX improvements without affecting AI behaviour | ❌ Not substantial | No change to model or purpose |
| Performance tuning within same accuracy range | ⚠️ Borderline | Depends on magnitude |
| Threshold adjustments for same purpose | ⚠️ Borderline | Consult legal counsel |
| Hardware infrastructure migration (same model) | ❌ Not substantial | No change to the AI system itself |
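The table above can be encoded as a release-triage helper. This is a sketch only — the change-type labels and the severity ordering are illustrative, and borderline cases still need legal review:

```python
from enum import Enum

class ModRisk(Enum):
    NOT_SUBSTANTIAL = 0
    BORDERLINE = 1
    LIKELY_SUBSTANTIAL = 2

# Heuristic encoding of the change types in the table above (labels are illustrative)
CHANGE_RISK = {
    "new_training_data": ModRisk.LIKELY_SUBSTANTIAL,
    "architecture_upgrade": ModRisk.LIKELY_SUBSTANTIAL,
    "new_user_group": ModRisk.LIKELY_SUBSTANTIAL,
    "new_deployment_context": ModRisk.LIKELY_SUBSTANTIAL,
    "bug_fix": ModRisk.NOT_SUBSTANTIAL,
    "ui_only_change": ModRisk.NOT_SUBSTANTIAL,
    "performance_tuning": ModRisk.BORDERLINE,
    "threshold_adjustment": ModRisk.BORDERLINE,
    "infrastructure_migration": ModRisk.NOT_SUBSTANTIAL,
}

def triage_release(change_types: list[str]) -> ModRisk:
    """Most severe risk across every change bundled into a release."""
    return max((CHANGE_RISK[c] for c in change_types), key=lambda r: r.value)

triage_release(["bug_fix", "performance_tuning"])  # → ModRisk.BORDERLINE
```

A release inherits the risk of its riskiest change: shipping a security patch together with retraining still triages as likely substantial.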

Developer risk: Every AI system update should be documented with a substantial modification assessment. Without documentation, you cannot defend a claim that a change was non-substantial — and a regulator's default position is likely to treat the change as substantial.
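One way to make that documentation habit concrete is an append-only assessment log. A minimal, hypothetical entry shape — the field names are illustrative, not prescribed by the Act:

```python
from dataclasses import dataclass, asdict
from datetime import date
import json

@dataclass(frozen=True)
class ModificationAssessment:
    change_id: str
    change_date: date
    description: str
    affects_compliance: bool        # first limb of the Art.3(23) definition
    changes_intended_purpose: bool  # second limb of the Art.3(23) definition
    rationale: str

    @property
    def substantial(self) -> bool:
        # A change is substantial if either limb of the definition is met
        return self.affects_compliance or self.changes_intended_purpose

entry = ModificationAssessment(
    change_id="2026-041",
    change_date=date(2026, 4, 20),
    description="Security patch; no model or threshold changes",
    affects_compliance=False,
    changes_intended_purpose=False,
    rationale="Maintains existing performance; no change to purpose",
)
record = json.dumps(asdict(entry), default=str)  # append to an immutable audit log
```

The frozen dataclass keeps entries immutable once written, which is the property you want when defending a non-substantial claim later.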

What Full Application Means: The Aug 2026 Obligation Stack

After Aug 2, 2026, Annex III high-risk AI providers face the full obligations stack:

Quality Management System (Art.17)

You must have a documented QMS covering, among other things:

  - a strategy for regulatory compliance, including conformity assessment procedures
  - design, development, testing, and validation procedures
  - data management (collection, labelling, provenance)
  - the risk management system and post-market monitoring
  - incident reporting and communication with competent authorities
  - record-keeping, resource management, and accountability

Certification against a quality-management standard (ISO 9001, or ISO/IEC 42001 for AI management systems) is not mandated but is strong evidence of compliance.

Technical Documentation (Art.11 + Annex IV)

Technical documentation must be "drawn up before placing on the market" and include, per Annex IV:

  - a general description of the system and its intended purpose
  - a detailed description of its elements and development process, including training data
  - information on monitoring, functioning, and human oversight
  - performance metrics and the risk management measures applied
  - a description of changes made through the lifecycle

This documentation must be available to national competent authorities on request. Providers must retain it for 10 years after the last unit is placed on the market.

Conformity Assessment (Art.43)

For most Annex III high-risk AI systems, internal (self-)conformity assessment is permitted. For remote biometric identification systems, however — particularly where harmonised standards have not been applied in full — a third-party conformity assessment by a Notified Body may be required. The EU's NANDO database lists approved Notified Bodies per country.

EU AI Database Registration (Art.49 + Art.71)

Before placing a high-risk AI system on the EU market, providers must register it in the Commission-managed EU AI database. Registration requires, among other details:

  - provider name, address, and contact details
  - the system's trade name and a description of its intended purpose
  - its market status (on the market, recalled, withdrawn)
  - information on the conformity assessment performed

EUID (EU AI system unique identifier) is assigned upon registration and must appear in the declaration of conformity and accompanying materials.

Post-Market Monitoring (Art.72)

Providers must establish a post-market monitoring plan before market placement and actively collect data on system performance in production, including:

  - accuracy and robustness metrics against the intended purpose
  - user and deployer feedback
  - malfunctions, near-misses, and incident logs

Serious incidents (death, serious harm to health, fundamental rights violations) must be reported to the national competent authority within 15 days (Art.73).
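The 15-day window is easy to operationalise in an incident pipeline. A sketch, with the window length taken from the text above:

```python
from datetime import date, timedelta

REPORTING_WINDOW = timedelta(days=15)  # serious-incident reporting window described above

def report_due(incident_date: date) -> date:
    """Latest date for notifying the national competent authority."""
    return incident_date + REPORTING_WINDOW

def days_left(incident_date: date, today: date) -> int:
    """Calendar days remaining to file; negative means overdue."""
    return (report_due(incident_date) - today).days

report_due(date(2026, 9, 1))  # → date(2026, 9, 16)
```

In practice you would trigger escalation alerts well before `days_left` reaches zero, since the clock starts at awareness of the incident, not at filing readiness.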

Declaration of Conformity + CE Marking (Art.47-48)

For high-risk AI systems, providers must:

  1. Draw up a written declaration of conformity (DoC) referencing applicable EU law
  2. Apply the CE marking to the product or documentation
  3. Maintain the DoC for 10 years

The CE marking signals compliance with all applicable EU harmonisation legislation — not just the AI Act. Multi-regulation products (e.g., medical devices with AI) must reference all applicable instruments.
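The retention bookkeeping implied by steps 1-3 can be sketched as a small record. The structure and field names are illustrative, and the retention anchor (last unit placed on the market) is an assumption borrowed from the technical-documentation rule above:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DeclarationOfConformity:
    system_name: str
    referenced_legislation: list[str]  # all applicable EU instruments, per the text above
    issue_date: date
    last_unit_on_market: date

    def retain_until(self) -> date:
        # Keep the DoC for 10 years; anchored here to the last unit on market
        d = self.last_unit_on_market
        return d.replace(year=d.year + 10)

doc = DeclarationOfConformity(
    system_name="HR Candidate Screening AI v2.3",
    referenced_legislation=["Regulation (EU) 2024/1689"],
    issue_date=date(2026, 6, 1),
    last_unit_on_market=date(2026, 7, 15),
)
doc.retain_until()  # → date(2036, 7, 15)
```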

CLOUD Act Interaction: US Infrastructure, EU AI Obligations

Art.103's transition period is a planning window for infrastructure decisions. For US companies serving EU customers:

| Scenario | Aug 2026 impact | Recommended action before deadline |
| --- | --- | --- |
| US company, model trained in US, deployed to EU users | Full Art.6-15 apply as provider | Register in EU AI database, appoint an EU authorised representative (Art.22) before Aug 2026 |
| EU company, model hosted on US cloud (AWS/GCP/Azure) | Art.103 obligations apply; CLOUD Act creates technical-documentation risk | Consider an EU-hosted alternative for sensitive AI components |
| GPAI model provider with EU users | Aug 2025 deadline already passed | Ensure Art.53 compliance is already in place |
| Legacy high-risk AI system, no changes since 2024 | Transition benefit applies until Aug 2026 | Use the remaining 98 days for gap analysis and QMS build |

sota.io Platform Advantage: EU-hosted AI deployments reduce CLOUD Act exposure for technical documentation. The CLOUD Act reaches providers subject to US jurisdiction regardless of where the data sits, so documentation held with a US cloud provider can be compelled even from EU datacentres; hosting with an EU-based provider keeps it outside that reach. This is a compliance planning advantage worth capturing before Aug 2026.

A Python Aug 2026 Compliance Tracker

from dataclasses import dataclass
from datetime import date
from enum import Enum
from typing import Optional

class AISystemCategory(Enum):
    ANNEX_I_LARGE_SCALE_IT = "annex_i_large_scale_it"  # 36-month extension
    ANNEX_III_HIGH_RISK = "annex_iii_high_risk"         # 24-month (Aug 2026)
    GPAI_SYSTEMIC_RISK = "gpai_systemic_risk"            # Aug 2025
    GPAI_GENERAL = "gpai_general"                        # Aug 2025
    PROHIBITED = "prohibited"                            # Feb 2025 (no transition)
    LOW_RISK = "low_risk"                                # Voluntary CoPs only

class ComplianceStatus(Enum):
    COMPLIANT = "compliant"
    IN_PROGRESS = "in_progress"
    NOT_STARTED = "not_started"
    DEADLINE_MISSED = "deadline_missed"

@dataclass
class AISystemAssessment:
    name: str
    category: AISystemCategory
    market_date: date
    last_modification_date: Optional[date] = None
    substantial_modification: bool = False
    
    # Compliance items
    qms_status: ComplianceStatus = ComplianceStatus.NOT_STARTED
    technical_docs_status: ComplianceStatus = ComplianceStatus.NOT_STARTED
    conformity_assessment_status: ComplianceStatus = ComplianceStatus.NOT_STARTED
    eu_db_registration_status: ComplianceStatus = ComplianceStatus.NOT_STARTED
    post_market_monitoring_status: ComplianceStatus = ComplianceStatus.NOT_STARTED
    ce_marking_status: ComplianceStatus = ComplianceStatus.NOT_STARTED

FULL_APPLICATION_DATE = date(2026, 8, 2)
GPAI_APPLICATION_DATE = date(2025, 8, 2)
PROHIBITED_DATE = date(2025, 2, 2)
ANNEX_I_EXTENDED_DATE = date(2027, 8, 2)

def days_until_deadline(system: AISystemAssessment) -> int:
    today = date.today()
    
    if system.category == AISystemCategory.PROHIBITED:
        deadline = PROHIBITED_DATE
    elif system.category in (AISystemCategory.GPAI_GENERAL, AISystemCategory.GPAI_SYSTEMIC_RISK):
        deadline = GPAI_APPLICATION_DATE
    elif system.category == AISystemCategory.ANNEX_I_LARGE_SCALE_IT:
        deadline = ANNEX_I_EXTENDED_DATE
    else:
        deadline = FULL_APPLICATION_DATE
    
    # Substantial modification removes transitional benefit
    if system.substantial_modification:
        return 0  # Deadline already passed — must comply now
    
    return max(0, (deadline - today).days)

def compliance_gap_score(system: AISystemAssessment) -> dict:
    """Returns gap analysis: how many of 6 key obligations are complete."""
    items = [
        ("QMS (Art.17)", system.qms_status),
        ("Technical Docs (Art.11)", system.technical_docs_status),
        ("Conformity Assessment (Art.43)", system.conformity_assessment_status),
        ("EU DB Registration (Art.49/71)", system.eu_db_registration_status),
        ("Post-Market Monitoring (Art.72)", system.post_market_monitoring_status),
        ("CE Marking + DoC (Art.47-48)", system.ce_marking_status),
    ]
    
    compliant = sum(1 for _, s in items if s == ComplianceStatus.COMPLIANT)
    in_progress = sum(1 for _, s in items if s == ComplianceStatus.IN_PROGRESS)
    not_started = sum(1 for _, s in items if s == ComplianceStatus.NOT_STARTED)
    
    remaining_days = days_until_deadline(system)
    
    return {
        "system": system.name,
        "deadline_days": remaining_days,
        "compliance_score": f"{compliant}/6",
        "in_progress": in_progress,
        "not_started": not_started,
        "risk_level": "CRITICAL" if not_started >= 4 and remaining_days < 120 else
                      "HIGH" if not_started >= 2 and remaining_days < 90 else
                      "MEDIUM" if not_started >= 1 else "LOW",
        "items": [(name, status.value) for name, status in items],
    }

def generate_sprint_plan(system: AISystemAssessment) -> list[str]:
    """98-day sprint plan prioritised by dependency order."""
    plan = []
    remaining = days_until_deadline(system)
    
    if remaining <= 0:
        return ["IMMEDIATE: Substantial modification detected — compliance obligations active NOW.",
                "IMMEDIATE: Halt market activities if no conformity assessment exists.",
                "WEEK 1: Engage legal counsel for rapid compliance assessment."]
    
    if system.qms_status == ComplianceStatus.NOT_STARTED:
        plan.append("WEEKS 1-4: Implement QMS (Art.17) — foundation for all other obligations")
    if system.technical_docs_status == ComplianceStatus.NOT_STARTED:
        plan.append("WEEKS 2-6: Draft technical documentation (Art.11 + Annex IV)")
    if system.conformity_assessment_status == ComplianceStatus.NOT_STARTED:
        plan.append("WEEKS 5-8: Complete internal conformity assessment (Art.43)")
    if system.eu_db_registration_status == ComplianceStatus.NOT_STARTED:
        plan.append("WEEK 9: Register in EU AI database (Art.71) — requires conformity assessment")
    if system.post_market_monitoring_status == ComplianceStatus.NOT_STARTED:
        plan.append("WEEKS 3-7: Establish post-market monitoring plan (Art.72)")
    if system.ce_marking_status == ComplianceStatus.NOT_STARTED:
        plan.append("WEEKS 10-12: Prepare Declaration of Conformity + CE marking (Art.47-48)")
    
    plan.append("ONGOING: Document all system changes with a substantial modification assessment")
    plan.append("AUG 2, 2026: Deadline. All obligations active. NCA enforcement begins.")
    
    return plan

# Example usage
example_system = AISystemAssessment(
    name="HR Candidate Screening AI v2.3",
    category=AISystemCategory.ANNEX_III_HIGH_RISK,
    market_date=date(2024, 3, 15),
    substantial_modification=False,
    qms_status=ComplianceStatus.IN_PROGRESS,
    technical_docs_status=ComplianceStatus.NOT_STARTED,
    conformity_assessment_status=ComplianceStatus.NOT_STARTED,
    eu_db_registration_status=ComplianceStatus.NOT_STARTED,
    post_market_monitoring_status=ComplianceStatus.IN_PROGRESS,
    ce_marking_status=ComplianceStatus.NOT_STARTED,
)

gap = compliance_gap_score(example_system)
sprint = generate_sprint_plan(example_system)

# Output: {'system': 'HR Candidate Screening AI v2.3', 'deadline_days': 98,
#          'compliance_score': '0/6', 'in_progress': 2, 'not_started': 4,
#          'risk_level': 'CRITICAL', ...}

Interaction with Other EU AI Act Provisions

Art.103 transitional provisions create a compliance cascade — the order in which you complete obligations matters:

Art.103 Transition Period
         │
         ▼
    Risk Management + QMS
    (Art.9 / Art.17)
         │
         ▼
   Technical Documentation
   (Art.11 + Annex IV)
         │
         ▼
   Conformity Assessment
   (Art.43 — self or Notified Body)
         │
         ▼
   Declaration of Conformity
   (Art.47)
         │
         ├──► CE Marking (Art.48)
         │
         ▼
   EU AI Database Registration
   (Art.49 + Art.71)
         │
         ▼
   Post-Market Monitoring
   (Art.72 — ongoing)
         │
         ▼
   Serious Incident Reporting
   (Art.73 — 15-day NCA notification)

Key dependencies:

  - Technical documentation must exist before the conformity assessment can be performed.
  - The Declaration of Conformity can only be drawn up after a successful conformity assessment; CE marking follows the DoC.
  - EU database registration requires the conformity information, so it comes after the DoC.
  - Post-market monitoring must be in place at market placement and runs for the system's lifetime.
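The cascade can be made executable with the standard library's graphlib. The prerequisites below are read off the diagram above; the shorthand labels are illustrative:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Obligation → prerequisites, per the cascade diagram above
DEPENDS_ON = {
    "qms": [],
    "technical_docs": ["qms"],
    "conformity_assessment": ["technical_docs"],
    "declaration_of_conformity": ["conformity_assessment"],
    "ce_marking": ["declaration_of_conformity"],
    "eu_db_registration": ["declaration_of_conformity"],
    "post_market_monitoring": ["eu_db_registration"],
}

# A valid completion order: every obligation appears after its prerequisites
order = list(TopologicalSorter(DEPENDS_ON).static_order())
```

The same graph can drive sprint planning: anything whose prerequisites are incomplete cannot be scheduled yet, which is why a missing QMS blocks everything downstream.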

The 98-Day Sprint: What To Do Before Aug 2, 2026

Month 1 (Now — May 31)

  - Classify every AI system (prohibited / Annex III high-risk / GPAI / low-risk) and confirm market-placement dates.
  - Run a gap analysis against the six obligation areas and begin the QMS build.

Month 2 (June)

  - Draft technical documentation (Art.11 + Annex IV).
  - Complete the internal conformity assessment (Art.43), or engage a Notified Body if one is required.

Month 3 (July, before Aug 2)

  - Register in the EU AI database, draw up the Declaration of Conformity, and apply CE marking.

Ongoing After Aug 2

  - Operate post-market monitoring, report serious incidents within 15 days, and document every change with a substantial modification assessment.

Art.103 vs Other Transitional Provisions in the Act

The EU AI Act contains several transitional provisions across different articles:

| Article | Provision | Timeline |
| --- | --- | --- |
| Art.103 | High-risk AI (Annex III) transitional period | Until Aug 2, 2026 |
| Art.103(2) | Annex I large-scale IT systems extended transition | Until Aug 2, 2027 |
| Art.99 | Penalties apply from full application date | From Aug 2, 2026 |
| Art.101 | GPAI model fines from GPAI application | From Aug 2, 2025 |
| Art.5 | Prohibited practices: no transitional period | From Feb 2, 2025 |
| Art.70 | Member States designate national competent authorities | By Aug 2, 2025 |
| Art.71 | EU AI database: registration opens | By Aug 2, 2026 |

20-Item Compliance Checklist: 98-Day Sprint

CLASSIFICATION AND SCOPING

QUALITY MANAGEMENT SYSTEM (ART.17)

TECHNICAL DOCUMENTATION (ART.11 + ANNEX IV)

CONFORMITY ASSESSMENT (ART.43)

REGISTRATION AND MARKING (ART.47-49, ART.71)

POST-MARKET AND MONITORING (ART.72-73)

STRUCTURAL
