August 2, 2026 is 98 days away. On that date, the EU AI Act applies in full to high-risk AI systems — and the transitional period ends for most categories. Article 103 governs who gets extra time, who doesn't, and what "full application" actually means for developers shipping AI products in the EU.
The core question is not whether your AI system is high-risk. It's whether you placed it on the market before Aug 2026 — and whether you've made a substantial modification since then. Get these two answers wrong and you'll face conformity assessment, QMS, technical documentation, and post-market monitoring obligations with no transition window left.
What Art.103 Actually Says
Article 103 — Transitional Provisions (paraphrased from Regulation 2024/1689)
High-risk AI systems referred to in Annex III that have been placed on the market or put into service before the date of application of this Regulation shall comply with the requirements of this Regulation by [24-month period], unless they undergo a substantial modification. High-risk AI systems that are components of large-scale IT systems referred to in Annex I shall comply within [36-month extended period]. Providers and deployers shall take the necessary steps to ensure compliance with the applicable requirements and obligations.
Three practical implications for developers:
- 24-month transition: Annex III high-risk AI systems placed on the market before the application date have until Aug 2, 2026 to achieve full compliance; that window closes in 98 days.
- Substantial modification = immediate obligation: If you update your AI system in a way that constitutes a "substantial modification," transitional protection disappears and Art.6-15 apply immediately.
- Annex I exception: AI components in EU large-scale IT systems (Schengen, VIS, EES, ETIAS, Eurodac, etc.) get a longer runway — 36 months from entry into force.
The EU AI Act Application Timeline
The Regulation entered into force August 1, 2024 (20 days after publication in the Official Journal on July 12, 2024). The staggered application schedule:
| Date | What Applies | Key Obligations |
|---|---|---|
| Feb 2, 2025 | Chapter II (Prohibited AI Practices) | AI systems in Art.5 prohibited list must be withdrawn immediately. No transition. |
| Aug 2, 2025 | Chapter V (GPAI Models), Chapter VII (Governance), Chapter XII (GPAI penalties) | GPAI providers: technical documentation, Art.53 obligations, AI Office registration. |
| Aug 2, 2026 | Full application — all remaining provisions | High-risk AI: QMS, technical docs, conformity assessments, EU DB registration, CE marking, post-market monitoring, serious incident reporting. |
| Aug 2, 2027 | Annex I large-scale IT systems | Extended transition for Schengen/VIS/EES/ETIAS/Eurodac components. |
Where we are now (April 2026): We are inside the 24-month transition period. 98 days remain before the Aug 2026 full-application deadline hits most high-risk AI categories.
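The staggered schedule above reduces to simple date arithmetic. A quick sketch (the milestone labels and the April 26, 2026 vantage date are ours, chosen to match this article's "98 days" framing):

```python
from datetime import date

# Application dates from the table above (Regulation 2024/1689).
MILESTONES = {
    "Prohibited practices (Art.5)": date(2025, 2, 2),
    "GPAI obligations": date(2025, 8, 2),
    "Full application (Annex III high-risk)": date(2026, 8, 2),
    "Annex I large-scale IT systems": date(2027, 8, 2),
}

def days_remaining(milestone: date, today: date) -> int:
    """Days left until a milestone; 0 once it has passed."""
    return max(0, (milestone - today).days)

# From the article's vantage point of April 26, 2026:
vantage = date(2026, 4, 26)
for name, deadline in MILESTONES.items():
    print(f"{name}: {days_remaining(deadline, vantage)} days remaining")
```

Run from that vantage point, it reports 98 days for the full-application date and 0 for the milestones that have already passed.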
Who Gets the Extended Window — and Who Doesn't
Systems That Keep the 36-Month Extension (Until Aug 2027)
Annex I of the EU AI Act lists specific large-scale EU IT systems:
- Schengen Information System II (SIS II) — border checks, law enforcement alerts
- Visa Information System (VIS) — visa applications, biometric data
- Entry/Exit System (EES) — automated border crossing tracking
- European Travel Information and Authorisation System (ETIAS) — pre-travel screening
- Eurodac — fingerprint matching for asylum applications
- ECRIS-TCN — criminal records exchange
- eu-LISA operated systems — EU Agency for the Operational Management of IT systems
If your AI component is integrated into one of these systems, Art.103's 36-month window applies. For most commercial developers, this exception is irrelevant — these are government-operated systems.
Systems Subject to the Aug 2026 Deadline (Annex III)
Annex III defines eight categories of high-risk AI:
- Biometric identification and categorisation (remote biometric identification, emotion recognition in sensitive contexts)
- Critical infrastructure management (traffic, utilities, digital infrastructure)
- Education and vocational training (admission, assessment, exam monitoring)
- Employment and worker management (recruitment screening, performance monitoring, task allocation)
- Access to essential private/public services (creditworthiness, insurance risk assessment, public benefits eligibility)
- Law enforcement (polygraph use, AI in criminal investigations, recidivism risk)
- Migration and asylum (border screening, asylum application assessment)
- Administration of justice (AI in court proceedings, dispute resolution)
If your AI system falls into any of these categories, the Art.103 transition window runs out on Aug 2, 2026: systems already on the EU market must be fully compliant by that date, after which the transition protection expires.
Prohibited Practices — No Transition At All
AI systems in Art.5 (prohibited practices) had zero transition time after Feb 2, 2025:
- Real-time remote biometric identification in public spaces (with narrow exceptions)
- Social scoring by public authorities
- Manipulation of vulnerable groups
- Subliminal techniques
- Emotion inference in workplaces/education (with narrow exceptions)
- AI-enabled predictive policing based on profiling
- Facial recognition databases scraped from internet/CCTV
If you're running any of these, the prohibited practices chapter applied 15 months ago. There is no Art.103 transition for Art.5 violations.
The Substantial Modification Trap
Art.103's transition benefit has a critical exception: substantial modification resets the clock.
What Counts as Substantial Modification
The EU AI Act (Art.3(23)) defines substantial modification as a "change to the AI system after its placing on the market or putting into service which affects the compliance of the AI system or results in a change to the intended purpose for which the AI system has been assessed."
High-risk indicators that a change IS substantial:
| Change Type | Substantial? | Reasoning |
|---|---|---|
| New training data that changes model behaviour | ✅ Likely | Affects model performance and risk profile |
| Model architecture upgrade (v1 → v2) | ✅ Likely | New system placed on market |
| Expansion to new user group | ✅ Likely | Changes intended purpose |
| New deployment context (e.g., adding law enforcement use) | ✅ Yes | Changes intended purpose materially |
| Bug fixes and security patches | ❌ Not substantial | Maintains existing performance |
| UI/UX improvements without affecting AI behaviour | ❌ Not substantial | No change to model or purpose |
| Performance tuning within same accuracy range | ⚠️ Borderline | Depends on magnitude |
| Threshold adjustments for same purpose | ⚠️ Borderline | Consult legal counsel |
| Hardware infrastructure migration (same model) | ❌ Not substantial | No change to AI system itself |
Developer risk: Every AI system update should be documented with a substantial modification assessment. Without documentation, you cannot defend a claim that a change was non-substantial, and regulators are likely to default to treating it as substantial.
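That assessment can live next to the release process as a structured record. A minimal sketch, with the field names and the mechanical first-pass rule invented for illustration; it mirrors the two prongs of the Art.3(23) definition (compliance impact, intended-purpose change), and borderline calls still go to counsel:

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class Verdict(Enum):
    SUBSTANTIAL = "substantial"        # transition protection lost
    NOT_SUBSTANTIAL = "not_substantial"
    BORDERLINE = "borderline"          # escalate to legal counsel

def first_pass_verdict(changes_intended_purpose: bool, affects_compliance: bool) -> Verdict:
    """Mechanical first pass over the two Art.3(23) prongs; a human signs off."""
    if changes_intended_purpose:
        return Verdict.SUBSTANTIAL
    if affects_compliance:
        return Verdict.BORDERLINE
    return Verdict.NOT_SUBSTANTIAL

@dataclass(frozen=True)
class ModificationAssessment:
    """One record per release, kept with the change log as audit evidence."""
    system_name: str
    change_date: date
    description: str
    changes_intended_purpose: bool
    affects_compliance: bool
    rationale: str

    @property
    def verdict(self) -> Verdict:
        return first_pass_verdict(self.changes_intended_purpose, self.affects_compliance)
```

A security patch (neither prong triggered) comes back NOT_SUBSTANTIAL; adding a law enforcement deployment context trips the intended-purpose prong and comes back SUBSTANTIAL.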
What Full Application Means: The Aug 2026 Obligation Stack
After Aug 2, 2026, Annex III high-risk AI providers face the full obligations stack:
Quality Management System (Art.9)
You must have a documented QMS covering:
- Risk management procedures
- Data governance processes
- Technical documentation maintenance
- Conformity assessment procedures
- Post-market monitoring plan
- Incident response procedures
- Change management (substantial modification assessment)
Certification against a quality management standard (ISO 9001, or ISO/IEC 42001 for AI management systems) is not mandated, but it is strong evidence of compliance.
Technical Documentation (Art.11 + Annex IV)
Technical documentation must be "drawn up before placing on the market" and include:
- General description of the AI system and its intended purpose
- Description of the development process including training methodology
- Information about training, validation, and testing data
- Description of monitoring, functioning, and control measures
- Detailed information about risk and performance metrics
- Limitations of the system and conditions for intended use
This documentation must be available to national competent authorities on request. Providers must maintain it for 10 years after the last unit is placed on the market.
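Since the documentation must be producible on request for a decade, it helps to treat completeness as a checkable property. A sketch a CI job could run against a docs repo; the section keys are made up, loosely following the content areas listed above rather than Annex IV's actual wording:

```python
# Content areas from the list above, as keys to check against the
# documentation set. Key names are illustrative, not Annex IV's wording.
REQUIRED_SECTIONS = (
    "general_description",
    "development_process",
    "data_information",            # training, validation, testing data
    "monitoring_and_control",
    "risk_and_performance_metrics",
    "limitations_and_conditions_of_use",
)

def missing_sections(tech_doc: dict) -> list[str]:
    """Required sections that are absent or empty; [] means complete."""
    return [s for s in REQUIRED_SECTIONS if not tech_doc.get(s)]
```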
Conformity Assessment (Art.43)
For most Annex III high-risk AI systems, self-assessment (internal conformity assessment) is permitted. However, for high-risk AI systems in:
- Remote biometric identification (Annex III Cat. 1)
- Critical infrastructure (Annex III Cat. 2)
- Educational testing and monitoring (where substantial human oversight is absent)
A third-party conformity assessment by a Notified Body may be required. The EU's NANDO database lists approved Notified Bodies per country.
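The routing above can be expressed as a first-pass decision rule. This sketch only mirrors the shortlist in this section; the actual Art.43 procedure also turns on details such as whether harmonised standards were applied, so treat it as triage, not a determination:

```python
def may_require_notified_body(annex_iii_category: int, human_oversight: bool = True) -> bool:
    """Triage: could this Annex III category need third-party assessment?

    Category numbers follow the list earlier in the article:
    1 = biometric identification, 2 = critical infrastructure,
    3 = education and vocational training.
    """
    if annex_iii_category in (1, 2):   # biometric ID, critical infrastructure
        return True
    if annex_iii_category == 3 and not human_oversight:  # educational testing
        return True
    return False
```

Anything returning True goes to counsel and the NANDO database; False means internal conformity assessment is the likely route.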
EU AI Database Registration (Art.49 + Art.60)
Before placing a high-risk AI system on the EU market, providers must register it in the EU AI database (Commission-managed). Registration requires:
- Provider identity and contact information
- AI system name, version, intended purpose
- Risk category and applicable Annex
- Compliance status and conformity assessment reference
- Countries of deployment
An EUID (EU AI system unique identifier) is assigned upon registration and must appear in the declaration of conformity and accompanying materials.
Post-Market Monitoring (Art.72)
Providers must establish a post-market monitoring plan before market placement and actively collect data on system performance in production, including:
- Accuracy and performance metrics over time
- Incidents and near-misses
- User feedback on errors
- Changes in deployment context
Serious incidents (death, health harm, fundamental rights violations) must be reported to the national competent authority within 15 days (Art.65).
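The 15-day window is the one hard real-time constraint in this stack, so it is worth computing rather than remembering. A trivial sketch (function names are ours):

```python
from datetime import date, timedelta

REPORTING_WINDOW = timedelta(days=15)   # Art.65 window cited above

def report_due_by(detected_on: date) -> date:
    """Latest date to notify the national competent authority."""
    return detected_on + REPORTING_WINDOW

def is_overdue(detected_on: date, today: date) -> bool:
    """True once the notification window has closed without a report."""
    return today > report_due_by(detected_on)
```

An incident detected on September 1, 2026 must be reported by September 16, 2026.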
Declaration of Conformity + CE Marking (Art.47-48)
For high-risk AI systems, providers must:
- Draw up a written declaration of conformity (DoC) referencing applicable EU law
- Apply the CE marking to the product or documentation
- Maintain the DoC for 10 years
The CE marking signals compliance with all applicable EU law — not just the AI Act. Multi-regulation products (e.g., medical devices with AI) must reference all applicable instruments.
CLOUD Act Interaction: US Infrastructure, EU AI Obligations
Art.103's transition period is a planning window for infrastructure decisions. For US companies serving EU customers:
| Scenario | Aug 2026 Impact | Recommended Action Before Deadline |
|---|---|---|
| US company, model trained in US, deployed to EU users | Full Art.6-15 apply as provider | Register in EU AI database, appoint EU representative (Art.25) before Aug 2026 |
| EU company, model hosted on US cloud (AWS/GCP/Azure) | Art.103 obligations apply; CLOUD Act creates technical documentation risk | Consider EU-hosted alternative for sensitive AI components |
| GPAI model provider with EU users | Aug 2025 deadline already passed | Ensure Art.53 compliance already in place |
| Legacy high-risk AI system, no changes since 2024 | Transition benefit applies until Aug 2026 | Use remaining 98 days for gap analysis and QMS build |
sota.io Platform Advantage: EU-hosted AI deployments operated by an EU entity reduce CLOUD Act exposure for technical documentation. Note that the CLOUD Act can compel US-based providers to produce data regardless of where it is stored, so the mitigation is structural: keep sensitive AI documentation with an EU-incorporated provider on EU infrastructure. This is a compliance planning advantage worth capturing before Aug 2026.
Python: an Aug 2026 compliance tracker
from dataclasses import dataclass
from datetime import date
from enum import Enum
from typing import Optional
class AISystemCategory(Enum):
ANNEX_I_LARGE_SCALE_IT = "annex_i_large_scale_it" # 36-month extension
ANNEX_III_HIGH_RISK = "annex_iii_high_risk" # 24-month (Aug 2026)
GPAI_SYSTEMIC_RISK = "gpai_systemic_risk" # Aug 2025
GPAI_GENERAL = "gpai_general" # Aug 2025
PROHIBITED = "prohibited" # Feb 2025 (no transition)
LOW_RISK = "low_risk" # Voluntary CoPs only
class ComplianceStatus(Enum):
COMPLIANT = "compliant"
IN_PROGRESS = "in_progress"
NOT_STARTED = "not_started"
DEADLINE_MISSED = "deadline_missed"
@dataclass
class AISystemAssessment:
name: str
category: AISystemCategory
market_date: date
last_modification_date: Optional[date] = None
substantial_modification: bool = False
# Compliance items
qms_status: ComplianceStatus = ComplianceStatus.NOT_STARTED
technical_docs_status: ComplianceStatus = ComplianceStatus.NOT_STARTED
conformity_assessment_status: ComplianceStatus = ComplianceStatus.NOT_STARTED
eu_db_registration_status: ComplianceStatus = ComplianceStatus.NOT_STARTED
post_market_monitoring_status: ComplianceStatus = ComplianceStatus.NOT_STARTED
ce_marking_status: ComplianceStatus = ComplianceStatus.NOT_STARTED
FULL_APPLICATION_DATE = date(2026, 8, 2)
GPAI_APPLICATION_DATE = date(2025, 8, 2)
PROHIBITED_DATE = date(2025, 2, 2)
ANNEX_I_EXTENDED_DATE = date(2027, 8, 2)
def days_until_deadline(system: AISystemAssessment) -> int:
today = date.today()
if system.category == AISystemCategory.PROHIBITED:
deadline = PROHIBITED_DATE
elif system.category in (AISystemCategory.GPAI_GENERAL, AISystemCategory.GPAI_SYSTEMIC_RISK):
deadline = GPAI_APPLICATION_DATE
elif system.category == AISystemCategory.ANNEX_I_LARGE_SCALE_IT:
deadline = ANNEX_I_EXTENDED_DATE
else:
deadline = FULL_APPLICATION_DATE
# Substantial modification removes transitional benefit
if system.substantial_modification:
return 0 # Deadline already passed — must comply now
return max(0, (deadline - today).days)
def compliance_gap_score(system: AISystemAssessment) -> dict:
"""Returns gap analysis: how many of 6 key obligations are complete."""
items = [
("QMS (Art.9)", system.qms_status),
("Technical Docs (Art.11)", system.technical_docs_status),
("Conformity Assessment (Art.43)", system.conformity_assessment_status),
("EU DB Registration (Art.49/60)", system.eu_db_registration_status),
("Post-Market Monitoring (Art.72)", system.post_market_monitoring_status),
("CE Marking + DoC (Art.47-48)", system.ce_marking_status),
]
compliant = sum(1 for _, s in items if s == ComplianceStatus.COMPLIANT)
in_progress = sum(1 for _, s in items if s == ComplianceStatus.IN_PROGRESS)
not_started = sum(1 for _, s in items if s == ComplianceStatus.NOT_STARTED)
remaining_days = days_until_deadline(system)
return {
"system": system.name,
"deadline_days": remaining_days,
"compliance_score": f"{compliant}/6",
"in_progress": in_progress,
"not_started": not_started,
"risk_level": "CRITICAL" if not_started >= 4 and remaining_days < 120 else
"HIGH" if not_started >= 2 and remaining_days < 90 else
"MEDIUM" if not_started >= 1 else "LOW",
"items": [(name, status.value) for name, status in items],
}
def generate_sprint_plan(system: AISystemAssessment) -> list[str]:
    """98-day sprint plan prioritised by dependency order."""
    plan = []
    remaining = days_until_deadline(system)
    if remaining <= 0:
        reason = ("Substantial modification detected" if system.substantial_modification
                  else "Transition deadline passed")
        return [f"IMMEDIATE: {reason} — compliance obligations active NOW.",
                "IMMEDIATE: Halt market activities if no conformity assessment exists.",
                "WEEK 1: Engage legal counsel for rapid compliance assessment."]
    if system.qms_status == ComplianceStatus.NOT_STARTED:
        plan.append("WEEKS 1-4: Implement QMS (Art.9) — foundation for all other obligations")
    if system.technical_docs_status == ComplianceStatus.NOT_STARTED:
        plan.append("WEEKS 2-6: Draft technical documentation (Art.11 + Annex IV)")
    if system.conformity_assessment_status == ComplianceStatus.NOT_STARTED:
        plan.append("WEEKS 5-8: Complete internal conformity assessment (Art.43)")
    if system.eu_db_registration_status == ComplianceStatus.NOT_STARTED:
        plan.append("WEEK 9: Register in EU AI database (Art.60) — requires conformity assessment")
    if system.post_market_monitoring_status == ComplianceStatus.NOT_STARTED:
        plan.append("WEEKS 3-7: Establish post-market monitoring plan (Art.72)")
    if system.ce_marking_status == ComplianceStatus.NOT_STARTED:
        plan.append("WEEKS 10-12: Prepare Declaration of Conformity + CE marking (Art.47-48)")
    plan.append("ONGOING: Document all system changes with substantial modification assessment")
    plan.append("AUG 2, 2026: Deadline. All obligations active. NCA enforcement begins.")
    return plan
# Example usage
example_system = AISystemAssessment(
name="HR Candidate Screening AI v2.3",
category=AISystemCategory.ANNEX_III_HIGH_RISK,
market_date=date(2024, 3, 15),
substantial_modification=False,
qms_status=ComplianceStatus.IN_PROGRESS,
technical_docs_status=ComplianceStatus.NOT_STARTED,
conformity_assessment_status=ComplianceStatus.NOT_STARTED,
eu_db_registration_status=ComplianceStatus.NOT_STARTED,
post_market_monitoring_status=ComplianceStatus.IN_PROGRESS,
ce_marking_status=ComplianceStatus.NOT_STARTED,
)
gap = compliance_gap_score(example_system)
sprint = generate_sprint_plan(example_system)
# gap (when run on April 26, 2026; 'deadline_days' depends on date.today()):
# {'system': 'HR Candidate Screening AI v2.3', 'deadline_days': 98,
#  'compliance_score': '0/6', 'in_progress': 2, 'not_started': 4,
#  'risk_level': 'CRITICAL', ...}
Interaction with Other EU AI Act Provisions
Art.103 transitional provisions create a compliance cascade — the order in which you complete obligations matters:
Art.103 Transition Period
│
▼
Risk Assessment
(Art.9 QMS)
│
▼
Technical Documentation
(Art.11 + Annex IV)
│
▼
Conformity Assessment
(Art.43 — self or Notified Body)
│
▼
Declaration of Conformity
(Art.47)
│
├──► CE Marking (Art.48)
│
▼
EU AI Database Registration
(Art.49 + Art.60)
│
▼
Post-Market Monitoring
(Art.72 — ongoing)
│
▼
Serious Incident Reporting
(Art.65 — 15-day NCA notification)
Key dependencies:
- Registration (Art.60) cannot happen before conformity assessment: You need a conformity assessment reference to register.
- CE marking requires DoC: The DoC must reference the conformity assessment basis.
- Post-market monitoring must start at placement: Not a one-time exercise — it's an ongoing obligation from Aug 2, 2026.
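The cascade is just a dependency graph, and Python's standard library can order it for you. A sketch using the dependencies stated above (node names are our shorthand for the obligations):

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Each obligation maps to the set of obligations that must complete first,
# following the cascade diagram and the key dependencies above.
DEPENDENCIES = {
    "technical_documentation": {"risk_assessment_qms"},
    "conformity_assessment": {"technical_documentation"},
    "declaration_of_conformity": {"conformity_assessment"},
    "ce_marking": {"declaration_of_conformity"},
    "eu_db_registration": {"conformity_assessment"},
    "post_market_monitoring": {"eu_db_registration"},
    "serious_incident_reporting": {"post_market_monitoring"},
}

# static_order() yields nodes with all predecessors already emitted.
order = list(TopologicalSorter(DEPENDENCIES).static_order())
```

The QMS/risk assessment node always comes first, and registration can never precede the conformity assessment it has to reference.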
The 98-Day Sprint: What To Do Before Aug 2, 2026
Month 1 (Now — May 31)
- Classify every AI system in your product: Which fall into Annex III categories? Map against the 8 categories carefully.
- Assess each for substantial modification: Have you updated training data, model architecture, or intended purpose since July 2024?
- Start QMS foundation: Even a lightweight documented procedure manual can satisfy Art.9's baseline requirements for smaller providers.
- Identify if Notified Body is required: For most Annex III categories, self-assessment is permitted — check whether your specific use case requires third-party review.
Month 2 (June)
- Draft technical documentation (Annex IV): For each high-risk AI system. Focus on training data documentation, performance metrics, and risk controls.
- Map post-market monitoring procedures: What signals will you collect? How will you identify serious incidents? Who triggers the Art.65 15-day report?
- Register EU representative if you're a non-EU provider: Art.25 authorised representative must be established before market placement obligations kick in.
Month 3 (July, before Aug 2)
- Complete internal conformity assessment: Document the assessment basis, applicable standards, and compliance conclusion.
- Register in EU AI database: Create EUID before Aug 2. Early registration is available.
- Finalise Declaration of Conformity: Must reference the conformity assessment and all applicable EU law.
- Prepare CE marking documentation: For physical products or documentation that accompanies the AI system.
- Brief your legal team: Post-Aug 2, market surveillance authorities begin enforcement. From that date, every system update needs an immediate substantial modification assessment.
Ongoing After Aug 2
- Operate post-market monitoring plan — collect performance data, log near-misses, document user feedback.
- Submit serious incident reports (Art.65) within 15 days of detection.
- Maintain all documentation for 10 years.
- Assess every system update for substantial modification — document the assessment outcome.
Art.103 vs Other Transitional Provisions in the Act
The EU AI Act contains several transitional provisions across different articles:
| Article | Provision | Timeline |
|---|---|---|
| Art.103 | High-risk AI (Annex III) transitional period | Until Aug 2, 2026 |
| Art.103(2) | Annex I large-scale IT systems extended transition | Until Aug 2, 2027 |
| Art.99 | Penalties apply from full application date | From Aug 2, 2026 |
| Art.101 | GPAI model fines from GPAI application | From Aug 2, 2025 |
| Art.5 | Prohibited practices: no transitional period | From Feb 2, 2025 |
| Art.57 | NCA designation: Member State obligation | By Aug 2, 2025 |
| Art.60 | EU AI database: registration opens | By Aug 2, 2026 |
20-Item Compliance Checklist: 98-Day Sprint
CLASSIFICATION AND SCOPING
- 1. Inventory all AI systems deployed in or to the EU market
- 2. Classify each system against Annex III (8 categories) + Annex I (large-scale IT systems)
- 3. Document the risk classification decision with legal rationale
- 4. For each Annex III system: assess whether placed on market before Aug 2026
- 5. Assess every system update since July 2024 for substantial modification — document outcomes
QUALITY MANAGEMENT SYSTEM (ART.9)
- 6. Establish QMS covering risk management, data governance, change management
- 7. Define change management procedure with substantial modification assessment template
- 8. Document monitoring, corrective action, and incident response procedures
TECHNICAL DOCUMENTATION (ART.11 + ANNEX IV)
- 9. Draft technical documentation for each Annex III system (Annex IV checklist)
- 10. Include training data description, performance metrics, and risk controls
- 11. Document human oversight measures and fail-safe mechanisms
CONFORMITY ASSESSMENT (ART.43)
- 12. Determine whether self-assessment or Notified Body assessment is required
- 13. Complete internal conformity assessment or engage Notified Body
REGISTRATION AND MARKING (ART.47-49, ART.60)
- 14. Register in EU AI database (EUID will be assigned) — opens before Aug 2026
- 15. Draft Declaration of Conformity (Art.47)
- 16. Prepare CE marking for product documentation
POST-MARKET AND MONITORING (ART.65, ART.72)
- 17. Establish post-market monitoring plan — define data collection and review cycle
- 18. Set up serious incident reporting procedure (Art.65 — 15-day NCA notification)
- 19. Designate responsible person for monitoring plan execution
STRUCTURAL
- 20. For non-EU providers: appoint EU authorised representative (Art.25) before Aug 2026
See Also
- EU AI Act Art.99: Penalties — €35M/7% Administrative Fines for High-Risk AI Violations
- EU AI Act Art.43: Conformity Assessment Procedures for High-Risk AI Systems
- EU AI Act Art.72: Post-Market Monitoring Plan Obligations Developer Guide
- EU AI Act Art.60: EU AI Database Public Registry — EUID Registration
- EU AI Act Art.65: Reporting Serious Incidents — 15-Day NCA Notification