2026-04-26 · 15 min read

If your AI system is embedded in a physical product subject to EU type-approval or CE marking, you have two compliance regimes to satisfy simultaneously. EU AI Act Arts.104–112 are the amendment articles that formally integrate the AI Act into existing EU sector legislation — creating legal bridges between the horizontal AI regulation and vertical product safety frameworks covering road and agricultural vehicles, civil aviation, and aviation security systems.

Article 104 begins the sequence of amendments. Understanding how these amendments work — and specifically how the Annex I pathway coordinates conformity assessment between the EU AI Act and sector regulations — is essential for any developer building AI systems for dual-regulated markets.

What Arts.104–112 Actually Do

The EU AI Act does not simply add to the compliance burden for sector-regulated AI systems. It creates an integrated framework with specific coordination mechanisms designed to avoid double assessments. The amendment articles (Arts.104–112) formally amend existing EU regulations to:

  1. Acknowledge AI Act authority — They insert EU AI Act cross-references into sector regulations, establishing the legal relationship between the two compliance frameworks.
  2. Clarify dual compliance obligations — They confirm that AI components in regulated products must satisfy both the sector regulation and the EU AI Act where applicable.
  3. Enable assessment coordination — They create the legal basis for combined conformity assessment procedures and single-notified-body pathways where practicable.

The sectors covered by Arts.104–112 include civil aviation security (Regulation (EC) No 300/2008), agricultural and forestry vehicles (Regulation (EU) No 167/2013), two/three-wheel vehicles (Regulation (EU) No 168/2013), motor vehicles (Regulation (EU) 2018/858), civil aviation safety (Regulation (EU) 2018/1139), and vehicle type-approval safety requirements (Regulation (EU) 2019/2144).

The Foundation: Annex I and the High-Risk Classification Pathway

Art.104's amendment context only makes sense once you understand Annex I to the EU AI Act. Annex I lists the EU harmonization legislation under which an AI system can qualify as high-risk through the Article 6(1) pathway — distinct from the Annex III pathway that lists specific application areas.

Article 6(1) — The Annex I pathway:

An AI system that is a safety component of a product, or is itself a product, covered by EU harmonization legislation listed in Annex I, and the product is required to undergo a third-party conformity assessment by a notified body under that harmonization legislation, shall be considered a high-risk AI system.
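Read as a decision rule, Art.6(1) is a conjunction of three conditions. A minimal sketch of that test (the function and parameter names are illustrative, not terms from the Act):

```python
def is_high_risk_via_annex_i(
    is_safety_component_or_product: bool,
    covered_by_annex_i_legislation: bool,
    requires_third_party_assessment: bool,
) -> bool:
    """Art.6(1) trigger: all three conditions must hold simultaneously."""
    return (
        is_safety_component_or_product
        and covered_by_annex_i_legislation
        and requires_third_party_assessment
    )


# A lane-keeping controller in a type-approved car: all three conditions hold
print(is_high_risk_via_annex_i(True, True, True))   # → True
# A convenience feature with no notified-body requirement: trigger fails
print(is_high_risk_via_annex_i(True, True, False))  # → False
```

The third condition matters: products assessed purely by manufacturer self-declaration under their sector rules do not trigger this pathway.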

The Annex I list currently includes:

| EU Instrument | Sector | AI Relevance |
| --- | --- | --- |
| Regulation (EU) 2018/858 | Motor vehicles | ADAS, autonomous driving, driver monitoring |
| Regulation (EU) 2018/1139 | Civil aviation | Aircraft AI systems, ATC decision support |
| Regulation (EU) 2019/2144 | Vehicle type-approval | ISA, AEB, lane-keeping, drowsiness detection |
| Regulation (EU) 167/2013 | Agricultural vehicles | Autonomous guidance, obstacle detection |
| Regulation (EU) 168/2013 | Two/three-wheel vehicles | ABS controllers, emergency braking assistance |
| Regulation (EU) 2023/1230 | Machinery | AI in robots, CNC machines, production equipment |
| Directive 2006/42/EC | Machinery (legacy) | Safety functions in CE-marked machinery |
| Regulation (EU) 2017/745 (MDR) | Medical devices | AI diagnostics, clinical decision support |
| Regulation (EU) 2017/746 (IVDR) | In vitro diagnostics | AI-based laboratory analysis |
| Directive 2014/53/EU (RED) | Radio equipment | AI in smart devices, IoT with radio |
| Directive 2009/48/EC | Toys | AI in interactive toys |
| Directive 2013/53/EU | Recreational craft | AI navigation and safety systems |

The critical trigger: If your AI system is a safety component in a product requiring third-party conformity assessment under any Annex I regulation, your AI system is automatically high-risk under the EU AI Act, regardless of whether it would be classified as high-risk under Annex III.

The Conformity Assessment Coordination Mechanism

This is where Art.6(1) and Art.6(2) interact to create the dual compliance framework — and where developers most often get confused.

Art.6(1): Annex I Pathway — Third-Party Assessment Required

For AI systems that trigger Art.6(1) (Annex I product with third-party conformity assessment requirement), the EU AI Act does not automatically add a separate AI-specific conformity assessment. Instead, Article 43(2) provides a coordination mechanism:

Where AI systems are required to undergo a third-party conformity assessment under the harmonisation legislation listed in Annex I, that assessment shall be carried out under the corresponding requirements of the relevant Union harmonisation legislation. The notified body responsible for checking compliance with that legislation shall also be responsible for the conformity assessment of the AI system to the extent that it is integrated or constitutes a safety component of that product.

Practical implication: Your sector-specific notified body performs both the sector assessment AND the AI Act compliance check. You do not need two separate assessments. The AI Act requirements (technical documentation, QMS, risk management, post-market monitoring) are checked as part of the existing sector assessment.

Art.6(2): Annex III Pathway — Self-Assessment or Separate Third-Party

For AI systems classified as high-risk under Annex III (not Annex I), the conformity assessment follows Article 43's standard procedure — typically self-assessment for most Annex III categories, with third-party assessment required for remote biometric identification and a few other categories.

The intersection problem: Some AI systems trigger BOTH Art.6(1) AND Art.6(2). A pedestrian detection system in a car is:

  - High-risk via Art.6(1): a safety component of a vehicle subject to third-party type-approval under Regulation (EU) 2019/2144, which is listed in Annex I
  - Potentially high-risk via Annex III as well, under the critical infrastructure (transport) category

When both pathways apply, Art.6(1) takes precedence for conformity assessment purposes — the sector-specific assessment covers both.
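The precedence rule can be sketched as a tiny resolver (function name and return strings are illustrative, following this article's labels):

```python
def conformity_route(annex_i: bool, annex_iii: bool) -> str:
    """Resolve the conformity assessment route when one or both
    high-risk pathways apply. Annex I (Art.6(1)) takes precedence:
    the sector notified body's assessment covers the AI Act too."""
    if annex_i:
        return "sector assessment covers AI Act"
    if annex_iii:
        return "standalone AI Act assessment (typically self-assessment)"
    return "not high-risk"


# Pedestrian detection: both pathways apply, the Annex I route wins
print(conformity_route(annex_i=True, annex_iii=True))
# → sector assessment covers AI Act
```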

Per-Sector Analysis: What Changes Under Arts.104–112

Automotive: Regulations 167/2013, 168/2013, 2018/858, 2019/2144

Regulation (EU) 2018/858 (motor vehicles) covers whole-vehicle type-approval (WVTA) for passenger cars, commercial vehicles, and trailers. AI systems in scope range from ADAS and automated driving functions to driver monitoring systems (see the Annex I table above).

Regulation (EU) 2019/2144 (type-approval safety requirements) mandates specific AI-enabled safety features for new vehicles, phased in from 2022 to 2026:

  - Intelligent speed assistance (ISA)
  - Advanced emergency braking (AEB)
  - Emergency lane-keeping systems
  - Driver drowsiness and attention warning

For these mandatory features, type-approval is the conformity assessment mechanism. Satisfying type-approval via UNECE regulations and the technical service satisfies the EU AI Act conformity assessment requirement for these specific systems.

Regulation (EU) 167/2013 (agricultural vehicles) covers tractors, harvesters, and forestry machines. Typical AI applications include autonomous guidance and obstacle detection systems.

Regulation (EU) 168/2013 (two/three-wheel vehicles) covers motorcycles, mopeds, scooters, and quadricycles. Typical AI applications include ABS controllers and emergency braking assistance.

Civil Aviation: Regulation (EU) 2018/1139 (EASA)

Regulation 2018/1139 is the EASA regulation governing aircraft design, manufacturing, and maintenance. It applies to commercial aircraft and large drones. AI applications:

| AI Application | EASA Classification | EU AI Act Classification |
| --- | --- | --- |
| Flight management system AI components | Design Organisation Approval (DOA) required | High-risk via Art.6(1) Annex I |
| Predictive maintenance AI | Service organisation approval required | Potentially high-risk via Art.6(1) |
| Air traffic control decision support | Covered under EUROCONTROL/EASA frameworks | High-risk via Annex III (critical infrastructure) |
| Airport security AI screening | Reg (EC) No 300/2008 framework | High-risk via Annex III (law enforcement adjacent) |
| Drone autonomy systems | EASA UAS Regulation (EU) 2019/947 | High-risk if commercial |

The EASA-AI Act coordination challenge: EASA uses its own Certification Specifications (CS) and Acceptable Means of Compliance (AMC) documents, with software and airborne hardware development assurance typically demonstrated against the DO-178C/DO-254 standards. The EU AI Act requires different documentation (Annex IV technical documentation), and the two documentation frameworks do not map cleanly onto each other.

Practical approach: Use a dual-documentation structure where:

  1. The DO-178C/DO-254 artifacts satisfy EASA certification
  2. An Annex IV wrapper document cross-references the certification artifacts to satisfy EU AI Act technical documentation
  3. The Design Organisation Approval process functions as the EU AI Act third-party conformity assessment

Civil Aviation Security: Regulation (EC) No 300/2008

Regulation (EC) No 300/2008 establishes minimum standards for civil aviation security in the EU, covering passenger and baggage screening, cargo security, and access control. AI applications in scope include automated threat detection in X-ray baggage screening and biometric passenger verification at security checkpoints.

Critical intersection — Art.5 prohibited practices in aviation security: Some AI applications used routinely in aviation security contexts may constitute prohibited AI practices under EU AI Act Art.5, notably emotion recognition at screening points and biometric categorisation that infers sensitive attributes.

Aviation security developers must audit their systems against Art.5 before assuming high-risk AI compliance is sufficient.

Machinery: Regulation (EU) 2023/1230

The Machinery Regulation (Reg 2023/1230, replacing Directive 2006/42/EC from 2027) explicitly addresses AI-enabled safety functions and is listed in Annex I to the EU AI Act.

Key coexistence rules:

The higher standard rule: Where the EU AI Act and Machinery Regulation impose different standards for the same aspect, the stricter standard applies. The EU AI Act's data governance and bias requirements (Art.10) and accuracy and robustness requirements (Art.15) may exceed what EN ISO 13849 addresses for learning systems — where they do, the AI Act requirement applies.

Dual CE Marking When Both Regimes Apply

When an AI system must satisfy both the EU AI Act and a sector regulation, CE marking documentation must reference both instruments:

Declaration of Conformity (DoC) structure for dual-regulated AI:

DECLARATION OF CONFORMITY
Product: [Name and model]
Manufacturer: [Company, address]

This product conforms with:
1. Regulation (EU) 2024/1689 (EU AI Act), Article 6(1)/(2), as a [high-risk AI system / provider of general-purpose AI model]
   Conformity assessment procedure: Article 43(2) — assessed by [notified body name, NB number] under [sector regulation name]

2. Regulation (EU) [sector regulation number], as a [product type]
   Type-examination certificate: [Certificate number] issued by [notified body name, NB number]
   [Other sector-specific declarations as required]

The above-mentioned product satisfies all applicable essential requirements of both Regulations.
Signed by: [Authorised person, date, place]

One DoC or two? EU AI Act Art.47(2) provides for a single EU declaration of conformity covering all applicable Union legislation where a product is subject to more than one instrument requiring a DoC. The combined document must reference both instruments and make clear which conformity assessment covered which requirements.
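A combined DoC can be generated programmatically from the template above. A hedged sketch; the field set is illustrative and any real declaration needs legal review:

```python
def integrated_doc(product: str, manufacturer: str,
                   sector_regulation: str, notified_body: str,
                   certificate_no: str) -> str:
    """Render a single DoC referencing both the EU AI Act and
    the sector regulation, per the integrated-DoC option."""
    return "\n".join([
        "DECLARATION OF CONFORMITY",
        f"Product: {product}",
        f"Manufacturer: {manufacturer}",
        "",
        "This product conforms with:",
        "1. Regulation (EU) 2024/1689 (EU AI Act), as a high-risk AI system",
        f"   Conformity assessment: carried out by {notified_body} "
        f"under {sector_regulation}",
        f"2. {sector_regulation}",
        f"   Type-examination certificate: {certificate_no} "
        f"issued by {notified_body}",
    ])


print(integrated_doc("AEB Controller v2", "Acme Mobility GmbH",
                     "Regulation (EU) 2019/2144", "NB 1234", "TEC-2026-001"))
```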

Documentation Consolidation Strategy

Managing two sets of compliance documentation is expensive. A consolidation strategy reduces cost:

| Documentation Element | EU AI Act (Annex IV) | Sector Regulation | Consolidation Approach |
| --- | --- | --- | --- |
| Technical description + capabilities | Yes | Yes (varies) | Unified technical file with dual references |
| Risk management records | Art.9 — systematic risk management | FMEA, HAZOP (machinery/automotive), FHA (aviation) | Use sector methodology, add AI-specific risks as annex |
| Validation and testing data | Art.10 — dataset documentation | Test reports per sector standard | Sector test report + AI dataset annex |
| Post-market monitoring plan | Art.72 — mandatory | Adverse event reporting (MDR/IVDR), field performance monitoring | Unified monitoring plan covering both obligations |
| QMS records | Art.17 — quality management system | ISO 9001 / IATF 16949 / AS9100 (EN 9100) | Existing QMS + AI Act supplement |
| Incident reporting procedure | Art.73 — 15-day NCA notification | Sector-specific vigilance reporting timelines | Dual-trigger procedure document |

Critical difference — incident reporting timelines:

  - EU AI Act: serious incidents reported to the national competent authority within 15 days
  - Civil aviation (EASA): occurrence reporting within 72 hours
  - Medical devices (MDR/IVDR): serious incidents within 15 days
  - Automotive (UNECE field safety): on the order of 30 days

A dual-regulated AI developer must identify the most restrictive timeline and build their incident response around it.
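Operationally, the binding deadline is simply the minimum over the applicable timelines. A sketch using the figures quoted in this article (the dictionary keys are illustrative labels):

```python
# Reporting timelines in days, as quoted in this article
DEADLINES_DAYS = {
    "EU AI Act serious incident": 15,
    "EASA occurrence reporting": 3,      # 72 hours
    "MDR/IVDR vigilance": 15,
    "UNECE field safety (automotive)": 30,
}


def binding_deadline(applicable: list[str]) -> int:
    """The most restrictive (shortest) timeline governs the response plan."""
    return min(DEADLINES_DAYS[name] for name in applicable)


# An aviation AI provider is bound by the 72-hour EASA window
print(binding_deadline(["EU AI Act serious incident",
                        "EASA occurrence reporting"]))  # → 3
```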

CLOUD Act Intersection for Dual-Regulated AI

Dual-regulated AI systems often involve sensitive compliance data that creates compounded CLOUD Act exposure. In aerospace and aviation AI, that means certification artifacts and safety occurrence data; in automotive AI, type-approval files and crash test results. Hosting any of this with a provider subject to US jurisdiction exposes it to compelled disclosure.
Recommended approach: Host all dual-regulated technical documentation, test artifacts, and post-market monitoring data on EU-sovereign infrastructure outside CLOUD Act reach. Particularly critical for aviation safety data and automotive crash test results.

Python Implementation: SectorRegulationComplianceMapper

from dataclasses import dataclass, field
from enum import Enum
from typing import Optional


class SectorRegulation(Enum):
    MOTOR_VEHICLES = "Regulation (EU) 2018/858"
    VEHICLE_TYPE_APPROVAL = "Regulation (EU) 2019/2144"
    AGRICULTURAL_VEHICLES = "Regulation (EU) 167/2013"
    TWO_WHEEL_VEHICLES = "Regulation (EU) 168/2013"
    CIVIL_AVIATION = "Regulation (EU) 2018/1139"
    AVIATION_SECURITY = "Regulation (EC) 300/2008"
    MACHINERY = "Regulation (EU) 2023/1230"
    MEDICAL_DEVICES = "Regulation (EU) 2017/745"
    IVDR = "Regulation (EU) 2017/746"
    RADIO_EQUIPMENT = "Directive 2014/53/EU"
    NONE = "No sector regulation applicable"


class ConformityAssessmentPath(Enum):
    SECTOR_COVERS_AI_ACT = "Art.43(2): Sector assessment covers EU AI Act"
    PARALLEL_ASSESSMENTS = "Art.43(1): Separate EU AI Act + sector assessments"
    SELF_ASSESSMENT = "Self-assessment (Annex VI internal control procedure)"
    NOT_HIGH_RISK = "Not a high-risk AI system"


@dataclass
class AISystemProfile:
    name: str
    sector_regulation: SectorRegulation
    requires_third_party_sector_assessment: bool
    annex_iii_category: Optional[str] = None  # e.g., "Cat.2 Critical Infrastructure"
    has_safety_function: bool = False
    involves_biometrics: bool = False
    publicly_accessible_deployment: bool = False


@dataclass
class ComplianceMapping:
    profile: AISystemProfile
    is_high_risk_annex_i: bool = False
    is_high_risk_annex_iii: bool = False
    conformity_path: ConformityAssessmentPath = ConformityAssessmentPath.NOT_HIGH_RISK
    risk_flags: list[str] = field(default_factory=list)
    documentation_gaps: list[str] = field(default_factory=list)
    incident_reporting_deadline_days: int = 15


class SectorRegulationComplianceMapper:
    def __init__(self):
        self.annex_i_regulations = {
            SectorRegulation.MOTOR_VEHICLES,
            SectorRegulation.VEHICLE_TYPE_APPROVAL,
            SectorRegulation.AGRICULTURAL_VEHICLES,
            SectorRegulation.TWO_WHEEL_VEHICLES,
            SectorRegulation.CIVIL_AVIATION,
            SectorRegulation.MACHINERY,
            SectorRegulation.MEDICAL_DEVICES,
            SectorRegulation.IVDR,
            SectorRegulation.RADIO_EQUIPMENT,
        }
        # Most restrictive incident reporting timelines per sector
        self.sector_incident_deadlines = {
            SectorRegulation.CIVIL_AVIATION: 3,      # 72 hours (EASA)
            SectorRegulation.MEDICAL_DEVICES: 15,    # 15 calendar days (MDR Art.87)
            SectorRegulation.IVDR: 15,
            SectorRegulation.MOTOR_VEHICLES: 30,     # ~30 days (UNECE field safety)
            SectorRegulation.VEHICLE_TYPE_APPROVAL: 30,
            SectorRegulation.MACHINERY: 15,          # Defaults to AI Act 15-day deadline
        }

    def map(self, profile: AISystemProfile) -> ComplianceMapping:
        mapping = ComplianceMapping(profile=profile)

        # Step 1: Check Annex I high-risk classification
        if (
            profile.sector_regulation in self.annex_i_regulations
            and profile.requires_third_party_sector_assessment
        ):
            mapping.is_high_risk_annex_i = True

        # Step 2: Check Annex III high-risk classification
        if profile.annex_iii_category:
            mapping.is_high_risk_annex_iii = True

        # Step 3: Determine conformity assessment path
        if not mapping.is_high_risk_annex_i and not mapping.is_high_risk_annex_iii:
            mapping.conformity_path = ConformityAssessmentPath.NOT_HIGH_RISK
        elif mapping.is_high_risk_annex_i:
            # Art.43(2): Sector assessment covers AI Act
            mapping.conformity_path = ConformityAssessmentPath.SECTOR_COVERS_AI_ACT
        elif mapping.is_high_risk_annex_iii:
            mapping.conformity_path = ConformityAssessmentPath.SELF_ASSESSMENT

        # Step 4: Assess risk flags
        if profile.involves_biometrics and profile.publicly_accessible_deployment:
            mapping.risk_flags.append(
                "CRITICAL: Real-time remote biometric identification in public space — "
                "Art.5 prohibition applies unless specific law enforcement exception"
            )
        if profile.sector_regulation == SectorRegulation.AVIATION_SECURITY:
            mapping.risk_flags.append(
                "AUDIT: Review against Art.5 prohibited practices — emotion recognition "
                "and passenger profiling AI common in aviation security may be prohibited"
            )

        # Step 5: Identify documentation gaps
        if mapping.is_high_risk_annex_i and mapping.is_high_risk_annex_iii:
            mapping.documentation_gaps.append(
                "Both Annex I and Annex III apply: Sector assessment (Art.43(2)) takes "
                "precedence for conformity, but document BOTH classification bases"
            )
        if profile.sector_regulation == SectorRegulation.CIVIL_AVIATION:
            mapping.documentation_gaps.append(
                "Map DO-178C/DO-254 artifacts to Annex IV EU AI Act requirements — "
                "create cross-reference table for notified body review"
            )

        # Step 6: Set incident reporting deadline (most restrictive)
        sector_deadline = self.sector_incident_deadlines.get(
            profile.sector_regulation, 15
        )
        mapping.incident_reporting_deadline_days = min(sector_deadline, 15)

        return mapping

    def generate_report(self, mapping: ComplianceMapping) -> str:
        p = mapping.profile
        lines = [
            f"=== Dual Compliance Report: {p.name} ===",
            f"Sector Regulation: {p.sector_regulation.value}",
            f"",
            f"HIGH-RISK CLASSIFICATION:",
            f"  Annex I pathway: {'YES' if mapping.is_high_risk_annex_i else 'NO'}",
            f"  Annex III pathway: {'YES — ' + p.annex_iii_category if mapping.is_high_risk_annex_iii else 'NO'}",
            f"",
            f"CONFORMITY ASSESSMENT PATH: {mapping.conformity_path.value}",
            f"INCIDENT REPORTING DEADLINE: {mapping.incident_reporting_deadline_days} days",
            f"",
        ]
        if mapping.risk_flags:
            lines.append("RISK FLAGS:")
            for flag in mapping.risk_flags:
                lines.append(f"  ⚠ {flag}")
            lines.append("")
        if mapping.documentation_gaps:
            lines.append("DOCUMENTATION ACTIONS:")
            for gap in mapping.documentation_gaps:
                lines.append(f"  → {gap}")
        return "\n".join(lines)


# Example usage
mapper = SectorRegulationComplianceMapper()

# Automotive ADAS system
adas_profile = AISystemProfile(
    name="Pedestrian Detection AEB System",
    sector_regulation=SectorRegulation.VEHICLE_TYPE_APPROVAL,
    requires_third_party_sector_assessment=True,
    annex_iii_category="Cat.2 Critical Infrastructure (Transport)",
    has_safety_function=True,
)
adas_mapping = mapper.map(adas_profile)
print(mapper.generate_report(adas_mapping))
# → Annex I: YES, Annex III: YES, Path: Art.43(2) sector covers AI Act

# Airport baggage screening AI
baggage_profile = AISystemProfile(
    name="Automated Threat Detection AI (X-ray)",
    sector_regulation=SectorRegulation.AVIATION_SECURITY,
    requires_third_party_sector_assessment=True,
    annex_iii_category="Cat.6 Law Enforcement (threat detection)",
    has_safety_function=True,
    publicly_accessible_deployment=True,
)
baggage_mapping = mapper.map(baggage_profile)
print(mapper.generate_report(baggage_mapping))
# → Annex I: NO (Reg 300/2008 is not Annex I legislation), Annex III: YES,
#   AUDIT flag: Art.5 prohibited practices review

Art.5 Prohibited Practice Audit for Sector AI

Before assuming dual-compliance is your only concern, verify your AI system does not constitute a prohibited practice under Art.5. Several sector-common AI applications are at risk:

| Sector AI Application | Art.5 Risk | Analysis |
| --- | --- | --- |
| Emotion recognition at airport boarding | HIGH | Art.5(1)(f) prohibits emotion recognition in professional/public contexts — narrow transport safety exception contested |
| Biometric passenger categorisation (nationality, age profiling) | HIGH | Art.5(1)(b) prohibits biometric categorisation inferring sensitive attributes |
| Real-time facial recognition at airport terminals (law enforcement) | CONDITIONAL | Art.5(1)(h) — law enforcement exception available but with strict conditions (judicial authorisation, specific offences) |
| Social scoring for passenger risk profiles | HIGH | Art.5(1)(c) general social scoring prohibition — aviation security profiling must be crime/threat-specific, not general behaviour |
| Driver drowsiness detection in vehicles | LOW | Mandatory under 2019/2144, scope limited to driving safety — not a general emotion inference system |
| Predictive maintenance AI in aircraft | NONE | Maintenance scheduling not in prohibited categories |
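The audit table can be operationalised as a pre-screening step run before any dual-compliance analysis. A sketch in which the application keys and the mapping mirror the table above but are illustrative:

```python
# (risk level, Art.5 provision) per application; None means no provision applies
ART5_RISK = {
    "emotion_recognition_boarding": ("HIGH", "Art.5(1)(f)"),
    "biometric_categorisation_sensitive": ("HIGH", "Art.5(1)(b)"),
    "realtime_facial_recognition_le": ("CONDITIONAL", "Art.5(1)(h)"),
    "social_scoring_passengers": ("HIGH", "Art.5(1)(c)"),
    "driver_drowsiness_detection": ("LOW", None),
    "predictive_maintenance": ("NONE", None),
}


def art5_audit(applications: list[str]) -> list[str]:
    """Flag applications needing legal review before relying on
    high-risk compliance alone; unknown applications are flagged too."""
    findings = []
    for app in applications:
        level, provision = ART5_RISK.get(app, ("UNKNOWN", None))
        if level in ("HIGH", "CONDITIONAL", "UNKNOWN"):
            ref = f" ({provision})" if provision else ""
            findings.append(f"{app}: {level}{ref}; legal review required")
    return findings


for finding in art5_audit(["emotion_recognition_boarding",
                           "predictive_maintenance"]):
    print(finding)
# → emotion_recognition_boarding: HIGH (Art.5(1)(f)); legal review required
```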

30-Item Dual Compliance Readiness Checklist

CLASSIFICATION AND SCOPE

CONFORMITY ASSESSMENT

DOCUMENTATION CONSOLIDATION

DECLARATION AND MARKING

POST-MARKET AND INCIDENT REPORTING

SUPPLY CHAIN

PROHIBITED PRACTICES AND SPECIAL RISKS

ONGOING COMPLIANCE

See Also