2026-04-14·15 min read·sota.io team

EU AI Act Art.107: Amendments to Other EU Legislation — Cross-Regulatory Integration Developer Guide (2026)

The EU AI Act does not exist in a vacuum. EU product safety law covers medical devices, machinery, vehicles, radio equipment, toys, and dozens of other product categories through sector-specific regulations and directives. Many AI systems are embedded in these products as safety-critical components. When the AI Act entered into force in August 2024, it created a new horizontal compliance obligation that intersects with — and in some cases supersedes — existing sector-specific requirements.

Article 107 is the mechanism through which the EU AI Act formally integrates into this existing legislative framework. It amends other EU legislation to acknowledge the AI Act's authority, clarify which requirements apply when two regulatory regimes overlap, and establish the legal bridges that allow developers, notified bodies, and market surveillance authorities to navigate multi-regulated AI products coherently.

The practical implication for developers: If your AI system is a safety component in a product that is already subject to EU harmonization legislation — a medical imaging algorithm inside an MDR-regulated device, a collision avoidance system inside a vehicle subject to type-approval, a machine learning quality control module inside CE-marked industrial machinery — Art.107 determines how your compliance obligations stack. In some configurations, conformity assessment under one regime reduces your burden under the other. In others, you face independent parallel obligations. Understanding the framework Art.107 creates is not optional for dual-regulated AI developers.

The Cross-Regulatory Integration Problem

EU product regulation is built on a sector-specific model. The Medical Devices Regulation governs medical devices. The Machinery Regulation governs machinery. The Radio Equipment Directive governs radio equipment. Each has its own conformity assessment procedures, notified bodies, technical documentation requirements, and market surveillance frameworks.

The EU AI Act is a horizontal regulation — it applies across all sectors. It does not care whether your AI system is embedded in a medical device, a car, or an industrial robot. If it meets the criteria for being a high-risk AI system under Article 6 and Annex III, the AI Act requirements apply.

The problem this creates: Before Art.107, there was no clear legal answer to questions like:

  1. Does a conformity assessment under the sector regulation count toward AI Act conformity, or must the AI system be assessed twice?
  2. Which notified body is competent when a product falls under both regimes?
  3. Can technical documentation and quality management systems be consolidated, or must they be maintained in parallel?
  4. How do the post-market surveillance obligations of the two regimes interact?

Art.107 addresses these questions by establishing the legal framework within which Annex I harmonization legislation and the AI Act coexist.

The Annex I Foundation

The key to understanding Art.107 is understanding Annex I to the EU AI Act. Annex I lists the EU harmonization legislation under which an AI system can qualify as a high-risk AI system through the Article 6(1) pathway.

The Article 6(1) pathway: Under Art.6(1), an AI system is high-risk if:

  1. It is a safety component of a product covered by Annex I legislation, OR it is itself a product covered by Annex I legislation, AND
  2. That product (or the AI system itself, where the AI system is the product) is required to undergo a third-party conformity assessment under the applicable Annex I legislation

This is a distinct pathway from Art.6(2) + Annex III, which covers high-risk AI systems in specified application areas regardless of product categorisation. The Annex I pathway is specifically for AI embedded in regulated products.
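The two cumulative conditions above reduce to a small predicate. A minimal sketch in Python (parameter names are illustrative):

```python
def is_high_risk_under_art6_1(
    is_safety_component_of_annex_i_product: bool,
    is_itself_annex_i_product: bool,
    third_party_assessment_required: bool,
) -> bool:
    """Apply the two cumulative Art.6(1) conditions (simplified sketch)."""
    # Condition 1: safety component of an Annex I product, OR itself such a product
    product_link = is_safety_component_of_annex_i_product or is_itself_annex_i_product
    # Condition 2: third-party conformity assessment required under that legislation
    return product_link and third_party_assessment_required
```

Note that both conditions must hold: an AI safety component in an Annex I product that only needs self-assessment does not enter the Art.6(1) pathway.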

What Annex I covers: The EU AI Act's Annex I lists EU harmonization legislation across multiple product categories. The most developer-relevant instruments include:

  1. Regulation (EU) 2017/745 (Medical Devices Regulation) and Regulation (EU) 2017/746 (IVDR)
  2. Regulation (EU) 2023/1230 (Machinery Regulation)
  3. Directive 2014/53/EU (Radio Equipment Directive)
  4. Regulation (EU) 2018/858 and related vehicle type-approval legislation (Annex I, Section B)
  5. Directive 2009/48/EC (Toy Safety) and Regulation (EU) 2016/425 (PPE)

The significance of Annex I is not just that it creates the Art.6(1) pathway — it is that it identifies the regulatory regimes where Art.107 amendments are most consequential. For each Annex I instrument, the AI Act needed to establish what happens when an AI system must comply with both frameworks.

Medical Devices and AI Act — Dual Compliance Architecture

Medical devices represent the highest-stakes intersection between the EU AI Act and existing EU harmonization legislation. Regulation (EU) 2017/745 (MDR) and Regulation (EU) 2017/746 (IVDR) already require extensive clinical evaluation, technical documentation, quality management systems, and post-market surveillance for AI-based medical devices — obligations that substantially overlap with the AI Act's Art.8-15 requirements.

What Art.107 establishes for MDR/IVDR intersection:

The AI Act recognises that MDR/IVDR conformity assessment already addresses many of the AI Act's core requirements for high-risk AI. Where a notified body has assessed an AI-based medical device under MDR and found it compliant with MDR's safety and performance requirements, this assessment is relevant evidence for AI Act conformity — but it does not automatically substitute for it.

The compliance architecture for AI-based medical devices:

  1. Class IIa, IIb, III devices (MDR) / Class B, C, D (IVDR): These require third-party conformity assessment under their respective regulations, which triggers the Art.6(1) high-risk AI pathway. The AI Act requirements in Art.8-15 apply in addition to MDR/IVDR requirements.

  2. Single notified body option: Where the same notified body is authorised under both MDR and the AI Act for the relevant product category, a single body can perform an integrated assessment. The AI Office has been working with national competent authorities to establish joint assessment protocols.

  3. Documentation consolidation: The technical documentation required under MDR (Annex II) and the AI Act (Art.11 + Annex IV) can be maintained as a single consolidated document structure, with MDR-specific sections and AI Act-specific sections organised to avoid duplication while remaining separable for regulatory review.

  4. QMS alignment: MDR Article 10 requires quality management systems for device manufacturers. The AI Act's Art.17 requires a quality management system for high-risk AI providers. These requirements substantially overlap. MDR-compliant QMS implementations that cover AI lifecycle management can satisfy Art.17 without independent AI Act QMS certification, provided the AI Act's specific requirements for training data governance (Art.10), human oversight (Art.14), and accuracy/robustness (Art.15) are explicitly addressed in the QMS documentation.

Developer implication: If you are developing an AI-based medical device, your MDR/IVDR conformity assessment process should be designed from the outset to simultaneously address AI Act requirements. Retrofitting AI Act compliance onto an MDR-certified device is significantly more expensive than building integrated compliance from the start. The key obligations the AI Act adds beyond MDR: transparency obligations for AI-generated diagnostic recommendations (Art.13), explicit human oversight mechanisms (Art.14), and accuracy/robustness validation (Art.15) that goes beyond MDR's performance evaluation.
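For planning purposes, the MDR/AI Act overlap can be tracked as a simple gap map. The coverage flags below are illustrative planning assumptions drawn from the discussion above, not a legal determination:

```python
# Illustrative coverage map (a planning assumption, not legal advice):
# True  = MDR conformity work typically provides substantial coverage
# False = AI Act-specific work is needed on top of the MDR file
MDR_SUBSTANTIAL_COVERAGE = {
    "Art.9 risk management": True,        # MDR risk files / clinical evaluation
    "Art.10 data governance": False,      # training data governance is AI-specific
    "Art.11 technical documentation": True,
    "Art.13 transparency": False,         # AI-generated recommendation disclosure
    "Art.14 human oversight": False,      # explicit oversight mechanisms
    "Art.15 accuracy/robustness": True,   # overlaps MDR performance evaluation
}

def mdr_gap_items(coverage: dict[str, bool]) -> list[str]:
    """AI Act requirements needing explicit work beyond the MDR file."""
    return [req for req, covered in coverage.items() if not covered]
```

Running `mdr_gap_items(MDR_SUBSTANTIAL_COVERAGE)` surfaces the AI-specific work packages to budget alongside the MDR submission.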

Machinery Regulation and AI Act

Regulation (EU) 2023/1230 — the new Machinery Regulation — replaced the Machinery Directive (2006/42/EC) and introduced significant new provisions for AI-enabled machinery. The Machinery Regulation and EU AI Act entered their implementation periods in overlapping timeframes, and Art.107 establishes how they interact.

The key interaction point: Under the Machinery Regulation, machines with AI-based safety functions must meet the regulation's essential health and safety requirements (EHSRs) for safety-relevant control systems. These include requirements for reliability, fault tolerance, and predictability — which substantially overlap with the AI Act's Art.15 requirements for accuracy, robustness, and cybersecurity.

Conformity assessment under dual regulation:

Machinery with AI safety functions that requires third-party conformity assessment under Annex I of the Machinery Regulation is automatically in scope for Art.6(1) high-risk AI classification. The conformity assessment can be integrated:

  1. Harmonised standards pathway: Where harmonised standards exist under both the Machinery Regulation and the AI Act, compliance with those standards creates a presumption of conformity under both regulations. A machine manufacturer that demonstrates compliance with relevant machinery safety standards and AI Act harmonised standards through a single technical file satisfies both assessment requirements.

  2. Self-declaration for lower-risk machinery: Machinery with AI components that is not in the Machinery Regulation's Annex I categories can use the manufacturer's self-declaration pathway — provided AI Act requirements are documented in the declaration of conformity.

Autonomous and collaborative robots: Collaborative robots (cobots) and autonomous mobile robots (AMRs) represent the highest-density intersection of Machinery Regulation and AI Act requirements. Both regulations require documentation of the AI system's limitations, the conditions under which safety functions may fail, and the human oversight mechanisms in place. For cobot developers, Art.107's cross-regulatory framework means that your robot's technical file must address both sets of requirements — and that your notified body (if required) must be authorised under both the Machinery Regulation and the AI Act for your product category.
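The conformity routes above can be sketched as a small helper. The two-input model and return strings are simplifications for illustration:

```python
def machinery_conformity_route(
    in_machinery_annex_i: bool,
    ai_act_requirements_documented: bool,
) -> str:
    """Simplified route selection under Regulation (EU) 2023/1230 + AI Act."""
    if in_machinery_annex_i:
        # Third-party machinery assessment also triggers Art.6(1) high-risk status
        return "third-party assessment (dual notified-body authorisation needed)"
    if ai_act_requirements_documented:
        return "self-declaration (AI Act requirements documented in the DoC)"
    return "self-declaration blocked: document AI Act requirements first"
```

The point of the third branch: the self-declaration pathway is only available once the AI Act requirements are actually addressed in the declaration of conformity.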

Radio Equipment Directive and AI Act

Directive 2014/53/EU (Radio Equipment Directive — RED) covers a broad range of wireless and connected devices. The intersection with the AI Act is particularly relevant for:

  1. Smart home devices with embedded AI assistants
  2. Connected wearables with AI-based health monitoring functions
  3. IoT devices that collect and process personal data using on-device or cloud AI

The RED-AI Act intersection: RED already requires that radio equipment be constructed to protect privacy, health, and the safety of users. Recent RED amendments specifically extended these requirements to devices that collect and process personal data — a category that includes most AI-powered consumer devices.

The AI Act does not generally classify consumer AI assistants in smart home devices as high-risk (unless they fall within specific Annex III categories). But for RED-covered devices that do fall within Annex III — for example, AI-based health monitoring wearables that make diagnostic assessments — the dual compliance architecture applies.

Developer implication for RED-covered AI devices: RED's privacy, health, and safety requirements apply regardless of AI Act classification. Check whether your device's AI function falls within an Annex III category; if it does, the dual compliance architecture applies and your declaration of conformity must reference both instruments.

Vehicle Type-Approval and AI Act

Regulation (EU) 2018/858 (vehicle type-approval) and related regulations including Regulation (EU) 2019/2144 (General Safety Regulation for vehicles) directly intersect with the AI Act for automotive AI systems.

The automotive AI Act landscape: AI systems in vehicles that are safety components — ADAS (Advanced Driver Assistance Systems), autonomous emergency braking, lane-keeping systems, adaptive cruise control — are high-risk AI via the Art.6(1) pathway: vehicle type-approval legislation sits in Section B of Annex I, and the General Safety Regulation makes several ADAS functions mandatory. For Section B products, the AI Act's requirements reach manufacturers through the sector framework itself rather than through direct application of the high-risk chapter.

Type-approval and AI Act integration:

The EU type-approval framework for vehicles operates through UN Economic Commission for Europe (UN/ECE) regulations (particularly UN Regulation No. 155 on Cybersecurity and UN Regulation No. 156 on Software Update Management) that are incorporated into EU law. The AI Act's requirements for ADAS and autonomous driving AI overlap significantly with UN/ECE requirements.

Art.107's amendment of Regulation (EU) 2018/858 establishes that:

  1. AI systems in vehicles that are safety components subject to type-approval undergo conformity assessment within the type-approval process
  2. The technical documentation produced for type-approval (including ADAS validation documentation) satisfies a substantial part of the AI Act's Art.11 technical documentation requirement
  3. Post-market monitoring obligations under the AI Act (Art.72) align with the ongoing monitoring required under type-approval frameworks

What this means for automotive AI developers:

If you are developing ADAS or autonomous driving AI for vehicles subject to EU type-approval, your compliance pathway runs through the type-approval process: you do not face a parallel, independent AI Act conformity assessment. The AI Act's substantive requirements reach you through the type-approval framework, and your type-approval documentation carries the bulk of the AI Act evidence burden.

For automotive AI suppliers (Tier 1, Tier 2) who supply AI components to OEMs rather than selling directly to consumers, the obligation structure is different: you are an AI system provider under the AI Act but typically not the vehicle manufacturer for type-approval purposes. Your obligations include: technical documentation for the AI component, cooperation with the OEM on conformity assessment, and post-market monitoring of AI performance within the vehicle system.
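The split of obligations between OEMs and tier suppliers can be captured as a lookup table. The role keys and phrasing are illustrative, summarising the obligations listed above:

```python
# Hypothetical role map for automotive AI supply chains (illustrative)
ROLE_OBLIGATIONS: dict[str, list[str]] = {
    "oem": [
        "vehicle type-approval (including ADAS validation documentation)",
        "AI Act conformity demonstrated within the type-approval process",
        "post-market monitoring of the vehicle system",
    ],
    "tier_supplier": [
        "technical documentation for the AI component",
        "cooperation with the OEM on conformity assessment",
        "post-market monitoring of AI performance within the vehicle system",
    ],
}

def obligations_for(role: str) -> list[str]:
    """Look up the obligation set for a supply-chain role."""
    return ROLE_OBLIGATIONS[role.lower()]
```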

Conformity Assessment Interaction

The central question Art.107 addresses for dual-regulated AI systems is: how do conformity assessment obligations interact across two regulatory regimes?

Three scenarios:

Scenario 1: Integrated assessment. Where a notified body is authorised under both the sector-specific regulation and the AI Act for the same product category, a single integrated conformity assessment procedure is possible. The assessment covers both sets of requirements in a single audit, producing a single assessment report that satisfies both regimes. This is the most efficient pathway and the one the Commission is encouraging notified bodies to develop capacity for.

Scenario 2: Sequential assessment. Where different bodies are authorised for each regime, or where the developer chooses to use separate bodies, two sequential assessments occur. The output of the first assessment (e.g., MDR assessment) becomes an input to the second (AI Act assessment). This avoids duplication but requires clear documentation of what each assessment covers.

Scenario 3: Presumption of conformity. For AI systems covered by harmonised standards adopted under both the sector-specific regulation and the AI Act, compliance with those standards creates a presumption of conformity under both. This pathway is available where applicable harmonised standards exist — and is being developed for key sectors including medical devices (through the work of CEN/CENELEC) and industrial machinery.
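The three scenarios can be modelled as a simple selector. The preference ordering (presumption, then integrated, then sequential) is an assumption about which route is lightest for the developer, not a legal rule:

```python
from enum import Enum

class AssessmentScenario(Enum):
    INTEGRATED = "single dual-authorised body, one integrated audit"
    SEQUENTIAL = "two bodies; sector assessment feeds the AI Act assessment"
    PRESUMPTION = "harmonised standards give presumption under both regimes"

def select_scenario(
    dual_authorised_body_available: bool,
    harmonised_standards_cover_both: bool,
) -> AssessmentScenario:
    # Assumed ordering: standards-based presumption is the lightest route
    # where available, then an integrated audit, else sequential assessments.
    if harmonised_standards_cover_both:
        return AssessmentScenario.PRESUMPTION
    if dual_authorised_body_available:
        return AssessmentScenario.INTEGRATED
    return AssessmentScenario.SEQUENTIAL
```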

What conformity assessment cannot do: Sector-specific conformity assessment cannot substitute for AI Act requirements that have no counterpart in the sector regulation. AI Act-specific requirements that always need explicit treatment include:

  1. Training, validation, and testing data governance (Art.10)
  2. Transparency obligations toward deployers and affected persons (Art.13)
  3. Human oversight mechanisms (Art.14)

CE Marking for Dual-Regulated AI

When an AI system is subject to both the EU AI Act and sector-specific EU harmonization legislation, the CE marking requirement covers both. CE marking is a composite declaration — when you affix the CE mark to a product, you declare compliance with all applicable EU harmonization legislation.

Declaration of Conformity (DoC) structure for dual-regulated AI:

The DoC for a dual-regulated AI product must reference all applicable EU instruments. For an AI-based medical device, the DoC references:

  1. Regulation (EU) 2017/745 (MDR)
  2. Regulation (EU) 2024/1689 (AI Act)
  3. Any other applicable EU harmonization legislation (e.g. RED for connected devices)

The harmonised standards referenced in the DoC should cover both MDR requirements and AI Act requirements where applicable standards exist. Where AI Act harmonised standards are not yet published (which is the case for many requirements as of 2026), the DoC notes that the relevant AI Act requirements have been assessed using internal procedures.

Technical file vs. Declaration of Conformity: The technical file (maintained internally, not submitted to authorities unless requested) can be a single consolidated document. The Declaration of Conformity is the public-facing document that references applicable legislation. Both must reflect the dual-regulated status.
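A minimal sketch of assembling the DoC's instrument references, assuming the AI Act citation (Regulation (EU) 2024/1689) must always appear alongside the sector instruments:

```python
AI_ACT = "Regulation (EU) 2024/1689 (AI Act)"

def doc_instrument_references(sector_instruments: list[str]) -> list[str]:
    """Assemble the EU instrument list a dual-regulated DoC must cite."""
    refs = list(sector_instruments)
    if AI_ACT not in refs:
        refs.append(AI_ACT)  # CE marking is a composite declaration
    return refs
```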

CLOUD Act Intersection

AI systems that are dual-regulated under both the EU AI Act and sector-specific EU legislation typically generate substantial technical documentation: validation datasets, conformity assessment reports, clinical evaluation records (for medical devices), type-approval documentation (for vehicles), QMS records, and incident reports. All of this documentation is potentially subject to the US CLOUD Act if stored on US-controlled cloud infrastructure.

The compellability risk for dual-regulated AI: these document types are both commercially sensitive and, in the case of medical and automotive data, may contain personally identifiable information. Storage on EU-sovereign infrastructure is the recommended approach for dual-regulated AI technical files.

Key principle: Conformity assessment records for dual-regulated AI must be retained for the longer of the two retention periods required by the applicable regulations. MDR requires 15 years for implantable devices and 10 years for other device classes; the AI Act requires documentation to be kept for 10 years after the system is placed on the market (Art.18). The longer period governs.
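The longer-of rule is trivial to encode. A sketch with the AI Act's 10-year floor as the default:

```python
def required_retention_years(sector_years: int, ai_act_years: int = 10) -> int:
    """The longer of the two retention periods governs the technical file."""
    return max(sector_years, ai_act_years)
```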

Python: DualRegulationTracker

from dataclasses import dataclass, field
from enum import Enum

class RegulatoryRegime(Enum):
    MDR = "Regulation (EU) 2017/745 (Medical Devices)"
    IVDR = "Regulation (EU) 2017/746 (IVD)"
    MACHINERY = "Regulation (EU) 2023/1230 (Machinery)"
    RED = "Directive 2014/53/EU (Radio Equipment)"
    VEHICLE_TYPE_APPROVAL = "Regulation (EU) 2018/858 (Type-Approval)"
    TOY_SAFETY = "Directive 2009/48/EC (Toys)"
    PPE = "Regulation (EU) 2016/425 (PPE)"

class AIActHighRiskPathway(Enum):
    ANNEX_I = "Art.6(1) — Annex I product safety component"
    ANNEX_III = "Art.6(2) — Annex III application area"
    BOTH = "Art.6(1) + Art.6(2) — dual pathway"
    NOT_HIGH_RISK = "Not classified as high-risk"

@dataclass
class DualRegulationProfile:
    system_name: str
    sector_regime: RegulatoryRegime
    ai_act_pathway: AIActHighRiskPathway
    requires_third_party_sector_assessment: bool
    requires_third_party_ai_act_assessment: bool
    integrated_assessment_available: bool
    doc_retention_years: int
    notes: list[str] = field(default_factory=list)

    def compliance_complexity_score(self) -> int:
        """1 (low) to 5 (high) complexity rating"""
        score = 1
        if self.requires_third_party_sector_assessment:
            score += 1
        if self.requires_third_party_ai_act_assessment:
            score += 1
        if not self.integrated_assessment_available:
            score += 1
        if self.ai_act_pathway == AIActHighRiskPathway.BOTH:
            score += 1
        return min(score, 5)

def build_dual_regulation_profile(
    system_name: str,
    sector_regime: RegulatoryRegime,
    requires_third_party_sector: bool,
    annex_iii_application: bool = False,
) -> DualRegulationProfile:
    """
    Build a dual-regulation compliance profile for an AI system
    subject to both a sector-specific regulation and the EU AI Act.
    """
    # Determine AI Act high-risk pathway
    if requires_third_party_sector and annex_iii_application:
        pathway = AIActHighRiskPathway.BOTH
    elif requires_third_party_sector:
        pathway = AIActHighRiskPathway.ANNEX_I
    elif annex_iii_application:
        pathway = AIActHighRiskPathway.ANNEX_III
    else:
        pathway = AIActHighRiskPathway.NOT_HIGH_RISK

    # Determine if third-party AI Act assessment is required
    # Art.43(3): Annex I Section A products follow the sector conformity procedure,
    # so a third-party AI Act assessment applies when the sector regime requires one
    requires_ai_act_third_party = (
        pathway in (AIActHighRiskPathway.ANNEX_I, AIActHighRiskPathway.BOTH)
        and requires_third_party_sector
    )

    # Integrated assessment availability by sector
    integrated = {
        RegulatoryRegime.MDR: True,   # Coordinated notified body programmes developing
        RegulatoryRegime.IVDR: True,
        RegulatoryRegime.MACHINERY: True,
        RegulatoryRegime.RED: False,  # Less mature integration frameworks
        RegulatoryRegime.VEHICLE_TYPE_APPROVAL: True,  # Via type-approval process
        RegulatoryRegime.TOY_SAFETY: False,
        RegulatoryRegime.PPE: False,
    }.get(sector_regime, False)

    # Retention period by sector (years)
    retention = {
        RegulatoryRegime.MDR: 15,
        RegulatoryRegime.IVDR: 15,
        RegulatoryRegime.MACHINERY: 10,
        RegulatoryRegime.RED: 10,
        RegulatoryRegime.VEHICLE_TYPE_APPROVAL: 10,
        RegulatoryRegime.TOY_SAFETY: 10,
        RegulatoryRegime.PPE: 10,
    }.get(sector_regime, 10)

    # AI Act minimum retention: 10 years (Art.18)
    retention = max(retention, 10)

    notes = []
    if pathway == AIActHighRiskPathway.NOT_HIGH_RISK:
        notes.append("System may still face Art.50 transparency obligations")
    if not integrated:
        notes.append("No integrated assessment framework available — plan for sequential assessments")
    if pathway == AIActHighRiskPathway.BOTH:
        notes.append("Both Art.6(1) and Art.6(2) pathways apply — higher documentation burden")

    return DualRegulationProfile(
        system_name=system_name,
        sector_regime=sector_regime,
        ai_act_pathway=pathway,
        requires_third_party_sector_assessment=requires_third_party_sector,
        requires_third_party_ai_act_assessment=requires_ai_act_third_party,
        integrated_assessment_available=integrated,
        doc_retention_years=retention,
        notes=notes,
    )

def check_annex_i_coverage(sector_regime: RegulatoryRegime) -> dict:
    """Check which AI Act articles have Annex I specific provisions"""
    coverage = {
        "high_risk_pathway": "Art.6(1) — triggered if third-party assessment required",
        "conformity_assessment": "Art.43(3) — third-party assessment required if sector regulation requires it",
        "technical_documentation": "Art.11 — can be consolidated with sector-specific tech file",
        "qms": "Art.17 — can align with sector QMS (MDR Art.10, Machinery Regulation Annex IX)",
        "post_market_monitoring": "Art.72 — aligns with sector post-market surveillance obligations",
        "ce_marking": "CE mark covers both regimes — DoC must reference both instruments",
    }
    if sector_regime == RegulatoryRegime.MDR:
        coverage["specific_note"] = "MDR clinical evaluation satisfies Art.9 risk management aspects; Art.13/14 require additional AI-specific documentation"
    elif sector_regime == RegulatoryRegime.VEHICLE_TYPE_APPROVAL:
        coverage["specific_note"] = "Type-approval process subsumes AI Act conformity assessment for vehicle AI safety components"
    return coverage

# --- Example usage ---

if __name__ == "__main__":
    # AI-based medical imaging diagnostic system
    imaging_ai = build_dual_regulation_profile(
        system_name="RadiologyAI Diagnostic Module",
        sector_regime=RegulatoryRegime.MDR,
        requires_third_party_sector=True,  # Class IIb device
        annex_iii_application=False,
    )
    print(f"System: {imaging_ai.system_name}")
    print(f"Pathway: {imaging_ai.ai_act_pathway.value}")
    print(f"Third-party AI Act assessment: {imaging_ai.requires_third_party_ai_act_assessment}")
    print(f"Integrated assessment available: {imaging_ai.integrated_assessment_available}")
    print(f"Complexity score: {imaging_ai.compliance_complexity_score()}/5")
    print(f"Doc retention: {imaging_ai.doc_retention_years} years")
    print()

    # Automotive ADAS system
    adas = build_dual_regulation_profile(
        system_name="AutoPilot Lane Keeping System",
        sector_regime=RegulatoryRegime.VEHICLE_TYPE_APPROVAL,
        requires_third_party_sector=True,
        annex_iii_application=False,
    )
    print(f"System: {adas.system_name}")
    print(f"Pathway: {adas.ai_act_pathway.value}")
    print(f"Complexity score: {adas.compliance_complexity_score()}/5")
    print()

    # Check Annex I coverage for MDR
    coverage = check_annex_i_coverage(RegulatoryRegime.MDR)
    print("MDR + AI Act Annex I coverage map:")
    for k, v in coverage.items():
        print(f"  {k}: {v}")

30-Item Art.107 Cross-Regulatory Readiness Checklist

Sector Identification (1–5)

  1. Identify every Annex I instrument that applies to your product
  2. Determine whether your AI system is a safety component or itself the regulated product
  3. Check whether the sector regulation requires third-party conformity assessment
  4. Determine your high-risk pathway: Art.6(1), Art.6(2)/Annex III, or both
  5. Confirm whether your instrument sits in Annex I Section A or Section B (e.g. vehicles)

Dual Compliance Architecture (6–10)

  6. Choose between integrated, sequential, or presumption-of-conformity assessment
  7. Verify whether a notified body is authorised under both regimes for your product category
  8. Map overlapping requirements (sector essential requirements vs. AI Act Art.8-15)
  9. Identify AI Act requirements with no sector counterpart (Art.10, 13, 14)
  10. Plan the assessment sequence before engaging notified bodies

Technical Documentation (11–15)

  11. Design a single consolidated technical file with separable sector and AI Act sections
  12. Cover AI Act Annex IV content alongside the sector-specific annex requirements
  13. Document the AI system's limitations and the conditions under which safety functions may fail
  14. Record training, validation, and testing data governance (Art.10)
  15. Keep sections separable for review by either regulatory authority

Quality Management System (16–18)

  16. Align the sector QMS (e.g. MDR Art.10) with AI Act Art.17 requirements
  17. Explicitly address data governance, human oversight, and accuracy/robustness in QMS documentation
  18. Cover AI lifecycle management within the existing QMS

CE Marking and Declaration of Conformity (19–21)

  19. Reference all applicable EU instruments in the DoC
  20. Cite harmonised standards covering both regimes where they exist
  21. Note internal-procedure assessment where AI Act harmonised standards are not yet published

Post-Market Monitoring (22–24)

  22. Align AI Act Art.72 monitoring with sector post-market surveillance
  23. Establish incident reporting that covers both regimes
  24. Apply the longer of the two documentation retention periods

Supply Chain and Responsibility (25–27)

  25. Clarify provider vs. manufacturer roles (e.g. tier supplier vs. OEM)
  26. Define cooperation obligations for conformity assessment with downstream integrators
  27. Allocate post-market monitoring responsibilities contractually

CLOUD Act and Data Sovereignty (28–30)

  28. Inventory where conformity assessment and validation documentation is stored
  29. Assess CLOUD Act exposure for documentation on US-controlled cloud infrastructure
  30. Prefer EU-sovereign infrastructure for dual-regulated technical files

Further Reading