EU AI Act Art.107: Amendments to Other EU Legislation — Cross-Regulatory Integration Developer Guide (2026)
The EU AI Act does not exist in a vacuum. EU product safety law covers medical devices, machinery, vehicles, radio equipment, toys, and dozens of other product categories through sector-specific regulations and directives. Many AI systems are embedded in these products as safety-critical components. When the AI Act entered into force in August 2024, it created a new horizontal compliance obligation that intersects with — and in some cases supersedes — existing sector-specific requirements.
Article 107 is the mechanism through which the EU AI Act formally integrates into this existing legislative framework. It amends other EU legislation to acknowledge the AI Act's authority, clarify which requirements apply when two regulatory regimes overlap, and establish the legal bridges that allow developers, notified bodies, and market surveillance authorities to navigate multi-regulated AI products coherently.
The practical implication for developers: If your AI system is a safety component in a product that is already subject to EU harmonization legislation — a medical imaging algorithm inside an MDR-regulated device, a collision avoidance system inside a vehicle subject to type-approval, a machine learning quality control module inside CE-marked industrial machinery — Art.107 determines how your compliance obligations stack. In some configurations, conformity assessment under one regime reduces your burden under the other. In others, you face independent parallel obligations. Understanding the framework Art.107 creates is not optional for dual-regulated AI developers.
The Cross-Regulatory Integration Problem
EU product regulation is built on a sector-specific model. The Medical Devices Regulation governs medical devices. The Machinery Regulation governs machinery. The Radio Equipment Directive governs radio equipment. Each has its own conformity assessment procedures, notified bodies, technical documentation requirements, and market surveillance frameworks.
The EU AI Act is a horizontal regulation — it applies across all sectors. It does not care whether your AI system is embedded in a medical device, a car, or an industrial robot. If it meets the criteria for being a high-risk AI system under Article 6 and Annex III, the AI Act requirements apply.
The problem this creates: Before Art.107, there was no clear legal answer to questions like:
- If I have already passed MDR conformity assessment for my AI-based medical device, do I also need to pass AI Act conformity assessment?
- Can a single notified body perform both assessments, or do I need two?
- If my sector-specific regulation requires a quality management system, does that satisfy the AI Act's Art.17 QMS requirement?
- When CE marking is required under both the sector regulation and the AI Act, whose conformity declaration takes precedence?
Art.107 addresses these questions by establishing the legal framework within which Annex I harmonization legislation and the AI Act coexist.
The Annex I Foundation
The key to understanding Art.107 is understanding Annex I to the EU AI Act. Annex I lists the EU harmonization legislation under which an AI system can qualify as a high-risk AI system through the Article 6(1) pathway.
The Article 6(1) pathway: Under Art.6(1), an AI system is high-risk if:
- It is a safety component of a product covered by Annex I legislation, OR it is itself a product covered by Annex I legislation, AND
- That product — or its safety component — is required to undergo a third-party conformity assessment under the applicable Annex I legislation
This is a distinct pathway from Art.6(2) + Annex III, which covers high-risk AI systems in specified application areas regardless of product categorisation. The Annex I pathway is specifically for AI embedded in regulated products.
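The two-condition Art.6(1) test can be sketched as a small predicate. This is a simplified sketch for orientation only; the statutory test has nuances (and exemptions) it ignores:

```python
def is_high_risk_under_art_6_1(
    is_safety_component_of_annex_i_product: bool,
    is_itself_annex_i_product: bool,
    third_party_assessment_required: bool,
) -> bool:
    """Simplified Art.6(1) test: (safety component OR covered product)
    AND third-party conformity assessment required under Annex I law."""
    covered = is_safety_component_of_annex_i_product or is_itself_annex_i_product
    return covered and third_party_assessment_required


# A diagnostic triage algorithm that is a safety component of a Class IIb
# MDR device (third-party assessment required): high-risk via Art.6(1)
print(is_high_risk_under_art_6_1(True, False, True))   # True
# The same component in a self-declared Class I device: not via this pathway
print(is_high_risk_under_art_6_1(True, False, False))  # False
```

Note that a `False` result here only rules out the Art.6(1) pathway; the system may still be high-risk via Art.6(2) + Annex III.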
What Annex I covers: The EU AI Act's Annex I lists EU harmonization legislation across multiple product categories. The most developer-relevant instruments include:
- Regulation (EU) 2017/745 (Medical Devices Regulation — MDR) — AI-based diagnostic systems, surgical robots, patient monitoring AI
- Regulation (EU) 2017/746 (In Vitro Diagnostics Regulation — IVDR) — AI-driven diagnostic algorithms, companion diagnostic AI
- Regulation (EU) 2023/1230 (Machinery Regulation) — AI safety systems in industrial machinery, collaborative robots, autonomous manufacturing equipment
- Directive 2014/53/EU (Radio Equipment Directive — RED) — AI systems in connected devices and wireless equipment
- Regulation (EU) 2018/858 (Vehicle Type-Approval) — ADAS, autonomous driving components, AI safety systems in motor vehicles
- Directive 2009/48/EC (Toy Safety Directive) — AI in toys that interact with children
- Regulation (EU) 2016/425 (Personal Protective Equipment Regulation) — AI-enhanced safety equipment
The significance of Annex I is not just that it creates the Art.6(1) pathway — it is that it identifies the regulatory regimes where Art.107 amendments are most consequential. For each Annex I instrument, the AI Act needed to establish what happens when an AI system must comply with both frameworks.
Medical Devices and AI Act — Dual Compliance Architecture
Medical devices represent the highest-stakes intersection between the EU AI Act and existing EU harmonization legislation. Regulation (EU) 2017/745 (MDR) and Regulation (EU) 2017/746 (IVDR) already require extensive clinical evaluation, technical documentation, quality management systems, and post-market surveillance for AI-based medical devices — obligations that substantially overlap with the AI Act's Art.8-15 requirements.
What Art.107 establishes for MDR/IVDR intersection:
The AI Act recognises that MDR/IVDR conformity assessment already addresses many of the AI Act's core requirements for high-risk AI. Where a notified body has assessed an AI-based medical device under MDR and found it compliant with MDR's safety and performance requirements, this assessment is relevant evidence for AI Act conformity — but it does not automatically substitute for it.
The compliance architecture for AI-based medical devices:
- Class IIa, IIb, III devices (MDR) / Class B, C, D (IVDR): These require third-party conformity assessment under their respective regulations, which triggers the Art.6(1) high-risk AI pathway. The AI Act requirements in Art.8-15 apply in addition to MDR/IVDR requirements.
- Single notified body option: Where the same notified body is authorised under both MDR and the AI Act for the relevant product category, a single body can perform an integrated assessment. The AI Office has been working with national competent authorities to establish joint assessment protocols.
- Documentation consolidation: The technical documentation required under MDR (Annex II) and the AI Act (Art.11 + Annex IV) can be maintained as a single consolidated document structure, with MDR-specific sections and AI Act-specific sections organised to avoid duplication while remaining separable for regulatory review.
- QMS alignment: MDR Article 10 requires quality management systems for device manufacturers. The AI Act's Art.17 requires quality management systems for high-risk AI providers. These requirements substantially overlap. An MDR-compliant QMS that covers AI lifecycle management satisfies Art.17 requirements without independent AI Act QMS certification — but the AI Act's specific requirements for training data governance (Art.10), human oversight (Art.14), and accuracy/robustness (Art.15) must be explicitly addressed in the QMS documentation.
Developer implication: If you are developing an AI-based medical device, your MDR/IVDR conformity assessment process should be designed from the outset to simultaneously address AI Act requirements. Retrofitting AI Act compliance onto an MDR-certified device is significantly more expensive than building integrated compliance from the start. The key obligations the AI Act adds beyond MDR are transparency obligations for AI-generated diagnostic recommendations (Art.13), explicit human oversight mechanisms (Art.14), and accuracy/robustness validation (Art.15) that goes beyond MDR's clinical performance evaluation.
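The MDR/AI Act overlap described above can be tracked as a simple gap map. The coverage judgments below are illustrative assumptions for a typical device, not an authoritative mapping; real assessments vary by device class and intended purpose:

```python
# Illustrative gap map: which AI Act high-risk requirements a typical MDR
# conformity assessment already covers, and which need AI-Act-specific work.
# Coverage labels are assumptions for illustration, not legal conclusions.
MDR_GAP_MAP = {
    "Art.9 risk management": "largely covered (MDR risk management file)",
    "Art.10 data governance": "gap: document training data provenance and bias controls",
    "Art.11 technical documentation": "partially covered: add Annex IV sections",
    "Art.13 transparency": "gap: interpretable output for the clinical user",
    "Art.14 human oversight": "gap: explicit intervention and override points",
    "Art.15 accuracy/robustness": "partially covered by clinical performance evaluation",
}


def ai_act_gaps(gap_map: dict) -> list:
    """Return the requirements flagged as full or partial gaps."""
    return [req for req, status in gap_map.items()
            if "gap" in status or "partially" in status]


for req in ai_act_gaps(MDR_GAP_MAP):
    print(req)
```

A structure like this makes the checklist item "document which AI Act requirements are addressed by sector compliance" auditable rather than implicit.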
Machinery Regulation and AI Act
Regulation (EU) 2023/1230 — the new Machinery Regulation — replaced the Machinery Directive (2006/42/EC) and introduced significant new provisions for AI-enabled machinery. The Machinery Regulation and EU AI Act entered their implementation periods in overlapping timeframes, and Art.107 establishes how they interact.
The key interaction point: Under the Machinery Regulation, machines with AI-based safety functions must meet the regulation's essential health and safety requirements (EHSRs) for safety-relevant control systems. These include requirements for reliability, fault tolerance, and predictability — which substantially overlap with the AI Act's Art.15 requirements for accuracy, robustness, and cybersecurity.
Conformity assessment under dual regulation:
Machinery with AI safety functions that requires third-party conformity assessment under Annex I of the Machinery Regulation is automatically in scope for Art.6(1) high-risk AI classification. The conformity assessment can be integrated:
- Harmonised standards pathway: Where harmonised standards exist under both the Machinery Regulation and the AI Act, compliance with those standards creates a presumption of conformity under both regulations. A machine manufacturer that demonstrates compliance with relevant machinery safety standards and AI Act harmonised standards through a single technical file satisfies both assessment requirements.
- Self-declaration for lower-risk machinery: Machinery with AI components that is not in the Machinery Regulation's Annex I categories can use the manufacturer's self-declaration pathway — provided AI Act requirements are documented in the declaration of conformity.
Autonomous and collaborative robots: Collaborative robots (cobots) and autonomous mobile robots (AMRs) represent the highest-density intersection of Machinery Regulation and AI Act requirements. Both regulations require documentation of the AI system's limitations, the conditions under which safety functions may fail, and the human oversight mechanisms in place. For cobot developers, Art.107's cross-regulatory framework means that your robot's technical file must address both sets of requirements — and that your notified body (if required) must be authorised under both the Machinery Regulation and the AI Act for your product category.
Radio Equipment Directive and AI Act
Directive 2014/53/EU (Radio Equipment Directive — RED) covers a broad range of wireless and connected devices. The intersection with the AI Act is particularly relevant for:
- AI-powered consumer electronics with wireless connectivity
- Smart home devices with embedded AI (cameras, speakers, sensors)
- Industrial IoT devices with AI-based analytics
- Wearable health devices with AI features and wireless data transmission
The RED-AI Act intersection: RED already requires that radio equipment be constructed to protect privacy, health, and the safety of users. Recent RED amendments specifically extended these requirements to devices that collect and process personal data — a category that includes most AI-powered consumer devices.
The AI Act does not generally classify consumer AI assistants in smart home devices as high-risk (unless they fall within specific Annex III categories). But for RED-covered devices that do fall within Annex III — for example, AI-based health monitoring wearables that make diagnostic assessments — the dual compliance architecture applies.
Developer implication for RED-covered AI devices:
- Check whether your device's AI function brings it within any Annex III category (especially access to essential private services — Annex III, point 5 — which covers AI for creditworthiness assessment and for risk assessment and pricing in life and health insurance)
- If Annex III applies, coordinate conformity assessment: RED conformity assessment typically uses EU-harmonised standards (ETSI, etc.) while AI Act conformity assessment follows the Art.43 pathway
- Declaration of conformity must reference both the RED and the AI Act where both apply
- CE marking covers both — CE mark on a device indicates compliance with all applicable EU harmonization legislation
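The Declaration of Conformity rule in the last two bullets can be expressed as a small builder. This is a hypothetical helper; the instrument strings are illustrative and a real DoC has many more mandatory fields:

```python
def build_doc_references(sector_instruments: list, high_risk_ai: bool) -> list:
    """Assemble the EU instruments a Declaration of Conformity must cite
    for a dual-regulated product (hypothetical sketch)."""
    refs = list(sector_instruments)
    if high_risk_ai:
        # The AI Act reference is added only when the system is high-risk
        refs.append("Regulation (EU) 2024/1689 (EU AI Act)")
    return refs


# RED-covered health wearable whose AI falls within Annex III
print(build_doc_references(["Directive 2014/53/EU (RED)"], high_risk_ai=True))
# Smart speaker with a non-high-risk assistant: RED only
print(build_doc_references(["Directive 2014/53/EU (RED)"], high_risk_ai=False))
```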
Vehicle Type-Approval and AI Act
Regulation (EU) 2018/858 (vehicle type-approval) and related regulations including Regulation (EU) 2019/2144 (General Safety Regulation for vehicles) directly intersect with the AI Act for automotive AI systems.
The automotive AI Act landscape: AI systems in vehicles that are safety components — ADAS (Advanced Driver Assistance Systems), autonomous emergency braking, lane-keeping systems, adaptive cruise control — are classified as high-risk AI through the Art.6(1) Annex I pathway: Regulation (EU) 2018/858 is listed in Annex I, and the General Safety Regulation makes type-approved ADAS mandatory for new vehicles.
Type-approval and AI Act integration:
The EU type-approval framework for vehicles operates through UN Economic Commission for Europe (UN/ECE) regulations (particularly UN Regulation No. 155 on Cybersecurity and UN Regulation No. 156 on Software Update Management) that are incorporated into EU law. The AI Act's requirements for ADAS and autonomous driving AI overlap significantly with UN/ECE requirements.
Art.107's amendment of Regulation (EU) 2018/858 establishes that:
- AI systems in vehicles that are safety components subject to type-approval undergo conformity assessment within the type-approval process
- The technical documentation produced for type-approval (including ADAS validation documentation) satisfies a substantial part of the AI Act's Art.11 technical documentation requirement
- Post-market monitoring obligations under the AI Act (Art.72) align with the ongoing monitoring required under type-approval frameworks
What this means for automotive AI developers:
If you are developing ADAS or autonomous driving AI for vehicles subject to EU type-approval, your compliance pathway runs through the type-approval process. You do not face a parallel, independent AI Act conformity assessment process. Instead:
- Type-approval documentation must be structured to also address AI Act requirements
- The OEM's conformity of production (CoP) process addresses the AI Act's Art.72 post-market monitoring obligations
- Software updates (OTA and otherwise) are governed by UN R156 for type-approval and must also be assessed against the AI Act's Art.72 monitoring framework
For automotive AI suppliers (Tier 1, Tier 2) who supply AI components to OEMs rather than selling directly to consumers, the obligation structure is different: you are an AI system provider under the AI Act but typically not the vehicle manufacturer for type-approval purposes. Your obligations include: technical documentation for the AI component, cooperation with the OEM on conformity assessment, and post-market monitoring of AI performance within the vehicle system.
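The split of duties between a Tier supplier and the OEM described above can be made explicit in code. The duty strings are paraphrases of the obligations listed in this section, not statutory text:

```python
from dataclasses import dataclass


@dataclass
class AutomotiveActor:
    name: str
    supplies_ai_component: bool   # AI system provider under the AI Act
    holds_type_approval: bool     # vehicle manufacturer for type-approval


def obligations(actor: AutomotiveActor) -> list:
    """Sketch of how AI Act provider duties and type-approval duties split
    between a Tier supplier and the OEM (simplified paraphrase)."""
    duties = []
    if actor.supplies_ai_component:
        duties += [
            "AI Act provider: technical documentation for the AI component",
            "AI Act provider: cooperation with the OEM in conformity assessment",
            "AI Act provider: post-market monitoring of AI performance in the vehicle",
        ]
    if actor.holds_type_approval:
        duties += [
            "Type-approval holder: conformity of production (CoP)",
            "Type-approval holder: UN R155/R156 cybersecurity and software updates",
        ]
    return duties


for d in obligations(AutomotiveActor("Tier 1 ADAS supplier", True, False)):
    print(d)
```

The same entity can of course hold both roles, in which case both duty sets apply.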
Conformity Assessment Interaction
The central question Art.107 addresses for dual-regulated AI systems is: how do conformity assessment obligations interact across two regulatory regimes?
Three scenarios:
Scenario 1: Integrated assessment. Where a notified body is authorised under both the sector-specific regulation and the AI Act for the same product category, a single integrated conformity assessment procedure is possible. The assessment covers both sets of requirements in a single audit, producing a single assessment report that satisfies both regimes. This is the most efficient pathway and the one the Commission is encouraging notified bodies to develop capacity for.
Scenario 2: Sequential assessment. Where different bodies are authorised for each regime, or where the developer chooses to use separate bodies, two sequential assessments occur. The output of the first assessment (e.g., MDR assessment) becomes an input to the second (AI Act assessment). This avoids duplication but requires clear documentation of what each assessment covers.
Scenario 3: Presumption of conformity. For AI systems covered by harmonised standards adopted under both the sector-specific regulation and the AI Act, compliance with those standards creates a presumption of conformity under both. This pathway is available where applicable harmonised standards exist — and is being developed for key sectors including medical devices (through the work of CEN/CENELEC) and industrial machinery.
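The three scenarios above can be reduced to a small decision function. The precedence order shown (standards first, then a shared notified body, then sequential assessment) is a planning heuristic assumed here, not a statutory ranking:

```python
def assessment_scenario(shared_notified_body: bool,
                        dual_harmonised_standards: bool) -> str:
    """Pick the applicable conformity assessment scenario from the three
    described above (simplified planning heuristic)."""
    if dual_harmonised_standards:
        # Compliance with standards under both regimes: presumption of conformity
        return "Scenario 3: presumption of conformity via harmonised standards"
    if shared_notified_body:
        # One body authorised under both regimes: single integrated audit
        return "Scenario 1: integrated assessment by a dual-authorised body"
    # Otherwise: two assessments, the first report feeding the second
    return "Scenario 2: sequential assessments"


print(assessment_scenario(shared_notified_body=True,
                          dual_harmonised_standards=False))
```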
What conformity assessment cannot do: Sector-specific conformity assessment cannot substitute for AI Act requirements that have no counterpart in the sector regulation. AI Act-specific requirements that always require independent assessment include:
- Art.10(5-6): Training data governance documentation (no equivalent in most sector regulations)
- Art.13(1): Transparency obligation — AI system output must be interpretable by the intended user (medical device clinical evaluation does not fully satisfy this)
- Art.14: Human oversight measures — specific AI Act requirements for human intervention points and override capability
- Art.13(3)(b): Instructions for use covering AI-specific limitations and foreseeable misuse scenarios
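The four always-independent items above can be kept as a lookup so a compliance tool never marks them as sector-covered. The article keys follow this guide's own references; treat the mapping as illustrative:

```python
# AI Act requirements that sector-specific conformity assessment cannot
# substitute for, per the list above (illustrative lookup).
ALWAYS_INDEPENDENT = {
    "Art.10(5-6)": "training data governance documentation",
    "Art.13(1)": "output interpretability for the intended user",
    "Art.14": "human oversight and override capability",
    "Art.13(3)(b)": "instructions for use: AI-specific limitations and misuse",
}


def requires_independent_assessment(article: str) -> bool:
    """True if the article can never be satisfied by sector assessment alone."""
    return article in ALWAYS_INDEPENDENT


print(requires_independent_assessment("Art.14"))   # True
print(requires_independent_assessment("Art.9"))    # False: risk management may align with sector requirements
```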
CE Marking for Dual-Regulated AI
When an AI system is subject to both the EU AI Act and sector-specific EU harmonization legislation, the CE marking requirement covers both. CE marking is a composite declaration — when you affix the CE mark to a product, you declare compliance with all applicable EU harmonization legislation.
Declaration of Conformity (DoC) structure for dual-regulated AI:
The DoC for a dual-regulated AI product must reference all applicable EU instruments. For an AI-based medical device, the DoC references:
- Regulation (EU) 2017/745 (MDR)
- Regulation (EU) 2024/1689 (EU AI Act)
- Any other applicable directives (e.g., Directive 2014/35/EU low voltage, if relevant)
The harmonised standards referenced in the DoC should cover both MDR requirements and AI Act requirements where applicable standards exist. Where AI Act harmonised standards are not yet published (which is the case for many requirements as of 2026), the DoC notes that the relevant AI Act requirements have been assessed using internal procedures.
Technical file vs. Declaration of Conformity: The technical file (maintained internally, not submitted to authorities unless requested) can be a single consolidated document. The Declaration of Conformity is the public-facing document that references applicable legislation. Both must reflect the dual-regulated status.
CLOUD Act Intersection
AI systems that are dual-regulated under both the EU AI Act and sector-specific EU legislation typically generate substantial technical documentation: validation datasets, conformity assessment reports, clinical evaluation records (for medical devices), type-approval documentation (for vehicles), QMS records, incident reports. All of this documentation is potentially subject to the US CLOUD Act if stored on US-cloud infrastructure.
The compellability risk for dual-regulated AI:
- MDR/IVDR clinical evaluation data that includes patient information
- Automotive AI validation data from public road testing
- Industrial safety incident reports from machinery AI systems
These document types are both commercially sensitive and — in the case of medical and automotive data — contain personally identifiable information. Storage on EU-sovereign infrastructure is the recommended approach for dual-regulated AI technical files.
Key principle: Conformity assessment records for dual-regulated AI must be retained for the longer of the retention periods required by the applicable regulations. MDR requires at least 10 years after the last device is placed on the market (15 years for implantable devices). The AI Act requires 10 years (Art.18). The longer period governs.
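The longer-period-governs rule is a one-liner:

```python
AI_ACT_RETENTION_YEARS = 10  # Art.18 minimum for high-risk AI documentation


def retention_years(sector_years: int) -> int:
    """Longer-period-governs rule for dual-regulated technical files."""
    return max(sector_years, AI_ACT_RETENTION_YEARS)


# MDR implantable device documentation: the 15-year MDR period governs
print(retention_years(15))  # 15
# Machinery Regulation 10-year period ties with the AI Act minimum
print(retention_years(10))  # 10
```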
Python: DualRegulationTracker
```python
from dataclasses import dataclass, field
from enum import Enum


class RegulatoryRegime(Enum):
    MDR = "Regulation (EU) 2017/745 (Medical Devices)"
    IVDR = "Regulation (EU) 2017/746 (IVD)"
    MACHINERY = "Regulation (EU) 2023/1230 (Machinery)"
    RED = "Directive 2014/53/EU (Radio Equipment)"
    VEHICLE_TYPE_APPROVAL = "Regulation (EU) 2018/858 (Type-Approval)"
    TOY_SAFETY = "Directive 2009/48/EC (Toys)"
    PPE = "Regulation (EU) 2016/425 (PPE)"


class AIActHighRiskPathway(Enum):
    ANNEX_I = "Art.6(1) — Annex I product safety component"
    ANNEX_III = "Art.6(2) — Annex III application area"
    BOTH = "Art.6(1) + Art.6(2) — dual pathway"
    NOT_HIGH_RISK = "Not classified as high-risk"


@dataclass
class DualRegulationProfile:
    system_name: str
    sector_regime: RegulatoryRegime
    ai_act_pathway: AIActHighRiskPathway
    requires_third_party_sector_assessment: bool
    requires_third_party_ai_act_assessment: bool
    integrated_assessment_available: bool
    doc_retention_years: int
    notes: list[str] = field(default_factory=list)

    def compliance_complexity_score(self) -> int:
        """1 (low) to 5 (high) complexity rating."""
        score = 1
        if self.requires_third_party_sector_assessment:
            score += 1
        if self.requires_third_party_ai_act_assessment:
            score += 1
        if not self.integrated_assessment_available:
            score += 1
        if self.ai_act_pathway == AIActHighRiskPathway.BOTH:
            score += 1
        return min(score, 5)


def build_dual_regulation_profile(
    system_name: str,
    sector_regime: RegulatoryRegime,
    requires_third_party_sector: bool,
    annex_iii_application: bool = False,
) -> DualRegulationProfile:
    """
    Build a dual-regulation compliance profile for an AI system
    subject to both a sector-specific regulation and the EU AI Act.
    """
    # Determine AI Act high-risk pathway
    if requires_third_party_sector and annex_iii_application:
        pathway = AIActHighRiskPathway.BOTH
    elif requires_third_party_sector:
        pathway = AIActHighRiskPathway.ANNEX_I
    elif annex_iii_application:
        pathway = AIActHighRiskPathway.ANNEX_III
    else:
        pathway = AIActHighRiskPathway.NOT_HIGH_RISK

    # Determine if third-party AI Act assessment is required
    # Art.43(3): Annex I products follow the sector's third-party assessment
    requires_ai_act_third_party = (
        pathway in (AIActHighRiskPathway.ANNEX_I, AIActHighRiskPathway.BOTH)
        and requires_third_party_sector
    )

    # Integrated assessment availability by sector
    integrated = {
        RegulatoryRegime.MDR: True,        # Coordinated notified body programmes developing
        RegulatoryRegime.IVDR: True,
        RegulatoryRegime.MACHINERY: True,
        RegulatoryRegime.RED: False,       # Less mature integration frameworks
        RegulatoryRegime.VEHICLE_TYPE_APPROVAL: True,  # Via type-approval process
        RegulatoryRegime.TOY_SAFETY: False,
        RegulatoryRegime.PPE: False,
    }.get(sector_regime, False)

    # Retention period by sector (years); MDR/IVDR use the 15-year
    # implantable-device period as a conservative default
    retention = {
        RegulatoryRegime.MDR: 15,
        RegulatoryRegime.IVDR: 15,
        RegulatoryRegime.MACHINERY: 10,
        RegulatoryRegime.RED: 10,
        RegulatoryRegime.VEHICLE_TYPE_APPROVAL: 10,
        RegulatoryRegime.TOY_SAFETY: 10,
        RegulatoryRegime.PPE: 10,
    }.get(sector_regime, 10)

    # AI Act minimum retention: 10 years (Art.18)
    retention = max(retention, 10)

    notes = []
    if pathway == AIActHighRiskPathway.NOT_HIGH_RISK:
        notes.append("System may still face Art.50 transparency obligations")
    if not integrated:
        notes.append("No integrated assessment framework available — plan for sequential assessments")
    if pathway == AIActHighRiskPathway.BOTH:
        notes.append("Both Art.6(1) and Art.6(2) pathways apply — higher documentation burden")

    return DualRegulationProfile(
        system_name=system_name,
        sector_regime=sector_regime,
        ai_act_pathway=pathway,
        requires_third_party_sector_assessment=requires_third_party_sector,
        requires_third_party_ai_act_assessment=requires_ai_act_third_party,
        integrated_assessment_available=integrated,
        doc_retention_years=retention,
        notes=notes,
    )


def check_annex_i_coverage(sector_regime: RegulatoryRegime) -> dict:
    """Check which AI Act articles have Annex I specific provisions."""
    coverage = {
        "high_risk_pathway": "Art.6(1) — triggered if third-party assessment required",
        "conformity_assessment": "Art.43(3) — follows the sector's third-party assessment procedure",
        "technical_documentation": "Art.11 — can be consolidated with sector-specific tech file",
        "qms": "Art.17 — can align with sector QMS (MDR Art.10, Machinery Regulation)",
        "post_market_monitoring": "Art.72 — aligns with sector post-market surveillance obligations",
        "ce_marking": "CE mark covers both regimes — DoC must reference both instruments",
    }
    if sector_regime == RegulatoryRegime.MDR:
        coverage["specific_note"] = (
            "MDR clinical evaluation satisfies Art.9 risk management aspects; "
            "Art.13/14 require additional AI-specific documentation"
        )
    elif sector_regime == RegulatoryRegime.VEHICLE_TYPE_APPROVAL:
        coverage["specific_note"] = (
            "Type-approval process subsumes AI Act conformity assessment "
            "for vehicle AI safety components"
        )
    return coverage


# --- Example usage ---
if __name__ == "__main__":
    # AI-based medical imaging diagnostic system
    imaging_ai = build_dual_regulation_profile(
        system_name="RadiologyAI Diagnostic Module",
        sector_regime=RegulatoryRegime.MDR,
        requires_third_party_sector=True,  # Class IIb device
        annex_iii_application=False,
    )
    print(f"System: {imaging_ai.system_name}")
    print(f"Pathway: {imaging_ai.ai_act_pathway.value}")
    print(f"Third-party AI Act assessment: {imaging_ai.requires_third_party_ai_act_assessment}")
    print(f"Integrated assessment available: {imaging_ai.integrated_assessment_available}")
    print(f"Complexity score: {imaging_ai.compliance_complexity_score()}/5")
    print(f"Doc retention: {imaging_ai.doc_retention_years} years")
    print()

    # Automotive ADAS system
    adas = build_dual_regulation_profile(
        system_name="AutoPilot Lane Keeping System",
        sector_regime=RegulatoryRegime.VEHICLE_TYPE_APPROVAL,
        requires_third_party_sector=True,
        annex_iii_application=False,
    )
    print(f"System: {adas.system_name}")
    print(f"Pathway: {adas.ai_act_pathway.value}")
    print(f"Complexity score: {adas.compliance_complexity_score()}/5")
    print()

    # Check Annex I coverage for MDR
    coverage = check_annex_i_coverage(RegulatoryRegime.MDR)
    print("MDR + AI Act Annex I coverage map:")
    for k, v in coverage.items():
        print(f"  {k}: {v}")
```
30-Item Art.107 Cross-Regulatory Readiness Checklist
Sector Identification (1–5)
- 1. Identify all EU harmonization legislation that applies to the product category (MDR, Machinery, RED, vehicles, etc.)
- 2. Confirm whether each applicable instrument is listed in Annex I to the EU AI Act
- 3. Determine whether your AI system is a safety component of an Annex I product or is itself an Annex I product
- 4. Identify whether third-party conformity assessment is required under the applicable Annex I legislation
- 5. Map the applicable Annex III categories for your AI application area (Art.6(2) pathway, if any)
Dual Compliance Architecture (6–10)
- 6. Determine whether Art.6(1) alone, Art.6(2) alone, or both pathways classify your system as high-risk
- 7. Identify whether an integrated conformity assessment pathway is available for your sector + AI Act combination
- 8. Select notified body authorised under both the sector-specific regulation and the AI Act (where integrated assessment is pursued)
- 9. Map the requirements of each regulatory regime against each other — identify overlaps and gaps
- 10. Document explicitly which AI Act requirements (Art.8–15) are addressed by sector-specific compliance and which require additional measures
Technical Documentation (11–15)
- 11. Design a consolidated technical file structure covering both sector-specific and AI Act requirements
- 12. Include AI Act-specific sections required by Annex IV: training data governance, accuracy/robustness validation, human oversight mechanisms
- 13. Ensure technical documentation references applicable EU harmonised standards under both regimes
- 14. Document the training data management approach explicitly (Art.10) — no sector-specific regulation provides an equivalent requirement that substitutes
- 15. Include AI system transparency documentation (Art.13(3)) covering AI-specific limitations not typically required in sector technical files
Quality Management System (16–18)
- 16. Audit your existing sector-specific QMS for AI Act Art.17 compliance — MDR Art.10 QMS requirements substantially overlap with it
- 17. Add AI Act-required QMS elements not covered by sector QMS: training pipeline oversight, dataset version control, post-market AI performance monitoring
- 18. Align QMS procedures with both sector-specific and AI Act post-market surveillance timelines and reporting obligations
CE Marking and Declaration of Conformity (19–21)
- 19. Reference all applicable EU harmonization legislation in your Declaration of Conformity
- 20. Reference AI Act harmonised standards in the DoC where they exist; note internal assessment procedures where standards are not yet available
- 21. Verify that your CE marking process covers all applicable instruments — a CE mark that omits the AI Act reference is incomplete for dual-regulated products
Post-Market Monitoring (22–24)
- 22. Align AI Act Art.72 post-market monitoring plan with sector-specific post-market surveillance obligations
- 23. Establish a unified incident reporting workflow that satisfies both the AI Act's Art.73 serious incident reporting and sector-specific adverse event reporting (MDR Article 87; vehicle recall processes)
- 24. Implement document retention for the longer of the applicable retention periods (AI Act minimum 10 years under Art.18; MDR up to 15 years for implantable devices)
Supply Chain and Responsibility (25–27)
- 25. Identify who is the AI system provider under the AI Act and who is the product manufacturer under the sector regulation — these may be different legal entities with separate obligations
- 26. If you are an AI component supplier (not the product OEM), ensure contractual obligations cover your AI Act provider obligations for technical documentation, incident reporting, and cooperation in conformity assessment
- 27. Map all distributor, importer, and deployer obligations under both regimes for your go-to-market structure
CLOUD Act and Data Sovereignty (28–30)
- 28. Identify all technical documentation categories that are subject to CLOUD Act compellability risk: clinical evaluation data (MDR), validation datasets containing personal data, incident reports, type-approval documentation
- 29. Store dual-regulated AI technical files on EU-sovereign infrastructure — separate from US-cloud environments under CLOUD Act jurisdiction
- 30. Review EU-US data transfer mechanisms applicable to dual-regulated AI system documentation and implement appropriate safeguards (Standard Contractual Clauses, transfer impact assessments) where EU-sovereign storage is not feasible
Further Reading
- Art.6 — High-risk AI system classification (Annex I and Annex III pathways)
- Annex I — EU harmonization legislation list (the basis for Art.6(1) classification)
- Art.40–49 — Conformity assessment procedures for high-risk AI systems
- Art.104 — Delegated acts: how Annex I and Annex III can be amended
- Art.43(3) — Conformity assessment for high-risk AI in Annex I products
- Art.108 — Transitional provisions for AI systems already on the market before applicable dates
- Regulation (EU) 2024/1689 — Full EU AI Act text (Official Journal)