2026-04-23 · 14 min read

EU AI Act Art.38: Other Union Harmonised Legislation — Dual-Coverage Conformity Assessment, NLF Integration, and CE Marking for High-Risk AI Systems in Regulated Sectors (2026)

Article 38 of the EU AI Act addresses one of the most practically significant regulatory design questions the Regulation needed to answer: what happens when a high-risk AI system is already subject to another EU harmonised legislation framework that also requires a conformity assessment? A medical AI system embedded in a Class IIb device is subject to the Medical Device Regulation (MDR). A robotic AI system that controls industrial machinery is subject to the Machinery Regulation. Both of those systems are also, by virtue of their AI capabilities, subject to the EU AI Act's Chapter III requirements for high-risk AI systems. Art.38 resolves the resulting dual-coverage problem by integrating the AI Act conformity obligation into the existing New Legislative Framework (NLF) conformity assessment procedure rather than requiring a parallel, AI-Act-specific conformity assessment on top of the NLF process.

Understanding Art.38 is not just a theoretical exercise. For providers of AI-embedded medical devices, machinery, civil aviation components, or radio equipment, the question of whether to treat the AI Act as an additional regulatory burden requiring a second conformity assessment track — or whether Art.38 allows them to address AI Act compliance within their existing NLF certification workflow — determines project scope, cost, timeline, and notified body selection for every product development cycle between now and 2030.

The Dual-Coverage Problem Art.38 Solves

The EU AI Act classifies as high-risk certain AI systems used in specific application domains. Annex III of the AI Act contains a domain-based list covering areas such as critical infrastructure, education, employment, law enforcement, and biometric identification. Annex I of the AI Act contains a different list: it identifies Union harmonised legislation covering products in sectors such as medical devices, civil aviation equipment, vehicles, pressure equipment, machinery, radio equipment, and others.

High-risk AI systems that are components of, or embedded within, products regulated by Annex I legislation face a structural problem without Art.38: they would need to satisfy both the AI Act's Arts.8–15 substantive requirements (risk management, data governance, technical documentation, logging, transparency, human oversight, accuracy, robustness, cybersecurity) and the existing conformity assessment procedures of the applicable Annex I legislation, which themselves impose overlapping substantive requirements (particularly in medical devices and machinery) and separate third-party certification obligations.

Without Art.38, a provider of a CE-marked Class III medical device with embedded AI would need to satisfy:

  - the MDR conformity assessment procedure (quality management system and technical documentation review by an MDR notified body), and
  - a separate EU AI Act conformity assessment covering the Arts.8–15 requirements.

This would mean two separate certification tracks, potentially involving two separate notified bodies, generating two separate technical files, and maintaining two separate ongoing compliance obligations. For sectors with complex existing certification requirements — medical devices above all — this would impose costs and timelines that the EU legislator recognised as disproportionate given that the existing NLF conformity procedures already address many of the same substantive concerns.

Art.38 resolves this by establishing that the AI Act conformity assessment for Annex I-sector high-risk AI systems is subsumed within the conformity assessment required by the applicable Annex I legislation.

The Annex I Sector Mapping

Art.38 applies specifically to high-risk AI systems referenced in points 1, 6, and 7 of Annex I — the safety-component-in-product categories — that are also covered by the Union harmonised legislation listed in Annex I. The Annex I legislation list includes:

Machinery and related sectors:

  - the Machinery Regulation (EU 2023/1230)
  - equipment for explosive atmospheres (ATEX, EU 2014/34)
  - pressure equipment

Health sectors:

  - the Medical Device Regulation (MDR, EU 2017/745)
  - the In Vitro Diagnostic Regulation (IVDR, EU 2017/746)

Transport and communications sectors:

  - the Radio Equipment Directive (RED, EU 2014/53)
  - civil aviation equipment and vehicle type-approval legislation

The practical scope is therefore narrower than the full Annex I list: Art.38 specifically applies when the AI system is a safety component of or is itself a product falling under one of these Annex I legislative frameworks, and that product's conformity assessment procedure covers the same safety and performance requirements that the AI Act addresses through Arts.8–15.

The Subsumed-Assessment Principle

The core rule of Art.38 is what can be called the subsumed-assessment principle: high-risk AI systems covered by both the AI Act and Annex I legislation shall be subject only to the conformity assessment procedure required by the applicable Annex I legislation.

This means that a provider who completes a conformity assessment under, for example, MDR Annex IX or Annex X has simultaneously — by virtue of Art.38 — completed the AI Act conformity assessment for that system, provided that the MDR conformity assessment has covered the AI Act's Arts.8–15 requirements.

The subsumed-assessment principle has three practical consequences:

First, no parallel AI-Act-specific conformity assessment procedure is required. A provider does not need to engage a separate AI Act notified body or conduct a separate AI Act conformity assessment procedure in addition to the MDR or Machinery Regulation procedure. The NLF procedure is the AI Act procedure for Annex I products.

Second, the notified body competent under the Annex I legislation is competent to certify AI Act compliance. A notified body designated under the MDR that conducts an MDR conformity assessment covering a medical AI system is simultaneously — under Art.38 — certifying the system's compliance with the AI Act. The notified body does not need separate AI Act designation to exercise this competency for Annex I products. This avoids the coordination problem of requiring providers to work with two separate notified bodies for a single product.

Third, the technical documentation requirements of AI Act Arts.11 and 13 shall be incorporated into the technical documentation required under the applicable Annex I legislation. Art.38(4) makes explicit that there is no separate AI Act technical file requirement for Annex I products. The AI Act documentation — risk management records, training data governance documentation, logging records, transparency information, human oversight design rationale, accuracy and robustness benchmarks — all become part of the NLF technical file rather than a separate AI Act technical file.

What Remains AI-Act-Specific Under Art.38

The subsumed-assessment principle is broad but not unlimited. Several AI Act requirements remain applicable to Annex I-sector high-risk AI systems even when Art.38 applies, because they operate independently of the conformity assessment procedure:

Arts.8–15 substantive requirements: These requirements remain fully applicable. Art.38 changes which procedure verifies compliance with those requirements — NLF procedure rather than AI Act Art.43 procedure — but it does not disapply the requirements themselves. A medical AI system must still comply with Art.9 (risk management), Art.10 (data governance), Art.12 (logging), Art.14 (human oversight), and Art.15 (accuracy, robustness, cybersecurity). It is simply that compliance is demonstrated and verified through the MDR conformity assessment rather than through a separate AI Act conformity assessment.

Post-market surveillance and incident reporting (Art.72 and related provisions): The AI Act's post-market surveillance obligations and serious incident reporting to national authorities apply alongside any equivalent obligations under Annex I legislation. For medical devices, MDR post-market clinical follow-up and MDR serious incident reporting under Art.87 MDR are coordinated with AI Act obligations, but they are not replaced.

Market surveillance cooperation: National market surveillance authorities exercising AI Act functions can coordinate with those exercising NLF functions, but AI Act market surveillance powers are not transferred to Annex I legislation enforcement bodies.

Registration obligations: The EUDAMED registration requirements under MDR apply for medical devices. AI Act registration in the EU database under Art.71 may create additional obligations for Annex I-sector high-risk AI systems, though the interface between EUDAMED and the AI Act database for Annex I products is an area where implementing guidance remains to be issued.
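The split between what Art.38 subsumes into the NLF procedure and what continues to apply directly can be sketched as a simple lookup. This is an illustrative classification of the obligations discussed above, not an official taxonomy; the keys and labels are assumptions made for the example.

```python
# Hypothetical mapping of AI Act obligations for Annex I products:
# which are subsumed into the NLF procedure under Art.38, and which
# continue to apply directly. Illustrative only, not an official taxonomy.

SUBSUMED_BY_NLF_PROCEDURE = {
    "conformity_assessment": "Verified via the Annex I procedure (e.g. MDR Annex IX)",
    "technical_documentation": "Merged into the NLF technical file (Art.38(4))",
    "notified_body_certification": "Performed by the Annex I notified body (Art.38(2))",
}

REMAINS_DIRECTLY_APPLICABLE = {
    "substantive_requirements": "Arts.8-15 still apply; only the verification route changes",
    "post_market_surveillance": "Art.72 obligations coordinated with, not replaced by, NLF equivalents",
    "incident_reporting": "AI Act serious incident reporting alongside e.g. MDR Art.87",
    "market_surveillance": "AI Act market surveillance powers are not transferred",
    "registration": "Art.71 EU database registration may apply alongside EUDAMED",
}

def classify_obligation(name: str) -> str:
    """Return whether an obligation is subsumed or remains directly applicable."""
    if name in SUBSUMED_BY_NLF_PROCEDURE:
        return "subsumed"
    if name in REMAINS_DIRECTLY_APPLICABLE:
        return "directly_applicable"
    return "unknown"
```

The point of the split: Art.38 changes the verification route, never the substantive obligations themselves.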

The Expanded Notified Body Mandate

Art.38(2) grants an expanded mandate to notified bodies designated under Annex I legislation. A notified body designated and accredited under any of the Annex I legislative instruments shall be entitled to assess the conformity of a high-risk AI system that is a safety component of or is itself a product covered by that legislation, including the AI Act requirements under Arts.8–15.

This expanded mandate is automatic — it does not require the notified body to obtain separate AI Act designation. The rationale is that the expertise required to assess AI Act compliance for a medical AI system embedded in a medical device is fundamentally medical-device expertise. A notified body accredited to assess medical device safety under MDR Annex IX possesses, by construction, the technical competence required to assess whether an AI component of that device meets the AI Act's accuracy, robustness, and risk management requirements in the context of that device's intended purpose.

The expanded mandate is also sector-specific. A notified body designated under MDR cannot use Art.38(2) to assess AI Act compliance for a machinery AI system — it can only apply its expanded mandate within its designated domain. Cross-sector use of the Art.38 expanded mandate would require separate designation under the relevant Annex I legislation for that sector.

Technical Documentation Integration

Art.38(4) establishes the technical documentation integration requirement as a mandatory corollary of the subsumed-assessment principle. The AI Act requirements in Arts.11 and 13 — technical documentation and transparency and information provision respectively — shall be addressed within the technical documentation required under the applicable Annex I legislation.

In practice, this means:

For MDR-regulated medical AI: The MDR technical documentation required under MDR Annex II (technical documentation) and Annex III (post-market surveillance technical documentation) must be extended to cover the EU AI Act documentation requirements:

  - Art.9 risk management records covering AI-specific failure modes
  - Art.10 data governance documentation (training, validation, and testing dataset provenance)
  - Art.12 logging specifications
  - Art.13 transparency information integrated into the instructions for use
  - Art.14 human oversight design rationale
  - Art.15 accuracy, robustness, and cybersecurity validation results

For Machinery Regulation machinery AI: The Machinery Regulation's technical file under its Annex IV must incorporate:

  - the same Arts.9–15 documentation, with risk management extended to AI-driven hazards (unexpected outputs, drift, adversarial inputs)
  - Art.14 human oversight documentation covering operator intervention and emergency stop integration

For RED-regulated radio equipment AI: The Radio Equipment Directive's technical documentation requirements must cover AI Act requirements relevant to radio-connected AI systems, particularly where the system processes biometric data, makes automated decisions affecting users, or operates in critical infrastructure contexts.

CE Marking and Declaration of Conformity Under Art.38

For products covered by both the AI Act and Annex I legislation, the CE marking and Declaration of Conformity (DoC) procedure works as follows under Art.38:

The CE marking on the product is affixed once — it is the CE marking required by the applicable Annex I legislation. There is no separate AI Act CE marking. The CE marking, once affixed following a conformity assessment procedure that has covered both the Annex I requirements and the integrated AI Act Arts.8–15 requirements, signals that both frameworks have been satisfied.

The DoC issued by the manufacturer must identify the applicable legislation. For an AI-embedded medical device, the DoC must reference both the MDR and the EU AI Act (Regulation (EU) 2024/1689). The DoC references the standards and technical specifications used to demonstrate conformity with both instruments.

The notified body certificate issued following the conformity assessment must cover both frameworks. An MDR Annex IX QMS and technical documentation certificate that does not explicitly address the AI Act requirements is not sufficient to enable a combined CE marking DoC — the certificate must either explicitly extend to AI Act compliance or the AI Act compliance must be demonstrable through the elements already covered by the certificate.
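The combined DoC described above can be sketched as a small builder. The field names and structure are illustrative assumptions, not a prescribed template; the point is that a single document references both instruments and, where applicable, the single Annex I notified body.

```python
def build_combined_doc(product_name: str, nlf_instrument: str,
                       nlf_regulation: str, nb_number: str = "") -> dict:
    """Assemble a combined Declaration of Conformity skeleton that
    references both the Annex I instrument and the EU AI Act.
    Field names are illustrative assumptions, not a prescribed template."""
    doc = {
        "product": product_name,
        "applicable_legislation": [
            f"{nlf_instrument} ({nlf_regulation})",
            "EU AI Act (Regulation (EU) 2024/1689)",
        ],
        # A single CE marking covers both frameworks under Art.38.
        "ce_marking": "single",
    }
    if nb_number:
        # The Annex I notified body's number; its certificate must
        # extend to the AI Act scope under Art.38(2).
        doc["notified_body"] = nb_number
    return doc

combined = build_combined_doc(
    "AI-enabled ECG monitor", "MDR", "EU 2017/745", "NB 0123"
)
```

For a self-declaration route the `nb_number` field is simply omitted; the dual legislative reference is mandatory either way.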

Sector-by-Sector Examples

Medical AI (MDR/IVDR)

A Class IIb medical device incorporating an AI algorithm that analyses ECG data to detect atrial fibrillation is subject to MDR under Annex VIII Rule 11 (software as a medical device at Class IIb) and to the EU AI Act as a high-risk AI system under Annex III point 5(a) (AI systems intended for use in safety-critical healthcare decisions).

Under Art.38, the provider's MDR Annex IX conformity assessment (QMS review plus technical documentation assessment by the notified body) is the AI Act conformity assessment. The provider does not engage a separate AI Act notified body. The MDR technical documentation is extended to cover Art.9 risk management (AI-specific failure modes including false negatives in arrhythmia detection), Art.10 data governance (ECG training dataset demographics, validation set clinical trial provenance), Art.12 logging (per-decision logging for the duration required by Art.12), Art.13 transparency (algorithm limitations disclosed in IFU, performance by subpopulation), Art.14 human oversight (clinician override workflow, confidence threshold triggering human review), and Art.15 accuracy metrics (sensitivity, specificity, AUC validated against predefined performance criteria).

The DoC references both MDR and EU AI Act. The CE marking covers both. The MDR notified body's certificate documents the extended scope.
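The Art.12 per-decision logging and the Art.14 confidence-threshold review trigger described in the ECG example can be sketched as a structured log record. The field names and the 0.8 threshold are assumptions for illustration, not values prescribed by either regulation.

```python
import json
from datetime import datetime, timezone

def log_ai_decision(input_ref: str, output: str, confidence: float,
                    review_threshold: float = 0.8) -> str:
    """Build one per-decision log entry in the spirit of Art.12.
    Field names and the threshold are illustrative assumptions."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "input_reference": input_ref,  # pointer to the ECG segment, not raw data
        "output": output,
        "confidence": confidence,
        # Art.14-style trigger: low confidence routes the case to a clinician.
        "human_review_triggered": confidence < review_threshold,
    }
    return json.dumps(record)

entry = log_ai_decision("ecg-segment-0042", "atrial_fibrillation_detected", 0.63)
```

Retention period, storage security, and log granularity would then be specified in the extended MDR technical file rather than in code.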

Industrial Machinery AI (Machinery Regulation)

An industrial robot using AI-based vision to guide a welding torch in an automotive assembly line is subject to the Machinery Regulation as a machinery product and to the EU AI Act as a high-risk AI system under Annex III point 3(b) (AI systems used in machinery for monitoring or controlling processes where safety is involved).

Under Art.38, the Machinery Regulation conformity assessment procedure applicable to this machinery type (which for certain machinery requires third-party involvement under Machinery Regulation Annex IX) covers the AI Act conformity assessment. The technical file is extended with AI Act documentation as described above. The machinery's CE marking covers both frameworks.

Radio Equipment AI (RED)

An AI-enabled smart meter that makes automated decisions about energy distribution and communicates wirelessly is subject to the RED (Radio Equipment Directive) for its radio function and to the EU AI Act for its automated decision-making in critical infrastructure under Annex III point 2. Under Art.38, the RED conformity assessment procedure covers the AI Act assessment, with the technical file extended to cover the AI Act documentation.

The Art.38 × Art.43 Interface

Art.43 is the primary EU AI Act conformity assessment provision for high-risk AI systems that are not covered by Annex I legislation. For Annex I products, Art.43 is effectively displaced by Art.38 for the conformity assessment procedure question — providers use their Annex I legislation procedure, not the Art.43 procedure.

However, Art.43(4) interacts with Art.38 in an important way: for certain high-risk AI systems that would require third-party conformity assessment under Art.43 based on their risk profile, the applicable Annex I legislation procedure must be capable of providing equivalent third-party oversight. If the Annex I procedure for a particular product type is a manufacturer self-declaration without notified body involvement, and the AI Act's risk profile would have required notified body involvement under Art.43, this creates a tension that providers must resolve in consultation with market surveillance authorities.

In practice, for the highest-risk Annex I products — Class III medical devices, certain machinery — the Annex I procedure already requires notified body involvement, and this third-party oversight also covers the AI Act requirements under Art.38. For lower-risk Annex I products where self-declaration is permitted under the applicable NLF legislation, the interaction with AI Act risk levels requires case-by-case assessment.
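The case-by-case check described above can be sketched as a small decision function. The inputs and outcome labels are assumptions for illustration, not legal conclusions.

```python
def oversight_check(nlf_route_requires_nb: bool,
                    ai_act_profile_requires_nb: bool) -> str:
    """Flag the Art.38 / Art.43(4) third-party oversight tension.
    Outcome labels are illustrative, not legal conclusions."""
    if nlf_route_requires_nb:
        # The NLF procedure already provides third-party oversight,
        # which also covers the AI Act requirements under Art.38.
        return "covered_by_nlf_notified_body"
    if ai_act_profile_requires_nb:
        # Self-declaration under the NLF route, but the AI Act risk
        # profile would call for a notified body: consult the market
        # surveillance authority before relying on self-declaration.
        return "tension_consult_authority"
    return "self_declaration_acceptable"
```

A Class III medical device lands in the first branch; a standard-category machine with a high-risk AI component is the case where the second branch bites.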

class Art38ComplianceChecker:
    """
    Determines Art.38 applicability and integration requirements
    for high-risk AI systems in NLF-regulated sectors.
    """

    ANNEX_I_LEGISLATION = {
        "MDR": {
            "regulation": "EU 2017/745",
            "scope": "Medical devices",
            "conformity_routes": ["Annex IX", "Annex X", "Annex XI"],
            "notified_body_required": ["Class IIa", "Class IIb", "Class III"],
        },
        "IVDR": {
            "regulation": "EU 2017/746",
            "scope": "In vitro diagnostic devices",
            "conformity_routes": ["Annex IX", "Annex X"],
            "notified_body_required": ["Class B", "Class C", "Class D"],
        },
        "Machinery_Regulation": {
            "regulation": "EU 2023/1230",
            "scope": "Machinery and related products",
            "conformity_routes": ["Annex IX (high-risk)", "Self-declaration (standard)"],
            "notified_body_required": ["Annex I, Part A machinery"],
        },
        "RED": {
            "regulation": "EU 2014/53",
            "scope": "Radio equipment",
            "conformity_routes": ["Harmonised standard", "Notified body (Art.17)"],
            "notified_body_required": ["Art.17 categories"],
        },
        "ATEX": {
            "regulation": "EU 2014/34",
            "scope": "Equipment for explosive atmospheres",
            "conformity_routes": ["Module B+D", "Module B+F", "Module G", "Module H"],
            "notified_body_required": ["All categories"],
        },
    }

    AI_ACT_INTEGRATION_REQUIREMENTS = {
        "Art9": "Risk management system — integrate AI-specific failure modes into NLF risk file",
        "Art10": "Data governance — training/validation dataset provenance in technical file",
        "Art11": "Technical documentation — merge into NLF technical file, no separate AI Act file",
        "Art12": "Logging — automatic logging provisions documented in technical file",
        "Art13": "Transparency — user-facing information integrated into NLF instructions for use",
        "Art14": "Human oversight — override/correction mechanisms documented and tested",
        "Art15": "Accuracy/robustness/cybersecurity — validated metrics in technical documentation",
    }

    def __init__(self, product_sector: str, ai_system_intended_purpose: str):
        self.sector = product_sector
        self.intended_purpose = ai_system_intended_purpose
        self.applicable_legislation = None
        self.art38_applies = False
        self.integration_gaps = []

    def assess_art38_applicability(self) -> dict:
        """Check whether Art.38 applies and which NLF procedure governs."""
        if self.sector in self.ANNEX_I_LEGISLATION:
            self.applicable_legislation = self.ANNEX_I_LEGISLATION[self.sector]
            self.art38_applies = True
            return {
                "art38_applies": True,
                "governing_legislation": self.sector,
                "regulation": self.applicable_legislation["regulation"],
                "available_conformity_routes": self.applicable_legislation["conformity_routes"],
                "notified_body_required_for": self.applicable_legislation["notified_body_required"],
            }
        return {
            "art38_applies": False,
            "note": "Use Art.43 procedure directly",
        }

    def identify_integration_gaps(self, existing_nlf_technical_file_sections: list) -> list:
        """Identify which AI Act Arts.8-15 requirements are not yet covered in the NLF technical file."""
        ai_act_requirements = set(self.AI_ACT_INTEGRATION_REQUIREMENTS.keys())
        covered_requirements = set()

        coverage_mapping = {
            "risk_assessment": "Art9",
            "clinical_evaluation": "Art9",
            "training_data_documentation": "Art10",
            "performance_validation": "Art15",
            "instructions_for_use": "Art13",
            "post_market_surveillance_plan": "Art9",
            "usability_engineering": "Art14",
            "cybersecurity_documentation": "Art15",
        }

        for section in existing_nlf_technical_file_sections:
            key = section.lower().replace(" ", "_")
            if key in coverage_mapping:
                covered_requirements.add(coverage_mapping[key])

        self.integration_gaps = [
            {
                "requirement": req,
                "description": self.AI_ACT_INTEGRATION_REQUIREMENTS[req],
                "status": "NEEDS_INTEGRATION",
            }
            for req in ai_act_requirements - covered_requirements
        ]
        return self.integration_gaps

    def generate_doc_checklist(self) -> list:
        """Generate the documentation integration checklist for the NLF technical file."""
        return [
            f"Extend NLF risk file: {self.AI_ACT_INTEGRATION_REQUIREMENTS['Art9']}",
            f"Add data governance annex: {self.AI_ACT_INTEGRATION_REQUIREMENTS['Art10']}",
            f"Integrate AI Act docs into NLF technical file: {self.AI_ACT_INTEGRATION_REQUIREMENTS['Art11']}",
            f"Add logging specification: {self.AI_ACT_INTEGRATION_REQUIREMENTS['Art12']}",
            f"Extend IFU/instructions: {self.AI_ACT_INTEGRATION_REQUIREMENTS['Art13']}",
            f"Document human override design: {self.AI_ACT_INTEGRATION_REQUIREMENTS['Art14']}",
            f"Add AI performance validation section: {self.AI_ACT_INTEGRATION_REQUIREMENTS['Art15']}",
            "Update DoC to reference EU AI Act 2024/1689",
            "Confirm notified body certificate covers Art.38 extended scope",
            "Coordinate NLF post-market surveillance with Art.72 AI Act obligations",
        ]


# Usage example
checker = Art38ComplianceChecker(
    product_sector="MDR",
    ai_system_intended_purpose="ECG analysis for atrial fibrillation detection"
)

result = checker.assess_art38_applicability()
print(f"Art.38 applies: {result['art38_applies']}")
print(f"Governing legislation: {result['governing_legislation']}")
print(f"Conformity routes: {result['available_conformity_routes']}")

gaps = checker.identify_integration_gaps(
    existing_nlf_technical_file_sections=[
        "clinical_evaluation",
        "usability_engineering",
        "post_market_surveillance_plan",
    ]
)
print(f"\nIntegration gaps: {len(gaps)} items require documentation extension")
for gap in gaps:
    print(f"  - {gap['requirement']}: {gap['description']}")

checklist = checker.generate_doc_checklist()
print(f"\nDocumentation integration checklist ({len(checklist)} items):")
for i, item in enumerate(checklist, 1):
    print(f"  {i}. {item}")

The Art.38 Compliance Matrix

NLF Legislation | AI Act Coverage | Notified Body Required | Documentation Integration Point | DoC Extension Required
MDR Class III | Full Annex I applicability | Yes (Annex IX or X NB) | MDR Annex II Technical Documentation | Yes — reference EU AI Act
MDR Class IIb | Full Annex I applicability | Yes (Annex IX NB) | MDR Annex II Technical Documentation | Yes — reference EU AI Act
MDR Class IIa | Full Annex I applicability | Yes (Annex IX NB, limited) | MDR Annex II Technical Documentation | Yes — reference EU AI Act
MDR Class I | Full Annex I applicability | No (self-declaration) | MDR Technical Documentation | Yes — reference EU AI Act
IVDR Class D | Full Annex I applicability | Yes (NB + EU reference lab) | IVDR Technical Documentation | Yes
IVDR Class C | Full Annex I applicability | Yes (NB) | IVDR Technical Documentation | Yes
Machinery (Annex I Part A) | Full applicability | Yes (Annex IX NB) | Machinery Technical File | Yes
Machinery (standard) | Full applicability | No (self-declaration) | Machinery Technical File | Yes
RED (Art.17 categories) | Full applicability | Yes (NB) | RED Technical File | Yes
RED (standard) | Full applicability | No (self-declaration) | RED Technical File | Yes
ATEX Group I/II Cat.1 | Full applicability | Yes (NB) | ATEX Technical File | Yes
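The matrix can also be held as data and queried programmatically; here is a minimal sketch with a few rows transcribed. The row keys and field names are assumptions chosen for the example.

```python
# A few rows of the Art.38 compliance matrix transcribed as data.
# Row keys and field names are illustrative.
COMPLIANCE_MATRIX = {
    "MDR Class III": {"nb_required": True, "doc_point": "MDR Annex II Technical Documentation"},
    "MDR Class I": {"nb_required": False, "doc_point": "MDR Technical Documentation"},
    "Machinery (standard)": {"nb_required": False, "doc_point": "Machinery Technical File"},
    "RED (Art.17 categories)": {"nb_required": True, "doc_point": "RED Technical File"},
}

def needs_notified_body(row: str) -> bool:
    """Look up whether the NLF route for this row involves a notified body.
    Note: every row still requires a DoC extension referencing the AI Act."""
    return COMPLIANCE_MATRIX[row]["nb_required"]
```

Whatever the row, the DoC extension column never varies: the AI Act reference is always required.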

26-Item Provider Checklist for Art.38 Compliance

Applicability Assessment (do this first)

  1. Confirm the AI system is a safety component of, or is itself, a product covered by Annex I legislation (points 1, 6, or 7 of AI Act Annex I)
  2. Identify which specific Annex I legislation governs the product (MDR, IVDR, Machinery Regulation, RED, ATEX, or other listed instrument)
  3. Confirm the product's classification under the applicable Annex I legislation (e.g., MDR device class, machinery category)
  4. Verify that the AI system's intended purpose triggers EU AI Act high-risk classification under Annex III in addition to the Annex I trigger
  5. If both Annex I and Annex III triggers apply, confirm that Art.38 governs the conformity assessment procedure for the Annex I aspects

Conformity Assessment Procedure (Art.38 pathway)

  6. Identify which conformity assessment route applies under the Annex I legislation (e.g., MDR Annex IX, Machinery Regulation Annex IX, RED self-declaration)
  7. Confirm that the selected notified body (if required) has been notified under the applicable Annex I legislation
  8. Confirm with the notified body that their assessment scope will explicitly cover EU AI Act Arts.8–15 requirements in addition to the Annex I legislation requirements
  9. Obtain written confirmation from the notified body of their expanded Art.38(2) mandate
  10. Schedule the conformity assessment timeline to address both Annex I legislation requirements and AI Act requirements in a single assessment workflow

Technical Documentation Integration

  11. Audit the existing NLF technical file structure to identify all sections where AI Act Arts.8–15 requirements require additional content
  12. Extend the NLF risk management file to cover AI-specific hazard categories under Art.9 (unexpected outputs, drift, adversarial robustness, distributional shift)
  13. Add a data governance annex under Art.10 documenting training dataset provenance, demographic coverage, validation set independence, and bias assessment
  14. Integrate the Art.12 automatic logging specification into the technical file, specifying log retention period, granularity, and storage security
  15. Extend the NLF instructions for use or equivalent user documentation to cover all Art.13 transparency requirements (intended purpose, known limitations, performance by relevant subpopulation, human oversight instructions)
  16. Document the Art.14 human oversight design, including all designated human oversight points, override mechanisms, and procedures for disabling or correcting the AI system
  17. Add the Art.15 performance validation section documenting accuracy, robustness, and cybersecurity testing results against predefined criteria
  18. Ensure the integrated NLF + AI Act technical file satisfies both the NLF technical documentation requirements and AI Act Annex IV requirements

Declaration of Conformity

  19. Draft a Declaration of Conformity (DoC) that explicitly references both the Annex I legislation and EU AI Act Regulation (EU) 2024/1689
  20. Verify that the notified body certificate, if issued, explicitly references the AI Act extended scope under Art.38
  21. Review CE marking affixation requirements under both frameworks — a single CE marking covers both when Art.38 applies

Post-Market Obligations

  22. Integrate AI Act Art.72 post-market surveillance obligations with the NLF post-market surveillance plan (e.g., MDR post-market clinical follow-up plan, Machinery Regulation post-market surveillance requirements)
  23. Map AI Act serious incident reporting obligations against NLF serious incident reporting obligations (e.g., MDR Art.87 incident reporting) to ensure coordinated reporting without duplication or gaps
  24. Establish a process for monitoring AI performance in deployment against Art.15 accuracy thresholds, with corrective action procedures triggered by performance degradation

Registration and Traceability

  25. Confirm registration obligations under both frameworks — NLF registration (e.g., EUDAMED for medical devices) and AI Act Art.71 EU database registration if applicable
  26. Ensure the product's UDI (Unique Device Identifier, for MDR/IVDR products) or equivalent traceability mechanism is documented in both the NLF technical file and any AI Act registration record
