2026-04-27 · 12 min read · sota.io team

EU AI Act CEN-CENELEC Harmonised Standards Delay: Your August 2026 Conformity Assessment Plan B

Many development teams built their EU AI Act compliance roadmap around a reasonable assumption: harmonised standards from CEN/CENELEC's JTC 21 would be available before August 2026, giving them a structured path to the presumption of conformity under Art.40.

That assumption is wrong.

The harmonised standards programme under Standardisation Request M/606 is significantly delayed. As of April 2026, no JTC 21 standard has been published in the Official Journal of the EU as a harmonised standard under the AI Act. Without that OJ reference, Art.40's presumption of conformity never activates. EN ISO/IEC 42001 exists as a published standard — but it is not a harmonised standard in the legal sense that Art.40 requires.

With 97 days to the August 2 deadline for Annex III high-risk AI systems, this is not a theoretical problem. It is a conformity assessment crisis for every team that was waiting on the shortcut.

This guide explains what the delay means legally, what your options are, and how to complete a compliant Art.43 conformity assessment without harmonised standards.


What Harmonised Standards Are — and Why the OJ Reference Is the Only Thing That Matters

Under EU product regulation law, a "harmonised standard" is not just any published standard. It is a European standard (EN) that:

  1. Was developed by a recognised European Standards Organisation (CEN, CENELEC, or ETSI)
  2. Was created in response to a Commission standardisation request
  3. Has its reference published in the Official Journal of the EU

Step three is the activation event. Until the OJ reference appears, a standard — even one developed specifically for the AI Act and published by CEN — does not carry the legal presumption of conformity. Art.40(3) is explicit:

The Commission shall publish in the Official Journal of the European Union the references to harmonised standards that comply with the relevant essential requirements of this Regulation.

EN ISO/IEC 42001:2023 (AI management systems) exists. EN ISO/IEC 23894:2023 (AI risk management) exists. JTC 21 is working on a full suite. But none of them have received an OJ reference under the AI Act. Until they do, using them does not give you Art.40 presumption of conformity.
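The distinction can be made concrete in a few lines of Python (an illustrative sketch; the class and field names are ours, not anything defined in the Act):

```python
from dataclasses import dataclass

@dataclass
class EuropeanStandard:
    reference: str             # e.g. "EN ISO/IEC 42001:2023"
    developed_by_eso: bool     # condition 1: CEN, CENELEC, or ETSI
    under_mandate: bool        # condition 2: responds to a Commission standardisation request
    in_official_journal: bool  # condition 3: reference published in the OJ

    @property
    def presumption_of_conformity(self) -> bool:
        # All three conditions must hold; the OJ reference is the activation event.
        return self.developed_by_eso and self.under_mandate and self.in_official_journal

# EN ISO/IEC 42001:2023 today: published under the mandate, but no OJ reference.
en_42001 = EuropeanStandard("EN ISO/IEC 42001:2023", True, True, False)
print(en_42001.presumption_of_conformity)  # False
```

Flipping `in_official_journal` to `True` is the only change Art.40 activation requires; nothing about the standard's technical content changes.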

What Art.40 presumption would give you:

If a harmonised standard covering Arts.9–15 requirements existed and you complied with it, you would benefit from a rebuttable presumption that your AI system meets those requirements. This simplifies conformity assessment: your evidence base is the standard compliance record, not an independent technical documentation package.

What the delay means:

Without active OJ references, you cannot use Art.40. You must complete conformity assessment via Art.43 without the presumption shortcut. This requires full technical documentation under Annex IV, quality management under Art.17, and either Annex VI (internal control) or Annex VII (third-party notified body) assessment.


The Standardisation Mandate M/606: What Was Promised, What Was Delivered

The Commission issued Standardisation Request M/606 to CEN, CENELEC, and ETSI in May 2023. The mandate covered the requirements in Arts.9–15 of the AI Act for high-risk AI systems.

CEN/CENELEC established JTC 21 (Joint Technical Committee 21 — Artificial Intelligence) as the primary body responsible for the AI Act harmonisation work. JTC 21 has produced working groups, technical reports, and draft standards — but the timeline for final EN publication and OJ submission has slipped repeatedly.

Current JTC 21 standards landscape (April 2026):

| Standard | Status | Art.40 status |
| --- | --- | --- |
| EN ISO/IEC 42001:2023 (AI management systems) | Published | Not in OJ — no presumption |
| EN ISO/IEC 23894:2023 (AI risk management) | Published | Not in OJ — no presumption |
| prEN ISO/IEC 42005 (AI system impact assessment) | Draft | Not in OJ — no presumption |
| prEN ISO/IEC 42006 (AI audit requirements) | Draft | Not in OJ — no presumption |
| JTC 21 WG 3 (Data quality for AI) | In development | Years away |
| JTC 21 WG 4 (Trustworthiness) | In development | Years away |

The Commission's own position is that harmonised standards for the AI Act will likely not be available with OJ references until 2027 or 2028 for the core Arts.9–15 requirements. This is consistent with the timelines for previous major EU product regulation standardisation mandates (MDR harmonised standards took 4+ years post-mandate).

Why the delay happened:

The AI Act requires harmonised standards to cover an unusually complex and multidisciplinary requirements space — risk management, data governance, technical documentation, human oversight, accuracy and robustness, cybersecurity. No existing ISO/IEC standard maps cleanly to this requirements set. JTC 21 is building new standards from scratch, with all the technical debate and consensus-building that implies.


What Art.41 Common Specifications Would Have Provided (and Why That Also Failed)

Art.40's delay had a designed fallback: Art.41 common specifications. If harmonised standards are not available or not adequate, the Commission can adopt implementing acts establishing common specifications that also carry presumption of conformity.

Art.41(1):

Where harmonised standards referred to in Article 40 do not exist or where the Commission considers that the relevant harmonised standards are insufficient... the Commission may, by means of implementing acts, establish common specifications.

As of April 2026, the Commission has not adopted common specifications for Annex III high-risk AI requirements. Drafts have been discussed, but no implementing act has been published in the OJ. This means Art.41 is also not available as a conformity assessment shortcut.

The result: both Art.40 (harmonised standards) and Art.41 (common specifications) will still be unavailable when the August 2, 2026 deadline arrives. Every provider of high-risk AI systems in Annex III categories must conduct conformity assessment under Art.43 without either presumption mechanism.


Art.43 Without Presumption: The Two Tracks

Art.43 defines two conformity assessment procedures for high-risk AI systems. Without harmonised standards, both remain fully available.

Track 1: Annex VI — Internal Control

Most Annex III high-risk AI systems use Track 1. This is a self-assessment procedure where the provider:

  1. Develops full technical documentation under Annex IV
  2. Implements a quality management system under Art.17
  3. Conducts and documents the conformity assessment internally
  4. Draws up an EU Declaration of Conformity (Art.47)
  5. Affixes CE marking (Art.48)

Without harmonised standards, the evidence burden shifts entirely to your internal technical documentation. You must show — through your own evidence — that you comply with each Art.9–15 requirement. This requires, at minimum:

  - Risk management records (Art.9)
  - Data and data governance documentation (Art.10)
  - Annex IV technical documentation (Art.11)
  - Logging design and captured records (Art.12)
  - Deployer-facing transparency material (Art.13)
  - Human oversight design evidence (Art.14)
  - Accuracy, robustness, and cybersecurity test results (Art.15)

The absence of harmonised standards does not reduce these requirements. It removes the presumption mechanism that would have made meeting them easier to demonstrate.
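The five Track 1 steps above can be tracked as a simple ordered checklist (a sketch; the enum and function names are ours):

```python
from enum import Enum
from typing import Optional, Set

class AnnexVIStep(Enum):
    TECHNICAL_DOCUMENTATION = "Annex IV technical documentation"
    QUALITY_MANAGEMENT = "Art.17 quality management system"
    INTERNAL_ASSESSMENT = "internal conformity assessment record"
    DECLARATION = "Art.47 EU Declaration of Conformity"
    CE_MARKING = "Art.48 CE marking"

def next_step(completed: Set[AnnexVIStep]) -> Optional[AnnexVIStep]:
    """First incomplete step in procedural order; None when all five are done."""
    for step in AnnexVIStep:  # Enum iteration preserves definition order
        if step not in completed:
            return step
    return None

print(next_step(set()).name)  # TECHNICAL_DOCUMENTATION
```

Declaration and CE marking come last deliberately: Arts.47–48 presuppose that the assessment record is already complete.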

Track 2: Annex VII — Third-Party Notified Body

Track 2 involves a notified body examining the quality management system and technical documentation under Annex VII. Most Annex III systems can use it voluntarily, but Art.43(1) makes it mandatory for Annex III point 1 (biometric) systems when harmonised standards do not exist or were not applied — which is precisely the current situation. AI used as a safety component in products regulated under Annex I (machinery, medical devices, vehicles) follows the conformity assessment procedures of that sectoral legislation under Art.43(3).

For software-based Annex III high-risk AI providers outside the biometrics category, Track 1 is the appropriate route.


How to Conduct Art.43 Internal Control Without Harmonised Standards

The absence of harmonised standards means your conformity assessment documentation needs to be more explicit about how you determined compliance with each requirement. Here is a practical structure.

Step 1: Requirements Mapping

Create a requirements matrix mapping each Art.9–15 sub-requirement to your system's specific implementation:

from dataclasses import dataclass
from typing import List, Optional
from enum import Enum

class EvidenceStatus(Enum):
    DOCUMENTED = "documented"
    GAP = "gap"
    NOT_APPLICABLE = "not_applicable"

@dataclass
class RequirementEvidence:
    article: str
    requirement: str
    implementation: str
    evidence_documents: List[str]
    status: EvidenceStatus
    gap_notes: Optional[str] = None

@dataclass
class ConformityAssessmentRecord:
    system_name: str
    assessment_date: str
    assessor: str
    requirements_matrix: List[RequirementEvidence]
    
    def gap_count(self) -> int:
        return sum(1 for r in self.requirements_matrix 
                   if r.status == EvidenceStatus.GAP)
    
    def coverage_rate(self) -> float:
        applicable = [r for r in self.requirements_matrix 
                      if r.status != EvidenceStatus.NOT_APPLICABLE]
        if not applicable:
            return 0.0
        documented = [r for r in applicable 
                      if r.status == EvidenceStatus.DOCUMENTED]
        return len(documented) / len(applicable)
    
    def is_ready_for_doc(self) -> bool:
        # "DoC" = EU Declaration of Conformity (Art.47): draw it up only
        # once every applicable requirement has documented evidence.
        return self.gap_count() == 0

def build_assessment_record(system_name: str) -> ConformityAssessmentRecord:
    requirements = [
        RequirementEvidence(
            article="Art.9(1)",
            requirement="Risk management system established and maintained",
            implementation="",
            evidence_documents=[],
            status=EvidenceStatus.GAP
        ),
        RequirementEvidence(
            article="Art.9(4)",
            requirement="Risk identification covers intended purpose and reasonably foreseeable misuse",
            implementation="",
            evidence_documents=[],
            status=EvidenceStatus.GAP
        ),
        RequirementEvidence(
            article="Art.9(7)",
            requirement="Risk management measures in place, residual risk acceptable",
            implementation="",
            evidence_documents=[],
            status=EvidenceStatus.GAP
        ),
        RequirementEvidence(
            article="Art.10(2)",
            requirement="Training, validation, testing data meet quality criteria",
            implementation="",
            evidence_documents=[],
            status=EvidenceStatus.GAP
        ),
        RequirementEvidence(
            article="Art.10(3)",
            requirement="Data governance practices documented",
            implementation="",
            evidence_documents=[],
            status=EvidenceStatus.GAP
        ),
        RequirementEvidence(
            article="Art.11(1)",
            requirement="Technical documentation compiled before market placement",
            implementation="",
            evidence_documents=[],
            status=EvidenceStatus.GAP
        ),
        RequirementEvidence(
            article="Art.12(1)",
            requirement="Logging capability enabled, events captured automatically",
            implementation="",
            evidence_documents=[],
            status=EvidenceStatus.GAP
        ),
        RequirementEvidence(
            article="Art.13(1)",
            requirement="System sufficiently transparent for deployer use",
            implementation="",
            evidence_documents=[],
            status=EvidenceStatus.GAP
        ),
        RequirementEvidence(
            article="Art.14(1)",
            requirement="Human oversight measures built into system",
            implementation="",
            evidence_documents=[],
            status=EvidenceStatus.GAP
        ),
        RequirementEvidence(
            article="Art.15(1)",
            requirement="Appropriate accuracy, robustness, cybersecurity levels",
            implementation="",
            evidence_documents=[],
            status=EvidenceStatus.GAP
        ),
    ]
    
    return ConformityAssessmentRecord(
        system_name=system_name,
        assessment_date="",
        assessor="",
        requirements_matrix=requirements
    )

Step 2: Evidence Collection

For each requirement, you need contemporaneous documentation — not retrospective justifications. The absence of harmonised standards means market surveillance authorities will scrutinise your evidence quality more carefully, not less.

Evidence types that hold up under Art.73/74 market surveillance:

  - Versioned design and architecture documents with creation dates
  - Test plans and test reports produced during development, not reconstructed afterwards
  - Automatically captured logs (Art.12) covering the periods your documentation claims
  - Dated review and sign-off records for risk management decisions
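Contemporaneity can be made verifiable cheaply. A sketch (the function and record fields are ours, not anything mandated): hash each evidence document when it is produced and store the digest with a UTC timestamp; the pair shows the document existed in that exact form at that time.

```python
import datetime
import hashlib

def evidence_fingerprint(content: bytes, document_name: str) -> dict:
    """Record a SHA-256 digest and capture time for one evidence document."""
    return {
        "document": document_name,
        "sha256": hashlib.sha256(content).hexdigest(),
        "recorded_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }

fp = evidence_fingerprint(b"Risk review minutes, 2026-Q1 ...", "risk-review-minutes.md")
print(fp["document"], fp["recorded_at"])
```

Anchoring the digests somewhere append-only (even a signed git tag) strengthens the contemporaneity claim further.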

Step 3: CLOUD Act and Infrastructure Jurisdiction

One dimension that harmonised standards would not have resolved — and that internal control assessment must address explicitly — is infrastructure jurisdiction.

Art.12 requires that logging data be available to market surveillance authorities. Art.14 requires oversight mechanisms to function reliably. If your AI system runs on infrastructure subject to the US CLOUD Act (any US-incorporated provider, including AWS, Azure, GCP, and any US subsidiary), your logging data is potentially compellable by US authorities under 18 USC § 2703 without EU judicial oversight.

This creates a compliance tension that Art.40 harmonised standards would have left unresolved anyway. Internal control assessment under Annex VI must document:

  - Where logging and oversight data is hosted, and by which legal entity
  - Whether that entity or any parent is subject to US jurisdiction under the CLOUD Act
  - How market surveillance authority access to logs is guaranteed independently of any third-country disclosure order

EU-native infrastructure (hosted and incorporated entirely within the EU, with no US parent company) eliminates CLOUD Act exposure for logging and oversight data. For high-risk AI systems where MSA access to logs is part of the regulatory design, jurisdiction matters as much as the logging tool.
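The jurisdiction screen described above reduces to a single question (a sketch; the provider entries are illustrative, and real analysis belongs with counsel):

```python
from dataclasses import dataclass

@dataclass
class InfrastructureProvider:
    name: str
    us_incorporated: bool  # True if the provider or any parent is a US entity
    hosted_in_eu: bool

def cloud_act_exposed(p: InfrastructureProvider) -> bool:
    # EU hosting alone does not help: 18 USC § 2703 reaches data that a
    # US-incorporated provider controls, wherever the servers physically sit.
    return p.us_incorporated

providers = [
    InfrastructureProvider("us-hyperscaler-eu-region", us_incorporated=True, hosted_in_eu=True),
    InfrastructureProvider("eu-native-host", us_incorporated=False, hosted_in_eu=True),
]
for p in providers:
    print(p.name, "exposed" if cloud_act_exposed(p) else "not exposed")
```

Note that `hosted_in_eu` never appears in the exposure test: that is the whole point.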


The Art.41 Monitoring Obligation: What to Watch

Even though common specifications are not adopted yet, Art.41 monitoring is worthwhile because:

  1. The Commission can adopt common specifications at any time via implementing act
  2. When adopted, they will likely carry retroactive compliance value for conformity assessments conducted in the interim
  3. Structuring your Art.43 assessment against draft common specification logic now reduces rework later

The Commission's AI Office has published consultation documents that preview the technical requirements likely to appear in common specifications. These are not legally binding but reflect the expected structure. Monitoring these documents (via the AI Office website and EUR-Lex) gives early warning of what formal common specifications will require.

EUR-Lex search strategy:

import datetime

def generate_eurlex_monitoring_config() -> dict:
    """Config for a weekly EUR-Lex alert sweep. Search terms and frequency are
    a starting point for manual monitoring, not an official EUR-Lex API schema."""
    return {
        "base_url": "https://eur-lex.europa.eu",
        "document_types": [
            "implementing_regulation",
            "delegated_regulation", 
            "commission_decision"
        ],
        "search_terms": [
            "common specifications artificial intelligence",
            "implementing act AI Act Article 41",
            "harmonised standards AI Act Official Journal",
            "JTC 21 standardisation mandate M606"
        ],
        "alert_frequency": "weekly",
        "oj_series": "L",
        "monitor_since": datetime.date(2026, 1, 1).isoformat(),
    }

What to Do Before August 2, 2026: 30-Item Checklist

Without harmonised standards, your pre-deadline checklist focuses entirely on building a complete Annex VI internal control record.

Foundation (Items 1–8)

Technical Documentation — Annex IV (Items 9–17)

Conformity Assessment Process — Annex VI (Items 18–24)

Infrastructure Jurisdiction (Items 25–28)

Final Steps (Items 29–30)


When Harmonised Standards Arrive: Plan for Transition

When JTC 21 standards eventually receive OJ references (likely 2027–2028 for core Arts.9–15 standards), Art.40 presumption will activate. At that point:

  - Compliance with a harmonised standard will carry a rebuttable presumption of conformity for the requirements it covers
  - Your existing Annex VI evidence can be mapped to the standard's clauses rather than rebuilt from scratch
  - Future assessments get simpler: the evidence base becomes the standard compliance record instead of a fully independent documentation package

This means well-constructed Annex VI documentation now creates a future-proof foundation. The work is not wasted when harmonised standards arrive — it becomes the baseline that the standard's conformity presumption supplements.


Summary

The EU AI Act's harmonised standards programme under mandate M/606 will not produce OJ-referenced standards before August 2026. Art.40 presumption of conformity is not available. Art.41 common specifications have not been adopted. Every Annex III high-risk AI provider faces full conformity assessment without shortcuts: Annex VI internal control for most categories, Annex VII notified-body assessment for point 1 biometric systems.

What this means in practice:

  - No presumption shortcut: your Annex IV technical documentation is the evidence base
  - Art.17 quality management and a complete Annex VI record are unavoidable
  - Infrastructure jurisdiction for logging and oversight data is part of the assessment, not an afterthought

Start building your Annex VI record now. The 30-item checklist above covers the minimum evidence base. August 2, 2026 is 97 days away. Internal control assessment for a typical Annex III system takes 6–12 weeks when done properly — which means the window for a complete first-pass assessment is already closing.
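The timeline arithmetic is worth writing down explicitly (dates taken from this article):

```python
from datetime import date, timedelta

article_date = date(2026, 4, 27)
deadline = date(2026, 8, 2)  # Annex III high-risk deadline

days_remaining = (deadline - article_date).days
latest_start_slow = deadline - timedelta(weeks=12)  # slow end of the 6-12 week range
latest_start_fast = deadline - timedelta(weeks=6)   # fast end

print(days_remaining)                 # 97
print(latest_start_slow.isoformat())  # 2026-05-10
print(latest_start_fast.isoformat())  # 2026-06-21
```

A 12-week assessment must start by May 10; even the optimistic 6-week case leaves no slack beyond June 21.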


Conducting an EU AI Act conformity assessment for your high-risk AI system? sota.io provides EU-native infrastructure with no US parent company — eliminating CLOUD Act exposure for logging data and oversight mechanisms before your MSA access documentation is written.