EU AI Act CEN-CENELEC Harmonised Standards Delay: Your August 2026 Conformity Assessment Plan B
Many development teams built their EU AI Act compliance roadmap around a reasonable assumption: harmonised standards from CEN/CENELEC's JTC 21 would be available before August 2026, giving them a structured path to the presumption of conformity under Art.40.
That assumption is wrong.
The harmonised standards programme under Standardisation Request M/606 is significantly delayed. As of April 2026, no JTC 21 standard has been published in the Official Journal of the EU as a harmonised standard under the AI Act. Without that OJ reference, Art.40's presumption of conformity never activates. EN ISO/IEC 42001 exists as a published standard — but it is not a harmonised standard in the legal sense that Art.40 requires.
With 97 days to the August 2 deadline for Annex III high-risk AI systems, this is not a theoretical problem. It is a conformity assessment crisis for every team that was waiting on the shortcut.
This guide explains what the delay means legally, what your options are, and how to complete a compliant Art.43 conformity assessment without harmonised standards.
What Harmonised Standards Are — and Why the OJ Reference Is the Only Thing That Matters
Under EU product regulation law, a "harmonised standard" is not just any published standard. It is a European standard (EN) that:
- Was developed by a recognised European Standards Organisation (CEN, CENELEC, or ETSI)
- Was created in response to a Commission standardisation request
- Has its reference published in the Official Journal of the EU
Step three is the activation event. Until the OJ reference appears, a standard — even one developed specifically for the AI Act and published by CEN — does not carry the legal presumption of conformity. Art.40(3) is explicit:
The Commission shall publish in the Official Journal of the European Union the references to harmonised standards that comply with the relevant essential requirements of this Regulation.
EN ISO/IEC 42001:2023 (AI management systems) exists. EN ISO/IEC 23894:2023 (AI risk management) exists. JTC 21 is working on a full suite. But none of them have received an OJ reference under the AI Act. Until they do, using them does not give you Art.40 presumption of conformity.
What Art.40 presumption would give you:
If a harmonised standard covering Arts.9–15 requirements existed and you complied with it, you would benefit from a rebuttable presumption that your AI system meets those requirements. This simplifies conformity assessment: demonstrating compliance with the standard becomes your primary evidence, rather than arguing each requirement from first principles. Annex IV technical documentation is still required, but the standard gives it a ready-made structure.
What the delay means:
Without active OJ references, you cannot use Art.40. You must complete conformity assessment via Art.43 without the presumption shortcut. This requires full technical documentation under Annex IV, quality management under Art.17, and either Annex VI (internal control) or Annex VII (third-party notified body) assessment.
The Standardisation Mandate M/606: What Was Promised, What Was Delivered
The Commission issued Standardisation Request M/606 to CEN, CENELEC, and ETSI in May 2023. The mandate covered the requirements in Arts.9–15 of the AI Act for high-risk AI systems.
CEN/CENELEC established JTC 21 (Joint Technical Committee 21 — Artificial Intelligence) as the primary body responsible for the AI Act harmonisation work. JTC 21 has produced working groups, technical reports, and draft standards — but the timeline for final EN publication and OJ submission has slipped repeatedly.
Current JTC 21 standards landscape (April 2026):
| Standard | Status | Art.40 Status |
|---|---|---|
| EN ISO/IEC 42001:2023 (AI management systems) | Published | Not in OJ — no presumption |
| EN ISO/IEC 23894:2023 (AI risk management) | Published | Not in OJ — no presumption |
| prEN ISO/IEC 42005 (AI system impact assessment) | Draft | Not in OJ — no presumption |
| prEN ISO/IEC 42006 (AI audit requirements) | Draft | Not in OJ — no presumption |
| JTC 21 WG 3 (Data quality for AI) | In development | Years away |
| JTC 21 WG 4 (Trustworthiness) | In development | Years away |
The Commission's own position is that harmonised standards for the AI Act will likely not be available with OJ references until 2027 or 2028 for the core Arts.9–15 requirements. This is consistent with the timelines for previous major EU product regulation standardisation mandates (MDR harmonised standards took 4+ years post-mandate).
Why the delay happened:
The AI Act requires harmonised standards to cover an unusually complex and multidisciplinary requirements space — risk management, data governance, technical documentation, human oversight, accuracy and robustness, cybersecurity. No existing ISO/IEC standard maps cleanly to this requirements set. JTC 21 is building new standards from scratch, with all the technical debate and consensus-building that implies.
What Art.41 Common Specifications Would Have Provided (and Why That Also Failed)
The harmonised standards delay has a designed fallback: Art.41 common specifications. If harmonised standards are not available or not adequate, the Commission can adopt implementing acts establishing common specifications that also carry presumption of conformity.
Art.41(1):
Where harmonised standards referred to in Article 40 do not exist or where the Commission considers that the relevant harmonised standards are insufficient... the Commission may, by means of implementing acts, establish common specifications.
As of April 2026, the Commission has not adopted common specifications for Annex III high-risk AI requirements. Drafts have been discussed, but no implementing act has been published in the OJ. This means Art.41 is also not available as a conformity assessment shortcut.
The result: both Art.40 (harmonised standards) and Art.41 (common specifications) will be unavailable when the August 2026 deadline arrives. Every provider of high-risk AI systems in Annex III categories must conduct conformity assessment under Art.43 without either presumption mechanism.
Art.43 Without Presumption: The Two Tracks
Art.43 defines two conformity assessment procedures for high-risk AI systems. Without harmonised standards, both procedures still operate, but which one applies depends on your Annex III category.
Track 1: Annex VI — Internal Control
Most Annex III high-risk AI systems use Track 1. This is a self-assessment procedure where the provider:
- Develops full technical documentation under Annex IV
- Implements a quality management system under Art.17
- Conducts and documents the conformity assessment internally
- Draws up an EU Declaration of Conformity (Art.47)
- Affixes CE marking (Art.48)
Without harmonised standards, the evidence burden shifts entirely to your internal technical documentation. You must show — through your own evidence — that you comply with each Art.9–15 requirement. This requires:
- Risk management documentation (Art.9): full risk register, residual risk assessment, risk mitigation log
- Data governance documentation (Art.10): training dataset provenance, data quality criteria, bias detection methodology
- Technical documentation (Art.11 + Annex IV): system architecture, component list, training approach, performance metrics, limitations
- Logging documentation (Art.12): capability statement, log format, retention specification
- Transparency documentation (Art.13): instructions for use compliant with Art.13(3)
- Human oversight documentation (Art.14): oversight mechanism specification, intervention capability statement
- Accuracy/robustness documentation (Art.15): performance metrics, adversarial robustness testing, cybersecurity baseline
The absence of harmonised standards does not reduce these requirements. It removes the presumption mechanism that would have made meeting them easier to demonstrate.
Track 2: Annex VII — Third-Party Notified Body
Track 2 involves a notified body examining the quality management system and technical documentation under Annex VII. For Annex III points 2–8, Art.43(2) prescribes internal control; those providers can use Track 2 voluntarily but are not required to. Annex III point 1 (biometrics) is the exception: under Art.43(1), internal control is only available where the provider has applied harmonised standards or common specifications, so the delay makes notified body assessment the required route for that category.
For most software-based Annex III high-risk AI providers, Track 1 is appropriate. AI used as a safety component in products regulated under Annex I (machinery, medical devices, vehicles) follows the conformity assessment procedure of the relevant sectoral legislation under Art.43(3).
How to Conduct Art.43 Internal Control Without Harmonised Standards
The absence of harmonised standards means your conformity assessment documentation needs to be more explicit about how you determined compliance with each requirement. Here is a practical structure.
Step 1: Requirements Mapping
Create a requirements matrix mapping each Art.9–15 sub-requirement to your system's specific implementation:
```python
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional


class EvidenceStatus(Enum):
    DOCUMENTED = "documented"
    GAP = "gap"
    NOT_APPLICABLE = "not_applicable"


@dataclass
class RequirementEvidence:
    article: str
    requirement: str
    implementation: str
    evidence_documents: List[str]
    status: EvidenceStatus
    gap_notes: Optional[str] = None


@dataclass
class ConformityAssessmentRecord:
    system_name: str
    assessment_date: str
    assessor: str
    requirements_matrix: List[RequirementEvidence]

    def gap_count(self) -> int:
        return sum(1 for r in self.requirements_matrix
                   if r.status == EvidenceStatus.GAP)

    def coverage_rate(self) -> float:
        applicable = [r for r in self.requirements_matrix
                      if r.status != EvidenceStatus.NOT_APPLICABLE]
        if not applicable:
            return 0.0
        documented = [r for r in applicable
                      if r.status == EvidenceStatus.DOCUMENTED]
        return len(documented) / len(applicable)

    def is_ready_for_doc(self) -> bool:
        # A Declaration of Conformity needs zero open gaps
        return self.gap_count() == 0


# (article, requirement) pairs covering the Arts.9-15 obligations
_REQUIREMENTS = [
    ("Art.9(1)", "Risk management system established and maintained"),
    ("Art.9(4)", "Risk identification covers intended purpose and reasonably foreseeable misuse"),
    ("Art.9(7)", "Risk management measures in place, residual risk acceptable"),
    ("Art.10(2)", "Training, validation, testing data meet quality criteria"),
    ("Art.10(3)", "Data governance practices documented"),
    ("Art.11(1)", "Technical documentation compiled before market placement"),
    ("Art.12(1)", "Logging capability enabled, events captured automatically"),
    ("Art.13(1)", "System sufficiently transparent for deployer use"),
    ("Art.14(1)", "Human oversight measures built into system"),
    ("Art.15(1)", "Appropriate accuracy, robustness, cybersecurity levels"),
]


def build_assessment_record(system_name: str) -> ConformityAssessmentRecord:
    # Every requirement starts as a GAP; evidence is attached as it is collected
    requirements = [
        RequirementEvidence(
            article=article,
            requirement=requirement,
            implementation="",
            evidence_documents=[],
            status=EvidenceStatus.GAP,
        )
        for article, requirement in _REQUIREMENTS
    ]
    return ConformityAssessmentRecord(
        system_name=system_name,
        assessment_date="",
        assessor="",
        requirements_matrix=requirements,
    )
```
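Once the matrix is populated, the gap position needs to surface in the assessment record itself. A minimal, self-contained sketch (the article/status pairs are hypothetical; in practice they come from the populated requirements matrix) that renders a markdown gap summary for the technical file:

```python
# Hypothetical (article, status) pairs; in practice these are read from
# the populated requirements matrix.
matrix = [
    ("Art.9(1)", "documented"),
    ("Art.10(2)", "gap"),
    ("Art.12(1)", "documented"),
    ("Art.14(1)", "gap"),
]

def gap_summary(rows):
    """Render a markdown table plus an open-gap count for the technical file."""
    lines = ["| Article | Status |", "|---|---|"]
    lines += [f"| {article} | {status} |" for article, status in rows]
    open_gaps = sum(1 for _, status in rows if status == "gap")
    lines.append(f"\nOpen gaps: {open_gaps} of {len(rows)}")
    return "\n".join(lines)

print(gap_summary(matrix))
```

A table like this, regenerated at each assessment review, gives a dated record of how the gap count moved toward zero.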
Step 2: Evidence Collection
For each requirement, you need contemporaneous documentation — not retrospective justifications. The absence of harmonised standards means market surveillance authorities will scrutinise your evidence quality more carefully, not less.
Evidence types that hold up under Art.73/74 market surveillance:
- Risk management: Dated risk register entries, risk committee minutes, remediation tickets with closure dates
- Data governance: Dataset cards, data quality test results, bias evaluation reports
- Technical documentation: Architecture decision records (ADRs), model cards, performance benchmark results
- Logging: Log format specification, sample log output, retention policy document
- Human oversight: UX specifications showing intervention UI, test results showing override capability
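The contemporaneity point can be made mechanical. A sketch (illustrative document names and dates) that flags evidence created after assessment work began, which is the pattern market surveillance reads as retrospective justification:

```python
from datetime import date

# Illustrative evidence entries: (document_id, creation_date)
evidence = [
    ("risk-register-entry-142", date(2026, 3, 10)),
    ("bias-eval-report-q1", date(2026, 4, 2)),
    ("retro-justification-memo", date(2026, 7, 30)),  # written at the deadline
]

def flag_late_evidence(items, assessment_start):
    """Documents dated after the assessment began are candidates for
    'retrospective justification' scrutiny and need an explanation."""
    return [doc for doc, created in items if created > assessment_start]

late = flag_late_evidence(evidence, date(2026, 6, 1))
```

Anything this flags is not automatically invalid, but it is evidence you should expect to defend.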
Step 3: CLOUD Act and Infrastructure Jurisdiction
One dimension that harmonised standards would not have resolved — and that internal control assessment must address explicitly — is infrastructure jurisdiction.
Art.12 requires that logging data be available to market surveillance authorities. Art.14 requires oversight mechanisms to function reliably. If your AI system runs on infrastructure subject to the US CLOUD Act (any US-incorporated provider, including AWS, Azure, and GCP, and the EU subsidiaries of US parents), your logging data is potentially compellable by US authorities under 18 USC § 2703, with extraterritorial reach confirmed by the CLOUD Act's 18 USC § 2713, without EU judicial oversight.
This creates a compliance tension that Art.40 harmonised standards would have left unresolved anyway. Internal control assessment under Annex VI must document:
- Infrastructure provider and parent company jurisdiction
- CLOUD Act applicability assessment
- Logging data access control (who can access logs, under what legal process)
- Contingency for foreign government access to logging evidence
EU-native infrastructure (hosted and incorporated entirely within the EU, with no US parent company) eliminates CLOUD Act exposure for logging and oversight data. For high-risk AI systems where MSA access to logs is part of the regulatory design, jurisdiction matters as much as the logging tool.
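The jurisdiction check itself is simple to encode. A sketch with illustrative provider entries (not a statement about any specific vendor's corporate structure; verify the real chain for your own providers): CLOUD Act reach follows US incorporation anywhere in the chain, not data centre location:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InfrastructureProvider:
    name: str
    incorporation_country: str     # contracting entity's jurisdiction
    parent_country: Optional[str]  # ultimate parent's jurisdiction, if any

def cloud_act_exposed(p: InfrastructureProvider) -> bool:
    # Exposure follows US incorporation of the provider or its parent,
    # regardless of where the data centre physically sits.
    return "US" in (p.incorporation_country, p.parent_country)

# Illustrative entries only
providers = [
    InfrastructureProvider("eu-region-of-us-hyperscaler", "LU", "US"),
    InfrastructureProvider("eu-native-host", "DE", None),
]
exposed = [p.name for p in providers if cloud_act_exposed(p)]
```

The output of a check like this belongs directly in the Annex VI infrastructure jurisdiction documentation listed above.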
The Art.41 Monitoring Obligation: What to Watch
Even though common specifications are not adopted yet, Art.41 monitoring is worthwhile because:
- The Commission can adopt common specifications at any time via implementing act
- Once adopted, they carry presumption of conformity going forward; assessments completed in the interim can be updated to claim it rather than redone
- Structuring your Art.43 assessment against draft common specification logic now reduces rework later
The Commission's AI Office has published consultation documents that preview the technical requirements likely to appear in common specifications. These are not legally binding but reflect the expected structure. Monitoring these documents (via the AI Office website and EUR-Lex) gives early warning of what formal common specifications will require.
EUR-Lex search strategy:
```python
import datetime


def generate_eurlex_monitoring_config() -> dict:
    """Configuration for a weekly watch on OJ publications relevant to
    Art.40 / Art.41 status changes."""
    return {
        "base_url": "https://eur-lex.europa.eu",
        "document_types": [
            "implementing_regulation",
            "delegated_regulation",
            "commission_decision",
        ],
        "search_terms": [
            "common specifications artificial intelligence",
            "implementing act AI Act Article 41",
            "harmonised standards AI Act Official Journal",
            "JTC 21 standardisation mandate M606",
        ],
        "alert_frequency": "weekly",
        "oj_series": "L",  # implementing acts appear in the L series
        "monitor_since": datetime.date(2026, 1, 1).isoformat(),
    }
```
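The config above can drive simple alert URLs against EUR-Lex's quick-search page. A sketch; the query parameter names here are an assumption based on the public search form, so verify them against the live interface before automating:

```python
from urllib.parse import urlencode

# Terms inlined from the monitoring config so this snippet runs standalone
search_terms = [
    "common specifications artificial intelligence",
    "implementing act AI Act Article 41",
]

def quick_search_url(term: str) -> str:
    # Assumed parameter names ("text", "scope", "type") -- check the
    # current EUR-Lex search form before relying on them.
    query = urlencode({"text": term, "scope": "EURLEX", "type": "quick"})
    return f"https://eur-lex.europa.eu/search.html?{query}"

urls = [quick_search_url(t) for t in search_terms]
```

Feeding these URLs into any page-change monitor gives the weekly alert cadence the config specifies.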
What to Do Before August 2, 2026: 30-Item Checklist
Without harmonised standards, your pre-deadline checklist focuses entirely on building a complete Annex VI internal control record.
Foundation (Items 1–8)
1. Confirm your system falls under Annex III (not Annex I — different timeline)
2. Confirm Art.6(3) self-determination is documented if claiming no significant risk
3. Document your Art.3(9) "placed on the market" date (first EU commercial availability)
4. Document Art.3(3) "provider" status and supply chain (Art.25 obligations)
5. Establish quality management system under Art.17 (documented procedures)
6. Assign qualified persons responsible for conformity assessment
7. Confirm post-market monitoring plan (Art.72) is in place
8. Confirm serious incident reporting process (Art.73) is operational
Technical Documentation — Annex IV (Items 9–17)
9. General description: system name, version, intended purpose, geographic scope
10. Design and development: architecture, training approach, model type, components
11. Training methodology: datasets, pre-processing, data quality criteria
12. Validation and testing: test sets, metrics, benchmark results, edge case results
13. Performance on foreseeable risks: documented risk evaluation results
14. Monitoring, functioning, control: logging specification, human oversight mechanisms
15. Cybersecurity baseline: threat model, security controls, penetration test results
16. Changes log: version history with change descriptions and re-assessment triggers
17. EU Declaration of Conformity draft: confirm structure matches Art.47(3)
Conformity Assessment Process — Annex VI (Items 18–24)
18. Annex VI checklist: verify all eight elements of internal control procedure are documented
19. Risk management procedure: Art.9 compliance documented with evidence
20. Data governance procedure: Art.10 compliance documented with dataset cards
21. Transparency procedure: Art.13 instructions for use drafted and reviewed
22. Human oversight procedure: Art.14 mechanisms tested and documented
23. Accuracy/robustness procedure: Art.15 metrics defined and baseline results recorded
24. Logging procedure: Art.12 capability confirmed and log samples retained
Infrastructure Jurisdiction (Items 25–28)
25. Document infrastructure provider, parent company incorporation jurisdiction
26. Assess CLOUD Act applicability to logging and training data infrastructure
27. Document MSA data access pathway (what authority can access logs, via what process)
28. If CLOUD Act applies: document mitigation strategy or migration plan
Final Steps (Items 29–30)
29. Sign and date the EU Declaration of Conformity (Art.47)
30. Affix CE marking and register in EU AI Database if Annex III system is in high-risk category requiring registration (Art.49)
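Tracked as data, the checklist above makes remaining work visible per section. A minimal sketch (item numbers match the list; the section keys are our own shorthand):

```python
# Item numbers match the 30-item checklist above
CHECKLIST = {
    "foundation": range(1, 9),
    "technical_documentation": range(9, 18),
    "conformity_assessment": range(18, 25),
    "infrastructure_jurisdiction": range(25, 29),
    "final_steps": range(29, 31),
}

def remaining(completed):
    """Open items per section, given the set of completed item numbers."""
    return {section: [i for i in items if i not in completed]
            for section, items in CHECKLIST.items()}

# Example: foundation complete, technical documentation half done
open_items = remaining(set(range(1, 9)) | {9, 10, 11, 12})
```

With a 6–12 week assessment window, reviewing this output weekly shows whether the record will close before the deadline.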
When Harmonised Standards Arrive: Plan for Transition
When JTC 21 standards eventually receive OJ references (likely 2027–2028 for core Arts.9–15 standards), Art.40 presumption will activate. At that point:
- Systems with existing Annex VI internal control records can add a supplementary harmonised standards compliance record
- If the standard covers a requirement more rigorously than your internal control evidence, you update the evidence — but you do not restart the assessment
- The Declaration of Conformity (Art.47) can be amended to reference the newly applicable harmonised standards
This means well-constructed Annex VI documentation now creates a future-proof foundation. The work is not wasted when harmonised standards arrive — it becomes the baseline that the standard's conformity presumption supplements.
Summary
The EU AI Act's harmonised standards programme under mandate M/606 will not produce OJ-referenced standards before August 2026. Art.40 presumption of conformity is not available. Art.41 common specifications have not been adopted. Every Annex III high-risk AI provider faces full Annex VI internal control assessment — without shortcuts.
What this means in practice:
- Your conformity assessment evidence must stand on its own merits, not on harmonised standard compliance
- Technical documentation quality and completeness becomes the primary compliance signal
- Infrastructure jurisdiction (CLOUD Act exposure for logging and oversight data) must be explicitly addressed
- The absence of harmonised standards does not reduce the requirements — it removes the presumption mechanism
Start building your Annex VI record now. The 30-item checklist above covers the minimum evidence base. August 2, 2026 is 97 days away. Internal control assessment for a typical Annex III system takes 6–12 weeks when done properly — which means the window for a complete first-pass assessment is already closing.
Conducting an EU AI Act conformity assessment for your high-risk AI system? sota.io provides EU-native infrastructure with no US parent company, eliminating CLOUD Act exposure for logging data and oversight mechanisms before you write your MSA access documentation.