2026-04-24 · 11 min read · sota.io team

EU AI Act Art.64: Access to Data and Documentation — Market Surveillance Authority Powers for High-Risk AI Enforcement (2026)

EU AI Act Article 64 is the enforcement enabler of the Chapter IX governance framework. Where Articles 57–63 establish the institutional architecture — national competent authorities (Art.57), their investigation powers (Art.58), the AI Board (Art.59), the EU AI database (Art.60), the Scientific Panel (Art.61), the AI Office (Art.62), and the Advisory Forum (Art.63) — Art.64 provides the foundational legal authority without which none of those bodies could conduct meaningful conformity oversight: the right to access the data, documentation, and source code that constitute a high-risk AI system.

Conformity assessment for high-risk AI is not a documentation exercise. It requires direct inspection of the training datasets that shape model behaviour, the validation processes that establish system limits, the technical documentation that records design decisions, and — in cases where surface-level inspection is insufficient — the source code that implements those decisions. Art.64 is the statutory basis that converts market surveillance authority investigation rights (established in Art.58) into concrete access powers against which providers and deployers have corresponding cooperation obligations.

For developers and providers of high-risk AI systems — and for EU-sovereign infrastructure providers whose platforms host such systems — Art.64 defines exactly what regulators can demand, under what conditions they can demand it, what procedural safeguards limit that demand, and what obligations you carry to make that access possible.

Art.64 became applicable on 2 August 2025 as part of the phased entry into force of Regulation (EU) 2024/1689.


Art.64 in the Chapter IX Enforcement Architecture

Art.64 sits at the transition point between governance establishment and governance operation. The Chapter IX progression is:

| Article | Function | Relationship to Art.64 |
|---|---|---|
| Art.57 | NCA designation and independence requirements | Art.64 access powers are exercised by the Art.57-designated authorities |
| Art.58 | Market surveillance investigation powers | Art.64 specifies the data/documentation access component of Art.58 powers |
| Art.59 | European AI Board | AI Board receives enforcement intelligence gathered under Art.64 |
| Art.60 | EU AI database | Art.64 access reveals whether database registrations are accurate |
| Art.61 | Scientific Panel | Scientific Panel uses Art.64-accessed data for technical expert evaluation |
| Art.62 | AI Office enforcement powers (GPAI) | Art.62 AI Office corrective measures use Art.64 access in GPAI enforcement |
| Art.63 | Advisory Forum | Advisory Forum input shapes how Art.64 access powers are exercised in practice |
| Art.64 | Data and documentation access — the enforcement substrate | Enables all preceding bodies to function |

The critical point: Art.64 access rights extend to high-risk AI systems under Chapter III. The AI Office's parallel powers under Art.62 cover GPAI models under Chapter V. Together they establish comprehensive data access for the full scope of regulated AI systems.


Art.64(1): Full Access to Training Data, Test Data, and Technical Documentation

Art.64(1) grants market surveillance authorities full access to:

- the training, validation, and test datasets used to develop and evaluate the system
- the technical documentation drawn up under Art.11 and Annex IV
- the logs automatically generated by the system (Art.12)
- post-market monitoring data collected under Art.72

The word full in Art.64(1) is not qualified by proportionality at this stage. Proportionality governs source code access (Art.64(2)) and the overall investigation framework (Art.58), but access to datasets, documentation, and logs is granted as a right, not a conditional power.

Why datasets are the central enforcement target: Conformity with Chapter III, Section 2 requirements — data governance (Art.10), technical robustness (Art.15), human oversight (Art.14) — cannot be verified from product-level testing alone. An NCA assessing whether a high-risk AI system for recruitment screening (Annex III, category 4) meets Art.10 data quality requirements must be able to inspect:

- the composition and provenance of the training, validation, and testing datasets
- the data collection and preparation processes, including labelling, cleaning, and annotation
- the examination of the datasets for possible biases
- whether the datasets are relevant to and representative of the persons the system will affect

None of this is visible from the deployed system's external behaviour. Art.64(1) access is what makes Art.10 compliance assessable.

Hosted infrastructure implications: For AI systems hosted on cloud platforms, Art.64(1) access extends to the provider's technical infrastructure to the extent that infrastructure holds regulated data. EU-native hosting arrangements simplify Art.64 compliance by maintaining training data, logs, and documentation within a single legal jurisdiction — eliminating the cross-border access complications that arise when data is distributed across non-EU infrastructure (see CLOUD Act section below).


Art.64(2): Source Code Access — Proportionality and Conditions

Art.64(2) grants market surveillance authorities access to the source code of the high-risk AI system, subject to a proportionality constraint: source code access is available only where necessary to assess conformity and where a reasoned request has been made.

The "where necessary" condition distinguishes Art.64(2) from Art.64(1). Source code access is not automatic in all enforcement proceedings. It requires the NCA to assess whether:

  1. The conformity question at issue cannot be answered by dataset inspection, documentation review, or output testing alone
  2. Source code inspection would materially advance the assessment
  3. The investigative burden is proportionate to the compliance risk identified
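The three-step necessity test above can be sketched as a simple record that documents each condition explicitly. This is an illustrative data structure, not anything prescribed by the Regulation; the class and field names are assumptions.

```python
from dataclasses import dataclass


@dataclass
class SourceCodeNecessityAssessment:
    """Illustrative record of the three-step Art.64(2) necessity test."""

    other_means_exhausted: bool   # 1. dataset/doc/output testing insufficient
    materially_advances: bool     # 2. inspection would advance the assessment
    proportionate_burden: bool    # 3. burden proportionate to compliance risk
    reasoning: str = ""           # free-text basis for the reasoned request

    def justified(self) -> bool:
        # All three conditions must hold before a reasoned request is made.
        return (
            self.other_means_exhausted
            and self.materially_advances
            and self.proportionate_burden
        )


assessment = SourceCodeNecessityAssessment(
    other_means_exhausted=True,
    materially_advances=True,
    proportionate_burden=False,
    reasoning="Test methodology disputed, but burden outweighs identified risk.",
)
print(assessment.justified())  # False: condition 3 fails
```

Recording the conditions separately, rather than as a single yes/no, makes the written reasoning for a reasoned request easier to reconstruct later.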

When source code access is necessary:

| Conformity question | Dataset/doc sufficient? | Source code needed? |
|---|---|---|
| Data preprocessing pipeline meets Art.10(3) | Usually yes | Rarely |
| Human oversight override mechanism (Art.14) | Documentation may suffice | If implementation contested |
| Robustness against adversarial input (Art.15) | Test results + architecture docs often sufficient | If test methodology disputed |
| Logging completeness (Art.12) | Log inspection + architecture docs | If logging gaps suspected |
| Embedded bias in inference pipeline | Training data + validation | Yes — pipeline logic required |

Procedural requirements: The "reasoned request" requirement means the NCA must state in writing why source code access is necessary for the specific conformity question under investigation. This procedural requirement protects providers from speculative or overbroad source code demands while preserving NCA authority in genuinely complex cases.

Trade secrets: Source code is typically the most commercially sensitive component of an AI system. Art.64(2) must be read in conjunction with Art.64(5) (confidentiality) and Art.70 (trade secrets protection) — accessed source code carries the strongest confidentiality obligations among Art.64 material.


Art.64(3): Provider Cooperation Obligations

Art.64(3) establishes the corresponding obligation on the provider side: providers of high-risk AI systems and their authorised representatives under Art.22 shall cooperate with market surveillance authorities and take all necessary measures to ensure that Art.64 access is possible.

What cooperation requires:

- responding to access requests within the deadline set by the authority
- providing datasets, documentation, and logs in a usable, structured form
- making a technical point of contact available to the investigating NCA
- facilitating remote or on-site access to the systems holding the regulated material
- ensuring technical documentation is available in the deployment Member State's language

Refusal consequences: Failure to cooperate with Art.64 access constitutes a standalone violation of the Regulation, separate from any underlying conformity issue. This means an NCA discovering a provider refused data access can initiate enforcement proceedings on two grounds simultaneously: suspected conformity failure and refusal to cooperate.

Notification triggers: Providers should treat the following as signals to initiate Art.64 compliance preparation:

- a formal information request from an NCA under Art.58
- notification that a market surveillance investigation has been opened
- a serious incident report concerning the system (Art.65)
- a complaint concerning the system forwarded by an NCA
Third-country providers: Providers established outside the EU are subject to Art.64 cooperation obligations through their EU-authorised representative (Art.22). The authorised representative must have the authority to ensure cooperation, which may require contractual mechanisms with the non-EU provider giving the representative enforceable access rights to production data and systems.
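For third-country providers, the practical question is whether the NCA has a reachable cooperation channel at all. A minimal sketch of that check, with assumed function and parameter names:

```python
from typing import Optional


def cooperation_channel(provider_in_eu: bool,
                        authorised_representative: Optional[str]) -> str:
    """Return who the NCA addresses for Art.64(3) cooperation.

    Illustrative only: a provider established outside the EU is reachable
    only through its Art.22 authorised representative, so a missing
    representative is itself a compliance gap.
    """
    if provider_in_eu:
        return "provider (direct)"
    if authorised_representative:
        return f"authorised representative: {authorised_representative}"
    return "NON-COMPLIANT: no Art.22 representative for third-country provider"


# Third-country provider with no representative: no lawful cooperation channel.
print(cooperation_channel(False, None))
```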


Art.64(4): Deployer Obligations to Facilitate Access

Art.64(4) extends cooperation obligations to deployers of high-risk AI systems. Deployers using systems covered by Art.64 must:

- grant the investigating authority access to deployment-environment data the provider does not hold, including production logs and deployment-specific configuration
- facilitate inspection of the system as operated in its real-world context
- cooperate with any measure the authority takes under Art.64 with respect to the deployed system

The deployer-facing Art.64(4) obligation reflects an enforcement reality: many conformity questions only become visible in the deployed environment. A high-risk AI system for medical diagnosis (Annex III, category 5) may produce anomalous outputs in a specific clinical workflow that was not represented in the provider's validation data. NCA investigation of that anomaly requires access to the deployer's production environment, real-world logs, and deployment-specific configuration — data the provider does not hold.

Deployer practical implications:

- include NCA investigation-access provisions in service agreements with providers and hosting platforms
- retain production logs and post-market monitoring data (Art.72) long enough to cover plausible investigation timelines
- define an internal escalation path from NCA notification to the team that will handle Art.64 cooperation


Art.64(5): Confidentiality and Trade Secret Protections

Art.64(5) places confidentiality obligations on market surveillance authorities with respect to information obtained under Art.64. Accessed information may only be used for enforcement purposes and must be protected in accordance with Union law on confidentiality of proceedings.

Three tiers of accessed information:

| Information type | Sensitivity level | Protections |
|---|---|---|
| Training data composition (structure, sources) | Medium | Confidential to investigation |
| Validation/test results | Medium | Confidential to investigation |
| Technical documentation (architecture, design choices) | High | Confidential + trade secret |
| Source code | Very high | Strongest protection — use/disclosure strictly limited |

Art.70 interaction: Art.70 of the Regulation establishes a general framework for protection of confidential information and trade secrets in AI Act enforcement proceedings. Art.64(5) is the Art.64-specific instantiation of those protections: information obtained specifically through data/documentation access carries the Art.70 safeguards without requiring the provider to separately assert them.

Provider rights: Providers may designate specific datasets or code components as trade secrets at the time of access, triggering enhanced confidentiality protections and a higher threshold for any subsequent disclosure. This designation should be made in writing at the time the NCA first accesses the material — retroactive designation creates procedural complications.
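The timing rule above (designate in writing at first access, not retroactively) can be captured in a small record. This is a hypothetical sketch; the class and field names are assumptions, not terms from the Regulation.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class TradeSecretDesignation:
    """Illustrative record: a written designation should be made no later
    than the date the NCA first accesses the material."""

    item: str
    designation_date: date
    first_access_date: date

    def is_timely(self) -> bool:
        # Retroactive designation (after first access) is procedurally weaker.
        return self.designation_date <= self.first_access_date


d = TradeSecretDesignation(
    item="inference pipeline source",
    designation_date=date(2026, 3, 10),
    first_access_date=date(2026, 3, 1),
)
print(d.is_timely())  # False: retroactive designation
```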


Art.64 vs Art.62: Data Access for High-Risk AI vs GPAI Models

Art.64 and Art.62 establish parallel data access regimes for different AI system categories:

| Dimension | Art.64 | Art.62 |
|---|---|---|
| Scope | High-risk AI systems (Chapter III, Annex III) | GPAI models (Chapter V) |
| Access authority | NCAs (national market surveillance authorities) | AI Office (Commission service) |
| Access triggers | NCA investigation of system conformity | AI Office investigation of model obligations |
| Source code access | Conditional — "where necessary" + reasoned request | Art.62 investigation powers |
| Cooperation obligation | Provider + deployer (Art.64(3)–(4)) | GPAI model provider (Art.62(1)) |
| Confidentiality | Art.64(5) + Art.70 | Art.70 |

Dual-scope scenario: A GPAI model integrated into a high-risk AI system for credit scoring (Annex III, category 5b) may trigger both Art.64 (NCA access to credit-scoring system data) and Art.62 (AI Office access to underlying GPAI model). In this scenario:

- the NCA exercises Art.64 access against the credit-scoring system and its provider
- the AI Office exercises Art.62 access against the provider of the underlying GPAI model
- providers facing parallel requests should track each request, its legal basis, and its confidentiality designations separately

EU-native infrastructure that maintains data within a single legal jurisdiction simplifies this dual-regulator scenario by ensuring both authorities can access data under EU legal framework — avoiding the jurisdictional complications of data distributed across US-headquartered cloud providers.
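Which access regime applies is a simple function of the system's classification. A minimal sketch, with an assumed helper name:

```python
def access_regimes(high_risk_system: bool, gpai_model: bool) -> list[str]:
    """Hypothetical helper: which data-access regime(s) apply to a system."""
    regimes = []
    if high_risk_system:
        regimes.append("Art.64 (NCA)")   # Chapter III high-risk system
    if gpai_model:
        regimes.append("Art.62 (AI Office)")  # Chapter V GPAI model
    return regimes or ["outside Art.62/Art.64 data-access scope"]


# GPAI model embedded in a high-risk credit-scoring system: both regimes apply.
print(access_regimes(True, True))  # ['Art.64 (NCA)', 'Art.62 (AI Office)']
```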


CLOUD Act Implications for Art.64 Data Access

The CLOUD Act (US Clarifying Lawful Overseas Use of Data Act) grants US government authorities the ability to compel US-incorporated cloud providers to produce data held overseas, including within the EU. This creates a structural tension with Art.64 enforcement integrity:

The dual-compellability risk:

| Data location | EU jurisdiction (Art.64) | US jurisdiction (CLOUD Act) |
|---|---|---|
| Training data on EU-incorporated cloud | Art.64 NCA access ✓ | CLOUD Act risk: NO (EU-incorporated provider not subject) |
| Training data on US-incorporated cloud (EU datacenter) | Art.64 NCA access ✓ | CLOUD Act risk: YES |
| Source code on US-incorporated repository | Art.64 NCA access ✓ | CLOUD Act risk: YES |
| Logs on EU-native infrastructure | Art.64 NCA access ✓ | CLOUD Act risk: NO |

Enforcement credibility: A high-risk AI system whose training data is accessible to US government agencies under CLOUD Act compulsion — potentially before, during, or after an EU NCA investigation — creates uncertainty about data integrity. Evidence obtained by an NCA under Art.64 could theoretically be affected by prior US access. EU-incorporated infrastructure eliminates this concern by placing data outside CLOUD Act reach entirely.

Art.64 compliance recommendation: Providers targeting EU markets with high-risk AI systems should assess whether their data supply chain — training data storage, model artefact hosting, log retention — is exposed to CLOUD Act compellability. Where it is, migration to EU-incorporated infrastructure eliminates the risk without requiring any change to the AI system itself.
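The exposure assessment recommended above can be sketched as a walk over the data supply chain. The table's key point is encoded in the predicate: exposure follows the provider's incorporation, not the datacenter's location. Class and function names here are illustrative assumptions.

```python
from dataclasses import dataclass


@dataclass
class DataStore:
    """Illustrative supply-chain entry (assumed fields)."""

    name: str                       # e.g. "training data", "model artefacts"
    provider_us_incorporated: bool  # subject to CLOUD Act compulsion?


def cloud_act_exposed(stores: list[DataStore]) -> list[str]:
    """Return the stores whose hosting provider is CLOUD Act-compellable.

    Note: a US-incorporated provider's EU datacenter is still exposed,
    so only incorporation is checked, not physical data location.
    """
    return [s.name for s in stores if s.provider_us_incorporated]


supply_chain = [
    DataStore("training data", provider_us_incorporated=True),
    DataStore("logs", provider_us_incorporated=False),
]
print(cloud_act_exposed(supply_chain))  # ['training data']
```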


Python Implementation: DataAccessRequest and ProviderCooperationRecord

from dataclasses import dataclass, field
from datetime import date
from enum import Enum
from typing import Optional


class AccessType(str, Enum):
    TRAINING_DATA = "training_data"
    TEST_DATA = "test_data"
    VALIDATION_DATA = "validation_data"
    TECHNICAL_DOCUMENTATION = "technical_documentation"
    LOGS = "logs"
    SOURCE_CODE = "source_code"  # Art.64(2) — conditional
    POST_MARKET_MONITORING = "post_market_monitoring"


class CooperationStatus(str, Enum):
    PENDING = "pending"
    IN_PROGRESS = "in_progress"
    FULFILLED = "fulfilled"
    PARTIALLY_FULFILLED = "partially_fulfilled"
    REFUSED = "refused"  # triggers enforcement escalation


@dataclass
class DataAccessRequest:
    """Represents an NCA Art.64 data/documentation access request."""

    request_id: str
    requesting_authority: str  # NCA name or "AI Office" (Art.62)
    system_id: str  # High-risk AI system identifier
    request_date: date
    access_types: list[AccessType]
    legal_basis: str = "Art.64 Regulation (EU) 2024/1689"
    source_code_reason: Optional[str] = None  # Required if SOURCE_CODE in access_types
    confidentiality_designated_items: list[str] = field(default_factory=list)
    trade_secret_items: list[str] = field(default_factory=list)

    def requires_reasoned_request(self) -> bool:
        """Art.64(2): source code access requires explicit reasoning."""
        return AccessType.SOURCE_CODE in self.access_types

    def validate(self) -> list[str]:
        issues = []
        if AccessType.SOURCE_CODE in self.access_types and not self.source_code_reason:
            issues.append(
                "Art.64(2): source_code_reason required when requesting source code access"
            )
        return issues

    def confidentiality_designation_summary(self) -> str:
        if not self.confidentiality_designated_items and not self.trade_secret_items:
            return "No items designated confidential or trade secret."
        lines = ["Art.64(5) + Art.70 designations:"]
        for item in self.confidentiality_designated_items:
            lines.append(f"  [CONFIDENTIAL] {item}")
        for item in self.trade_secret_items:
            lines.append(f"  [TRADE SECRET] {item}")
        return "\n".join(lines)


@dataclass
class ProviderCooperationRecord:
    """Tracks provider compliance with Art.64(3) cooperation obligation."""

    record_id: str
    request: DataAccessRequest
    provider_entity: str
    authorised_representative: Optional[str] = None  # Art.22 representative
    response_date: Optional[date] = None
    status: CooperationStatus = CooperationStatus.PENDING
    access_provided: list[AccessType] = field(default_factory=list)
    access_refused: list[AccessType] = field(default_factory=list)
    refusal_reasons: dict[AccessType, str] = field(default_factory=dict)
    notes: str = ""

    def is_compliant(self) -> bool:
        """Full compliance: all requested access types fulfilled without refusal."""
        return (
            self.status == CooperationStatus.FULFILLED
            and len(self.access_refused) == 0
        )

    def enforcement_risk(self) -> str:
        if self.status == CooperationStatus.REFUSED:
            return "HIGH — refusal of Art.64 access is standalone violation"
        if self.access_refused:
            return f"MEDIUM — partial refusal: {[a.value for a in self.access_refused]}"
        if self.status == CooperationStatus.PENDING and self.request.request_date:
            days_open = (date.today() - self.request.request_date).days
            if days_open > 30:
                return f"MEDIUM — request open {days_open} days without response"
        return "LOW"

    def cooperation_report(self) -> str:
        lines = [
            f"Art.64 Cooperation Record — {self.record_id}",
            f"Authority: {self.request.requesting_authority}",
            f"System: {self.request.system_id}",
            f"Request date: {self.request.request_date}",
            f"Status: {self.status.value}",
            f"Enforcement risk: {self.enforcement_risk()}",
        ]
        if self.access_provided:
            lines.append(f"Access provided: {[a.value for a in self.access_provided]}")
        if self.access_refused:
            lines.append(f"Access refused: {[a.value for a in self.access_refused]}")
        return "\n".join(lines)

Art.64 Compliance Readiness Checklist

| # | Item | Who | Timing |
|---|---|---|---|
| 1 | Inventory all training, test, and validation datasets by high-risk system — confirm retrievability within 5 business days | Provider | Before market placement |
| 2 | Confirm technical documentation (Art.11 + Annex IV) is complete and retrievable in the deployment Member State's language | Provider | Before market placement |
| 3 | Designate a technical point of contact for NCA Art.64 access requests — include contact in conformity declaration | Provider | Before market placement |
| 4 | Assess source code CLOUD Act exposure — if source code is on US-incorporated repositories, evaluate EU-native migration | Provider | Before market placement |
| 5 | Confirm Art.22 authorised representative has contractual authority to ensure provider cooperation with NCA data access | Provider (third-country) | Before EU market access |
| 6 | Verify deployment service agreements include provisions for NCA investigation access to logs and deployment-environment data | Deployer | Before deployment |
| 7 | Implement trade secret designation process: prepare written designation templates for source code, proprietary datasets | Provider | Before first potential investigation |
| 8 | Confirm post-market monitoring data (Art.72) is retained for minimum period consistent with NCA investigation timelines | Deployer | Ongoing |
| 9 | Review NCA investigation notification procedures — confirm internal escalation path from notification to Art.64 cooperation team | Provider + deployer | Before market placement |
| 10 | Conduct annual Art.64 readiness drill: simulate NCA data request, time retrieval of all Art.64(1) materials, identify gaps | Provider | Annually |
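Checklist item 10 (the annual readiness drill) can be sketched as a small harness that times retrieval of each Art.64(1) material class against a target. The function names and the retrieval callback are assumptions; plug in your own retrieval logic.

```python
import time

# Material classes an Art.64(1) request can cover (mirrors the AccessType enum above).
ART_64_1_MATERIALS = [
    "training_data", "test_data", "validation_data",
    "technical_documentation", "logs", "post_market_monitoring",
]


def run_readiness_drill(retrieve, target_seconds: float) -> dict[str, bool]:
    """Simulate an NCA request: call retrieve(material) for each material
    class and record whether retrieval met the target time."""
    results = {}
    for material in ART_64_1_MATERIALS:
        start = time.monotonic()
        retrieve(material)  # your retrieval logic: fetch dataset, docs, logs...
        results[material] = (time.monotonic() - start) <= target_seconds
    return results


# Example with a stub retriever that "finds" everything instantly.
report = run_readiness_drill(lambda m: None, target_seconds=5.0)
print(all(report.values()))  # True
```

In a real drill the target would be hours or days, not seconds, and the retriever would exercise the actual storage systems; the value of the drill is the gap list, not the pass mark.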

Series Context: Chapter IX Governance Framework

| Article | Coverage | Post |
|---|---|---|
| Art.57 | National Competent Authorities — designation, tasks, independence | Art.57 guide |
| Art.58 | NCA enforcement powers — investigation, access, corrective measures | Art.58 guide |
| Art.59 | AI Board — composition, independence, NCA coordination | Art.59 guide |
| Art.60 | EU AI database — public registry, EUID governance, Commission management | Art.60 guide |
| Art.61 | Scientific Panel — independent experts, model evaluation, AI Office advisory | Art.61 guide |
| Art.62 | AI Office enforcement powers — corrective measures, market withdrawal, emergency action | Art.62 guide |
| Art.63 | Advisory Forum — multi-stakeholder consultation, composition, tasks, CoP input | Art.63 guide |
| Art.64 | Access to data and documentation — market surveillance authority enforcement powers | This guide |
| Art.65 | Reporting of serious incidents and malfunctioning of high-risk AI systems | Art.65 guide |

EU AI Act Art.64 analysis based on Regulation (EU) 2024/1689 as published in the Official Journal of the European Union. Applicable from 2 August 2025 per Art.113(3). NCA-specific procedures for exercising Art.64 access powers will be established through national implementing measures; Member States may impose additional procedural requirements consistent with the Regulation. This guide reflects the text of the Regulation as enacted.