2026-04-12·13 min read·sota.io team

EU AI Act Article 16: The Complete Provider Obligations Checklist for High-Risk AI (2026)

You have used Article 6 to determine that your AI system is high-risk. Now what? Article 16 of the EU AI Act is the answer — a hub article that aggregates every obligation applicable to high-risk AI providers into a single enumerated list. It does not establish requirements itself. Instead, it points to the articles that do.

This design is intentional. Article 16 functions as the provider's compliance entry point — the article you read first when you discover your system is high-risk, and the checklist you return to before market placement to verify nothing was missed.

The obligations activate on August 2, 2026, when Chapter III of the EU AI Act begins applying to new high-risk AI systems placed on the EU market or put into service in the EU.


The Nine Provider Obligations Under Article 16

Article 16 enumerates the following obligations for every provider of a high-risk AI system:

(a) Comply with Chapter III Section 2 Requirements — Arts 9–15

The primary obligation: the AI system must satisfy all seven technical requirements in Chapter III Section 2. These are:

  1. Risk management system (Art.9)
  2. Data and data governance (Art.10)
  3. Technical documentation (Art.11)
  4. Record-keeping (Art.12)
  5. Transparency and provision of information to deployers (Art.13)
  6. Human oversight (Art.14)
  7. Accuracy, robustness and cybersecurity (Art.15)

None of these is optional. All seven must be satisfied before market placement. A system that meets six of the seven requirements is not compliant.
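In compliance tooling, the all-or-nothing rule is easy to encode as a gating check. A minimal sketch (the requirement labels follow Arts 9–15; `section2_ready` is an illustrative helper, not an official API):

```python
# The seven technical requirements of Chapter III Section 2 (Arts 9-15).
SECTION_2_REQUIREMENTS = {
    "Art.9": "Risk management system",
    "Art.10": "Data and data governance",
    "Art.11": "Technical documentation",
    "Art.12": "Record-keeping",
    "Art.13": "Transparency and provision of information to deployers",
    "Art.14": "Human oversight",
    "Art.15": "Accuracy, robustness and cybersecurity",
}


def section2_ready(satisfied: set) -> bool:
    """True only when every one of the seven requirements is satisfied.

    Six out of seven is non-compliant -- there is no partial credit.
    """
    return set(SECTION_2_REQUIREMENTS) <= satisfied


# A system missing Art.14 (human oversight) is not compliant:
missing_oversight = set(SECTION_2_REQUIREMENTS) - {"Art.14"}
```

Calling `section2_ready(missing_oversight)` returns `False`, which is exactly the point: the gate only opens on all seven.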

(b) Have a Quality Management System — Art.17

Providers must implement a quality management system (QMS) covering the entire lifecycle of the high-risk AI system. The QMS must be documented, proportionate to the provider's size and sector, and cover, at minimum:

  1. A regulatory compliance strategy, including conformity assessment and change-management procedures
  2. Design, development, quality control, and quality assurance techniques
  3. Examination, test, and validation procedures
  4. Data management systems and procedures
  5. The risk management system required by Art.9
  6. Post-market monitoring and serious incident reporting procedures
  7. Record keeping, resource management, and an accountability framework

For SMEs and individual developers, the Commission provides implementation guidelines under Art.96 that acknowledge proportionality. A large enterprise QMS and a startup QMS will look different — but both must exist.

Key point: The QMS is not a one-time document. It must be maintained and updated as the system evolves. Auditors will look for evidence that the QMS is a living operational process, not a static PDF filed and forgotten.

(c) Draw Up Technical Documentation — Art.11 + Annex IV

Technical documentation must be completed before placing the system on the market and kept up to date throughout the system lifecycle. Annex IV specifies eight mandatory sections:

  1. General description (purpose, intended use, version history)
  2. Detailed description of development elements (training data, algorithms, architecture)
  3. Monitoring, functioning, and control procedures
  4. Risk management documentation (per Art.9)
  5. Changes and post-certification modifications
  6. Standards and harmonized specifications applied
  7. Conformity assessment and external body information (where applicable)
  8. EU declaration of conformity

The documentation must be maintained for 10 years after market placement (Art.18). For embedded high-risk AI in regulated products, the timeline aligns with that product's existing documentation obligations.
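A documentation dossier can be gated on Annex IV completeness before market placement. A minimal sketch, with section keys paraphrased from the list above (`annex_iv_complete` is an illustrative helper, not an official schema):

```python
# The eight Annex IV sections, paraphrased as dossier keys.
ANNEX_IV_SECTIONS = (
    "general_description",
    "development_elements",
    "monitoring_and_control",
    "risk_management",
    "changes_and_modifications",
    "standards_applied",
    "conformity_assessment_info",
    "eu_declaration_of_conformity",
)


def annex_iv_complete(dossier: dict) -> list:
    """Return the Annex IV sections still missing or empty in a dossier."""
    return [s for s in ANNEX_IV_SECTIONS if not dossier.get(s, "").strip()]


# Example: a dossier drafted except for its risk management section.
draft = {s: "drafted" for s in ANNEX_IV_SECTIONS if s != "risk_management"}
```

Here `annex_iv_complete(draft)` reports `["risk_management"]` as the open gap.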

CLOUD Act implication: Technical documentation stored on US-jurisdiction cloud infrastructure (AWS, Azure, GCP) can be reached by US law enforcement under CLOUD Act orders. If your Annex IV documentation contains trade secrets, security architecture details, or proprietary model specifications, US-jurisdiction storage creates a disclosure risk. Storage with an EU-jurisdiction provider removes this particular exposure without affecting AI Act compliance.

(d) Keep Automatically Generated Logs — Art.12 + Art.19

The system must generate and retain automatic logs of its operation. Art.12 establishes the technical requirement (the system must be designed to generate logs). Art.19 establishes the provider's obligation to retain them.

Minimum log retention: 6 months (Art.19(1)), unless applicable Union or national law provides otherwise. Beyond that floor, logs must be kept for a period appropriate to the intended purpose of the system: in practice, at least for the duration of any pending legal proceedings or administrative actions concerning the system.
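The two retention clocks (logs under Art.19, technical documentation under Art.18) can be computed mechanically. A sketch, assuming a conservative 183-day reading of the six-month floor (both helpers are illustrative):

```python
from datetime import date, timedelta

# Retention floors from the Act: logs for at least 6 months (Art.19(1)),
# technical documentation for 10 years after market placement (Art.18).
LOG_RETENTION = timedelta(days=183)  # conservative ~6 months
DOC_RETENTION_YEARS = 10


def earliest_log_deletion(log_created: date) -> date:
    """Earliest date a log entry may lawfully be purged (floor only)."""
    return log_created + LOG_RETENTION


def doc_retention_until(market_placement: date) -> date:
    """Keep technical documentation until this date.

    Naive year arithmetic: a Feb 29 placement date would need handling.
    """
    return market_placement.replace(
        year=market_placement.year + DOC_RETENTION_YEARS
    )
```

Longer sector-specific or litigation-driven retention periods override both floors; these helpers compute only the minimum.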

Log contents must capture, at minimum, events relevant for:

  1. Identifying situations that may result in the system presenting a risk or in a substantial modification (Art.12(2))
  2. Facilitating post-market monitoring
  3. Monitoring the system's operation by deployers

For remote biometric identification systems, Art.12(3) adds specifics: the period of each use (start and end date and time), the reference database checked, the input data that led to a match, and the identity of the persons involved in verifying the result.

(e) Undergo Conformity Assessment — Art.43

Before placing the system on the market or putting it into service, providers must complete the applicable conformity assessment procedure. Two pathways exist:

Internal control (Annex VI): Available for most Annex III categories. The provider conducts the assessment internally, documents the process, and generates the EU declaration of conformity. No third party is required.

Third-party conformity assessment (Annex VII): Applies to biometric AI systems in Annex III Category 1 — remote biometric identification, biometric categorization, emotion recognition in regulated contexts. Where the provider has not applied harmonized standards or common specifications in full, a notified body must conduct the assessment (Art.43(1)); where they have been applied in full, the provider may choose between Annex VI and Annex VII.

Conformity assessment is not a checkbox — it must be redone when substantial changes are made to the system (Art.43(4)). A version update that affects the system's intended purpose, risk profile, or technical architecture triggers re-assessment.
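A change-control hook can make the Art.43(4) trigger explicit. A sketch using the three trigger dimensions named above (the function name and flags are illustrative, not statutory terms):

```python
def reassessment_required(
    changes_intended_purpose: bool,
    changes_risk_profile: bool,
    changes_technical_architecture: bool,
) -> bool:
    """Art.43(4) screen: a substantial change triggers a fresh assessment.

    These three flags mirror the trigger dimensions in the text above;
    a real change-control process will evaluate many more signals.
    """
    return any(
        (
            changes_intended_purpose,
            changes_risk_profile,
            changes_technical_architecture,
        )
    )
```

Wiring this into every release review makes "did this version update trigger re-assessment?" an explicit, recorded decision rather than an afterthought.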

(f) Register in the EU Database Before Placement — Art.49

High-risk AI systems must be registered in the EU AI Act database before being placed on the market or put into service in the EU. Registration is not post-market notification — it is a pre-condition for market entry.

The EU database registration requires:

  1. Provider name, address, and contact details (and those of the authorized representative, where applicable)
  2. The system's trade name and a description of its intended purpose
  3. A basic description of the information used by the system and its operating logic
  4. The system's market status and the Member States where it is made available
  5. Conformity assessment details, including any notified body certificate
  6. A copy of the EU declaration of conformity and electronic instructions for use

For embedded systems (high-risk AI in products covered by the Annex I Union harmonisation legislation), registration requirements may be fulfilled as part of the product registration. Standalone Annex III systems register independently.

(g) Take Corrective Actions — Art.20

Providers must actively monitor deployed systems and take corrective action when the system is found not to conform to requirements. Art.20 creates an affirmative duty — providers cannot wait for regulators to identify non-conformity.

Corrective action obligations include:

  1. Bringing the system into conformity, withdrawing it, disabling it, or recalling it, as appropriate
  2. Informing the distributors, deployers, authorized representative, and importers concerned of the non-conformity and of any corrective action taken
  3. Where the system presents a risk within the meaning of Art.79(1), immediately investigating the causes and informing the competent market surveillance authorities

Practical implication: A CI/CD pipeline that deploys model updates without a conformity check is a corrective action liability. Every update that affects the system's behavior in scope of its high-risk use must be evaluated for compliance impact before deployment.
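One way to close that gap is a hard gate in the deployment pipeline. A sketch, assuming the pipeline carries release metadata in a dict (`deploy_gate` and its keys are illustrative, not a standard CI/CD interface):

```python
class ComplianceGateError(RuntimeError):
    """Raised when a release fails the pre-deployment compliance gate."""


def deploy_gate(release: dict) -> None:
    """Refuse deployment unless the update's compliance impact was evaluated.

    `release` is illustrative CI/CD metadata: the pipeline must record that
    the update was assessed and, if the assessment found a substantial
    change, that a fresh conformity assessment was completed first.
    """
    if not release.get("compliance_impact_assessed", False):
        raise ComplianceGateError(
            "update not evaluated for compliance impact (Art.20 exposure)"
        )
    if release.get("substantial_change") and not release.get(
        "reassessment_complete"
    ):
        raise ComplianceGateError(
            "substantial change without a new conformity assessment (Art.43(4))"
        )
```

In a real pipeline this would run as a required CI step before any production rollout, so non-evaluated model updates simply cannot ship.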

(h) Affix CE Marking — Art.48

High-risk AI systems must bear the CE marking before being placed on the EU market. CE marking certifies that the system conforms to all applicable requirements — not just the EU AI Act, but all EU legislation that applies to the product.

For AI systems embedded in Annex I products (machinery, medical devices, vehicles), the CE marking follows the existing product CE marking process, with the AI Act requirements added as an additional compliance layer.

CE marking must be affixed to the AI system or its packaging and documentation in a visible, legible, and indelible manner. For software-only systems, CE marking appears in accompanying documentation and the instructions for use.

(i) Draw Up EU Declaration of Conformity — Art.47

The EU declaration of conformity (DoC) is a formal document in which the provider declares that the high-risk AI system conforms to all applicable requirements under the EU AI Act. It must be signed before the system is placed on the market.

Required DoC contents:

  1. The AI system's name, type, and any unambiguous reference (such as a version) allowing identification and traceability
  2. The provider's name and address (and, where applicable, those of the authorized representative)
  3. A statement that the declaration is issued under the sole responsibility of the provider
  4. A statement that the system conforms to the AI Act and, where applicable, to other relevant Union law
  5. References to the harmonized standards or common specifications applied
  6. Where applicable, the notified body's name, identification number, and certificate details
  7. The place and date of issue, and the name, function, and signature of the signatory

The DoC must be drawn up in one of the official EU languages. It must be kept for 10 years from market placement and made available to national authorities on request.
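A DoC can be modeled as a structured record so it is generated from compliance data rather than hand-edited. A minimal sketch; the field names paraphrase Annex V and are not the official wording:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional


@dataclass
class DeclarationOfConformity:
    """Minimal DoC record; field names paraphrase Annex V, not official text."""

    system_name: str
    provider_name: str
    provider_address: str
    sole_responsibility_statement: str = (
        "This declaration of conformity is issued under the sole "
        "responsibility of the provider."
    )
    harmonised_standards: List[str] = field(default_factory=list)
    notified_body: Optional[str] = None  # only for third-party assessed systems
    place_of_issue: str = ""
    signed_on: Optional[date] = None
    signatory: str = ""

    def ready_to_sign(self) -> bool:
        # A DoC must at least identify the system, the provider, and a signatory.
        return bool(self.system_name and self.provider_name and self.signatory)
```

Keeping the DoC as data also makes the 10-year retention obligation a storage policy rather than a filing-cabinet problem.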


Pre-Market vs Post-Market: The Two-Phase Obligation Split

Understanding which Article 16 obligations are pre-market prerequisites versus ongoing post-market duties is critical for planning compliance timelines.

Pre-Market (must complete before first sale or deployment)

| Obligation | Article | Action Required |
| --- | --- | --- |
| Technical requirements | Art.9–15 | Design and validate system against all 7 requirements |
| QMS established | Art.17 | Document and implement quality management system |
| Technical documentation | Art.11 + Annex IV | Complete all 8 Annex IV sections |
| Conformity assessment | Art.43 | Internal control or third-party assessment |
| EU declaration of conformity | Art.47 | Sign and date declaration |
| CE marking | Art.48 | Affix to system/documentation |
| EU database registration | Art.49 | Register before market placement |

If any of these is incomplete, market placement is unlawful.

Post-Market (ongoing obligations after deployment)

| Obligation | Article | Action Required |
| --- | --- | --- |
| Log retention | Art.19 | Retain auto-generated logs (min. 6 months) |
| Post-market monitoring | Art.72 | Implement monitoring plan |
| Corrective action | Art.20 | Identify and address non-conformity |
| Serious incident reporting | Art.73 | Report to authorities within 15 days |
| Cooperation with authorities | Art.21 | Provide documentation and access on request |

Who Is a "Provider" Under Art.16?

The definition of provider in Art.3(3) is broader than most developers assume:

"'provider' means a natural or legal person, public authority, agency or other body that develops an AI system or a general-purpose AI model or that has an AI system or a general-purpose AI model developed and places it on the market or puts the AI system into service under its own name or trademark"

This creates three distinct provider categories:

1. The developer who builds and sells. The company that creates a high-risk AI product and sells it to other businesses. Clear provider status.

2. The integrator who fine-tunes. An enterprise that takes a foundation model (GPT-4, Claude, Gemini) and fine-tunes it for a high-risk use case — HR screening, credit scoring, medical triage — is a provider for the resulting system. The base model vendor's compliance does not transfer.

3. The deployer who becomes a provider. When a deployer "makes substantial modifications" to a high-risk AI system, they become the provider of the modified system (Art.25(1)(b)). A substantial modification includes changes to intended purpose, changes that affect risk level, or changes requiring new conformity assessment.

What substantial modification is not: routine parameter updates within declared performance envelopes, bug fixes that do not affect system behavior, minor UI changes not affecting AI decision logic.
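The three routes to provider status can be expressed as a coarse triage screen for internal tooling; the real Art.3(3)/Art.25 analysis is fact-specific legal work. A sketch (function name and flags are illustrative):

```python
def provider_category(
    builds_and_sells: bool,
    fine_tunes_for_high_risk: bool,
    substantially_modifies: bool,
) -> str:
    """Map the three routes described above to a provider category.

    A coarse screen only -- it flags which Art.16 analysis to run,
    it does not replace legal review.
    """
    if builds_and_sells:
        return "developer-provider"
    if fine_tunes_for_high_risk:
        return "integrator-provider"
    if substantially_modifies:
        return "deployer-turned-provider"
    return "not a provider under this screen"
```

An enterprise fine-tuning a foundation model for HR screening lands in `integrator-provider`, which is exactly the category most teams fail to anticipate.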


Non-EU Providers: The Authorized Representative Requirement

Providers established outside the EU must appoint an EU-based authorized representative, by written mandate, before any market activity in the EU (Art.22). The authorized representative:

  1. Verifies that the EU declaration of conformity and technical documentation have been drawn up and that the conformity assessment was carried out
  2. Keeps the provider's contact details, the technical documentation, and the DoC at the disposal of competent authorities for 10 years after market placement
  3. Provides authorities, on reasoned request, with the information and documentation needed to demonstrate conformity
  4. Cooperates with competent authorities in any action taken in relation to the system
  5. Must terminate the mandate if it considers that the provider is acting contrary to its obligations under the Act

Practically: if you are a US, UK, or Canadian AI provider selling into the EU market, you need an EU legal entity or a formal authorized representative arrangement before your system can lawfully enter the EU market.

The authorized representative is not a compliance consultant. They must be empowered to make corrective action decisions and communicate with regulators — and they need the technical documentation to do so.


Supply Chain Liability: When Component Providers Become Obligated

Art.16 applies to the provider of the high-risk AI system. But the supply chain creates secondary obligations through Art.25:

Scenario 1 — Known high-risk use: You sell an AI component (image classification model, NLP pipeline, decision engine). Your customer integrates it into a high-risk application. You knew — or reasonably should have known — about the high-risk intended use. Art.25 analysis begins.

Scenario 2 — Contractual obligation shift: Providers and upstream vendors can contractually agree to shift certain Art.16 obligations upward. The Commission is expected to publish template contractual clauses for these arrangements. Until then, bespoke contractual structures are used.

Scenario 3 — White-label arrangements: If you develop a high-risk AI system that a customer places on the market under their own name, the customer is the provider under Art.3(3). But if you have actual control over the system design and the customer is purely distributing, regulators may look through the arrangement.

Developer implication: Document intended use in every B2B contract. If your component could reasonably be used in a high-risk application, either restrict the use contractually or ensure your technical documentation supports integration into a high-risk system.


Python Compliance Tooling

The following implementation demonstrates a provider obligation tracker:

from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class ObligationPhase(Enum):
    PRE_MARKET = "pre_market"
    POST_MARKET = "post_market"
    BOTH = "both"


class ObligationStatus(Enum):
    NOT_STARTED = "not_started"
    IN_PROGRESS = "in_progress"
    COMPLETE = "complete"
    BLOCKED = "blocked"


@dataclass
class ProviderObligation:
    name: str
    article: str
    phase: ObligationPhase
    status: ObligationStatus = ObligationStatus.NOT_STARTED
    evidence: List[str] = field(default_factory=list)
    notes: str = ""

    def is_blocking_market_placement(self) -> bool:
        return (
            self.phase in (ObligationPhase.PRE_MARKET, ObligationPhase.BOTH)
            and self.status != ObligationStatus.COMPLETE
        )


class Article16ComplianceTracker:
    """Tracks provider compliance with all Art.16 obligations."""

    def __init__(self, provider_name: str, system_name: str):
        self.provider_name = provider_name
        self.system_name = system_name
        self.obligations: List[ProviderObligation] = self._initialize_obligations()

    def _initialize_obligations(self) -> List[ProviderObligation]:
        return [
            ProviderObligation(
                "Technical Requirements (Arts 9-15)",
                "Art.9-15",
                ObligationPhase.PRE_MARKET,
            ),
            ProviderObligation(
                "Quality Management System",
                "Art.17",
                ObligationPhase.PRE_MARKET,
            ),
            ProviderObligation(
                "Technical Documentation (Annex IV)",
                "Art.11",
                ObligationPhase.BOTH,
            ),
            ProviderObligation(
                "Automatic Logging Capability",
                "Art.12",
                ObligationPhase.PRE_MARKET,
            ),
            ProviderObligation(
                "Conformity Assessment",
                "Art.43",
                ObligationPhase.PRE_MARKET,
            ),
            ProviderObligation(
                "EU Declaration of Conformity",
                "Art.47",
                ObligationPhase.PRE_MARKET,
            ),
            ProviderObligation(
                "CE Marking",
                "Art.48",
                ObligationPhase.PRE_MARKET,
            ),
            ProviderObligation(
                "EU Database Registration",
                "Art.49",
                ObligationPhase.PRE_MARKET,
            ),
            ProviderObligation(
                "Post-Market Monitoring Plan",
                "Art.72",
                ObligationPhase.POST_MARKET,
            ),
            ProviderObligation(
                "Log Retention (min. 6 months)",
                "Art.19",
                ObligationPhase.POST_MARKET,
            ),
            ProviderObligation(
                "Corrective Action Procedures",
                "Art.20",
                ObligationPhase.BOTH,
            ),
        ]

    def market_placement_cleared(self) -> bool:
        """Returns True only when all pre-market obligations are complete."""
        return not any(
            o.is_blocking_market_placement() for o in self.obligations
        )

    def blocking_obligations(self) -> List[ProviderObligation]:
        return [o for o in self.obligations if o.is_blocking_market_placement()]

    def update_status(
        self,
        obligation_name: str,
        status: ObligationStatus,
        evidence: Optional[List[str]] = None,
        notes: str = "",
    ) -> None:
        for ob in self.obligations:
            if ob.name == obligation_name:
                ob.status = status
                if evidence:
                    ob.evidence.extend(evidence)
                ob.notes = notes
                return
        raise ValueError(f"Obligation '{obligation_name}' not found")

    def compliance_report(self) -> dict:
        total = len(self.obligations)
        complete = sum(1 for o in self.obligations if o.status == ObligationStatus.COMPLETE)
        blocking = self.blocking_obligations()

        return {
            "provider": self.provider_name,
            "system": self.system_name,
            "total_obligations": total,
            "complete": complete,
            "completion_rate": f"{(complete/total)*100:.0f}%",
            "market_placement_cleared": self.market_placement_cleared(),
            "blocking_count": len(blocking),
            "blocking_obligations": [
                {"name": o.name, "article": o.article}
                for o in blocking
            ],
        }


# Usage
tracker = Article16ComplianceTracker(
    provider_name="Acme AI GmbH",
    system_name="CreditScoreAI v2.1",
)

tracker.update_status(
    "Technical Requirements (Arts 9-15)",
    ObligationStatus.COMPLETE,
    evidence=["risk_mgmt_system_v2.pdf", "data_governance_policy.pdf"],
    notes="Completed by compliance team 2026-07-01. Formal verification via TLA+ for Art.9.",
)

tracker.update_status(
    "Quality Management System",
    ObligationStatus.IN_PROGRESS,
    notes="ISO 9001 baseline being extended to cover AI Act Art.17 requirements.",
)

report = tracker.compliance_report()
print(f"Market placement cleared: {report['market_placement_cleared']}")
print(f"Blocking obligations: {report['blocking_count']}")
for b in report["blocking_obligations"]:
    print(f"  - {b['name']} ({b['article']})")

30-Item Provider Readiness Checklist

Section 1: Provider Identity and Classification (Items 1–6)

Section 2: Technical Requirements (Art.9–15) (Items 7–12)

Section 3: Quality Management and Documentation (Items 13–18)

Section 4: Conformity Assessment and Marking (Items 19–24)

Section 5: Post-Market and Corrective Action (Items 25–30)


Five Mistakes Providers Make with Article 16

Mistake 1: Reading Art.16 as the Requirement

Art.16 lists obligations but defines none of them. Providers who read only Art.16 and conclude they "understand" the requirements have missed the substance. Every Art.16 item is a pointer — you must read the referenced article to understand what is actually required.

Mistake 2: Sequential Execution of Pre-Market Obligations

The seven pre-market obligations are interdependent, not sequential. The conformity assessment (Art.43) validates that Arts 9-15 requirements are met — which means Arts 9-15 work must be complete before Art.43 can begin. Technical documentation (Art.11) records the output of Arts 9-15 work. The correct execution order is: Arts 9-15 in parallel → technical documentation capturing outputs → conformity assessment → declaration → CE marking → registration.
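That execution order can be encoded as gated stages, so tooling refuses to start a step whose prerequisites are incomplete. A sketch with illustrative stage names:

```python
# The dependency order from the text, as gated stages: each stage may start
# only after all of its prerequisites have completed.
PIPELINE = [
    ("section_2_work", set()),  # Arts 9-15, executed in parallel
    ("technical_documentation", {"section_2_work"}),
    ("conformity_assessment", {"technical_documentation"}),
    ("declaration_of_conformity", {"conformity_assessment"}),
    ("ce_marking", {"declaration_of_conformity"}),
    ("eu_database_registration", {"ce_marking"}),
]


def next_stages(done: set) -> list:
    """Stages whose prerequisites are all complete and which are not yet done."""
    return [name for name, deps in PIPELINE if deps <= done and name not in done]
```

Starting from nothing, the only available stage is the Arts 9-15 work; the declaration, CE marking, and registration unlock strictly in order behind it.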

Mistake 3: Treating EU Database Registration as Post-Market

Art.49 registration is a pre-market prerequisite. The system cannot lawfully be placed on the EU market before registration. Providers who plan for post-market registration are planning to be non-compliant at launch.

Mistake 4: Neglecting Substantial Modification Triggers

A model update that changes output accuracy by 5%, shifts the confidence distribution, or extends the system's intended purpose to a new Annex III category may constitute a "substantial modification" requiring a new conformity assessment. Without a formal change control process that evaluates this trigger, providers are exposed to cumulative drift into non-conformity.

Mistake 5: Assuming Deployer Obligations Transfer Compliance

When you supply a high-risk AI system to a deployer, the deployer takes on obligations under Art.26. This does not reduce your Art.16 obligations. Both provider and deployer have separate, concurrent obligations. A deployer's failure to comply with Art.26 does not create a defense for a provider's failure to comply with Art.16.


Key Dates

  1. August 1, 2024: the EU AI Act enters into force
  2. February 2, 2025: prohibitions on unacceptable-risk practices (Art.5) apply
  3. August 2, 2025: obligations for general-purpose AI models and the governance framework apply
  4. August 2, 2026: Art.16 and the high-risk regime apply to Annex III systems
  5. August 2, 2027: high-risk obligations extend to AI embedded in Annex I regulated products


Summary

Article 16 is not where you find the requirements — it is where you find the map. Nine obligations, each pointing to a separate article, together covering the complete lifecycle of a high-risk AI system from design through post-market monitoring.

The critical insight: all pre-market obligations are conditions precedent to lawful market placement. There is no grace period, no provisional compliance, and no soft launch provision. A high-risk AI system that enters the EU market before completing all pre-market Art.16 obligations is unlawfully placed on the market — regardless of how close to compliant it is.

For non-EU providers, the authorized representative requirement adds an additional structural prerequisite. For systems with biometric capabilities in Annex III Category 1, third-party conformity assessment under Annex VII applies whenever harmonized standards or common specifications have not been applied in full; that pathway cannot be self-assessed.

The Python implementation above provides a foundation for building Art.16 compliance tracking into your development workflow. Treat the 30-item checklist as a gate — not a guide.


Related articles: Art.5 Prohibited Practices · Art.6 High-Risk Classification · Art.9 Risk Management · Art.17 Quality Management System · Art.43 Conformity Assessment