EU AI Act Article 16: The Complete Provider Obligations Checklist for High-Risk AI (2026)
You have used Article 6 to determine that your AI system is high-risk. Now what? Article 16 of the EU AI Act is the answer — a hub article that aggregates every obligation applicable to high-risk AI providers into a single enumerated list. It does not establish the substantive requirements itself. Instead, each enumerated obligation points to the article that does.
This design is intentional. Article 16 functions as the provider's compliance entry point — the article you read first when you discover your system is high-risk, and the checklist you return to before market placement to verify nothing was missed.
The obligations activate on August 2, 2026, when Chapter III of the EU AI Act begins applying to new high-risk AI systems placed on the EU market or put into service in the EU.
The Nine Provider Obligations Under Article 16
Article 16 enumerates the following obligations for every provider of a high-risk AI system:
(a) Comply with Chapter III Section 2 Requirements — Arts 9–15
The primary obligation: the AI system must satisfy all technical requirements in Chapter III Section 2. These are:
- Art.9: Risk management system — continuous, documented, lifecycle-wide
- Art.10: Training, validation, and testing data requirements — data governance, bias management, EU-relevant datasets
- Art.11: Technical documentation per Annex IV — 9 required sections
- Art.12: Automatic logging capability — events, anomalies, human oversight triggers
- Art.13: Transparency and information provision to deployers — instructions for use
- Art.14: Human oversight — technical design enabling meaningful intervention
- Art.15: Accuracy, robustness, and cybersecurity — declared metrics, adversarial testing
None of these is optional. All seven must be satisfied before market placement. A system that meets six of the seven requirements is not compliant.
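Compliance with Section 2 is conjunctive, which makes the gate trivial to express. A minimal sketch (the article keys and status flags here are illustrative, not an API):

# Illustrative only: Section 2 conformity is all-or-nothing.
SECTION_2_STATUS = {
    "Art.9": True,   # risk management system
    "Art.10": True,  # data governance
    "Art.11": True,  # technical documentation
    "Art.12": True,  # logging capability
    "Art.13": True,  # transparency to deployers
    "Art.14": False, # human oversight: still open
    "Art.15": True,  # accuracy, robustness, cybersecurity
}

def section_2_compliant(status: dict) -> bool:
    """True only when every Art.9-15 requirement is satisfied."""
    return all(status.values())

print(section_2_compliant(SECTION_2_STATUS))  # False: six of seven is not compliant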
(b) Have a Quality Management System — Art.17
Providers must implement a quality management system (QMS) covering the entire lifecycle of the high-risk AI system. The QMS must be documented, proportionate to the provider's size and sector, and cover:
- Regulatory compliance strategy and approach
- Data management procedures
- Risk management integration
- Technical documentation procedures
- Conformity assessment process
- Change management and update procedures
- Post-market monitoring plan
For SMEs and individual developers, the Commission provides implementation guidelines under Art.96 that acknowledge proportionality. A large enterprise QMS and a startup QMS will look different — but both must exist.
Key point: The QMS is not a one-time document. It must be maintained and updated as the system evolves. Auditors will look for evidence that the QMS is a living operational process, not a static PDF filed and forgotten.
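One way to keep the QMS demonstrably alive is to track review dates on its constituent documents and flag anything stale. A hedged sketch, assuming a 180-day review cadence; the Act prescribes no specific interval:

from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class QMSDocument:
    name: str
    owner: str
    last_reviewed: date

def overdue_documents(docs, max_age_days: int = 180):
    """Return QMS documents not reviewed within the assumed cadence."""
    cutoff = date.today() - timedelta(days=max_age_days)
    return [d for d in docs if d.last_reviewed < cutoff]

docs = [
    QMSDocument("Data management procedures", "data-gov team", date(2026, 1, 15)),
    QMSDocument("Change management policy", "eng lead", date(2025, 6, 1)),
]
for d in overdue_documents(docs):
    print(f"Review overdue: {d.name} (owner: {d.owner})")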
(c) Draw Up Technical Documentation — Art.11 + Annex IV
Technical documentation must be completed before placing the system on the market and kept up to date throughout the system lifecycle. Annex IV specifies nine mandatory sections:
- General description (intended purpose, versions, hardware, instructions for use)
- Detailed description of the system elements and development process (training data, algorithms, architecture)
- Monitoring, functioning, and control procedures
- Appropriateness of the chosen performance metrics
- Risk management documentation (per Art.9)
- Relevant changes made through the system lifecycle
- Harmonized standards and common specifications applied, or the alternative solutions used
- Copy of the EU declaration of conformity
- Description of the post-market monitoring plan
The documentation must be maintained for 10 years after market placement (Art.18). For embedded high-risk AI in regulated products, the timeline aligns with that product's existing documentation obligations.
CLOUD Act implication: Technical documentation stored on US-jurisdiction cloud infrastructure (AWS, Azure, GCP) is discoverable under CLOUD Act warrants. If your Annex IV documentation contains trade secrets, security architecture details, or proprietary model specifications, US-jurisdiction storage creates discovery risk. EU-native storage eliminates this exposure without affecting compliance.
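Before conformity assessment begins, the documentation package itself can be gated on completeness. A minimal sketch, assuming a simple mapping from paraphrased Annex IV section keys to file paths (both the keys and paths are illustrative, not a prescribed layout):

# Section keys paraphrase Annex IV; file paths are hypothetical examples.
ANNEX_IV_SECTIONS = [
    "general_description",
    "development_elements",
    "monitoring_functioning_control",
    "performance_metrics_appropriateness",
    "risk_management",
    "lifecycle_changes",
    "standards_applied",
    "declaration_of_conformity",
    "post_market_monitoring_plan",
]

def missing_sections(package: dict) -> list:
    """Return Annex IV sections with no documentation artifact attached."""
    return [s for s in ANNEX_IV_SECTIONS if not package.get(s)]

package = {
    "general_description": "docs/annex_iv/general.md",
    "risk_management": "docs/annex_iv/risk_management.md",
}
gaps = missing_sections(package)
print(f"{len(gaps)} of {len(ANNEX_IV_SECTIONS)} sections missing: {gaps}")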
(d) Keep Automatically Generated Logs — Art.12 + Art.19
The system must generate and retain automatic logs of its operation. Art.12 establishes the technical requirement (the system must be designed to generate logs). Art.19 establishes the provider's obligation to retain them.
Minimum log retention: 6 months (Art.19(1)), unless sector-specific regulation requires longer. For biometric AI systems, logs must be kept for a period appropriate to the purpose — interpreted as the duration of any pending legal proceedings or administrative actions.
Log contents must capture the following (a minimal record schema is sketched after the list):
- System activity periods
- Reference database queries (for biometric systems)
- Verification of input data
- Results output including confidence indicators
- Human oversight interventions
- System anomalies
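A hedged sketch of a single log record carrying those fields; the field names and types are illustrative assumptions, not a format the Act prescribes:

from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class HighRiskAILogRecord:
    timestamp: datetime
    activity_period_id: str                   # groups records into system activity periods
    input_data_ref: str                       # supports verification of input data
    result: str                               # output, including confidence indicator
    confidence: float
    reference_db_query: Optional[str] = None  # biometric systems only
    human_intervention: Optional[str] = None  # oversight action taken, if any
    anomaly: Optional[str] = None             # detected anomaly, if any

record = HighRiskAILogRecord(
    timestamp=datetime.now(timezone.utc),
    activity_period_id="session-2026-08-02-0001",
    input_data_ref="sha256:3f1a...",
    result="credit_decision=declined",
    confidence=0.87,
)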
(e) Undergo Conformity Assessment — Art.43
Before placing the system on the market or putting it into service, providers must complete the applicable conformity assessment procedure. Two pathways exist:
Internal control (Annex VI): Available for most Annex III categories. The provider conducts the assessment internally, documents the process, and generates the EU declaration of conformity. No third party is required.
Third-party conformity assessment (Annex VII): Applies to biometric AI systems in Annex III point 1 — remote biometric identification, biometric categorization, emotion recognition in regulated contexts. Where the provider has applied harmonized standards (or common specifications) in full, it may choose between Annex VI and Annex VII; where it has not, Annex VII is mandatory and a notified body must conduct or participate in the assessment.
Conformity assessment is not a checkbox — it must be redone when substantial changes are made to the system (Art.43(4)). A version update that affects the system's intended purpose, risk profile, or technical architecture triggers re-assessment.
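That trigger can be encoded as a change-control check. A hedged sketch: the three criteria paraphrase Art.43(4) as described above, and the change-record fields are illustrative:

from dataclasses import dataclass

@dataclass
class SystemChange:
    description: str
    affects_intended_purpose: bool
    affects_risk_profile: bool
    affects_technical_architecture: bool

def requires_reassessment(change: SystemChange) -> bool:
    """Flag changes that may trigger a new conformity assessment (Art.43(4))."""
    return (
        change.affects_intended_purpose
        or change.affects_risk_profile
        or change.affects_technical_architecture
    )

change = SystemChange(
    description="Swap scoring model from GBM to transformer backbone",
    affects_intended_purpose=False,
    affects_risk_profile=True,
    affects_technical_architecture=True,
)
print(requires_reassessment(change))  # True: schedule re-assessment before release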
(f) Register in the EU Database Before Placement — Art.49
High-risk AI systems must be registered in the EU AI Act database before being placed on the market or put into service in the EU. Registration is not post-market notification — it is a pre-condition for market entry.
The EU database registration requires:
- Provider name and contact details
- System name and version
- Intended purpose and category under Annex III
- Countries of intended deployment
- Serious incident data (once available post-deployment)
For embedded systems (high-risk AI in Annex I regulated products), registration requirements may be fulfilled as part of the product registration. Standalone Annex III systems register independently.
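Because registration gates market entry, it can be checked like any other pre-market prerequisite. A small sketch over the required fields above; the field keys are illustrative labels, not names from the database schema:

REGISTRATION_FIELDS = [
    "provider_name_and_contact",
    "system_name_and_version",
    "intended_purpose_and_annex_iii_category",
    "deployment_countries",
    # serious-incident data is appended post-deployment, not at registration
]

def registration_ready(entry: dict) -> bool:
    """True when every field required at registration time is present."""
    return all(entry.get(f) for f in REGISTRATION_FIELDS)

entry = {"provider_name_and_contact": "Acme AI GmbH, Berlin",
         "system_name_and_version": "CreditScoreAI v2.1"}
print(registration_ready(entry))  # False: purpose and countries still missing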
(g) Take Corrective Actions — Art.20
Providers must actively monitor deployed systems and take corrective action when the system is found not to conform to requirements. Art.20 creates an affirmative duty — providers cannot wait for regulators to identify non-conformity.
Corrective action obligations include:
- Immediately taking the system off the market or restricting its use if it poses a risk
- Informing deployers and importers of the non-conformity and corrective measures
- Informing national market surveillance authorities of serious incidents within 15 days (Art.73)
- Cooperating with authorities investigating the system (Art.21)
Practical implication: A CI/CD pipeline that deploys model updates without a conformity check is a corrective action liability. Every update that affects the system's behavior in scope of its high-risk use must be evaluated for compliance impact before deployment.
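A minimal sketch of such a gate as the final pipeline step; the update-record fields are hypothetical names, not an established API:

def deployment_gate(update: dict) -> None:
    """Refuse to deploy behavior-affecting updates without a compliance review."""
    if update.get("affects_high_risk_behavior") and not update.get("compliance_review_passed"):
        raise RuntimeError(
            f"Deployment blocked for {update['version']}: "
            "compliance impact evaluation required before release"
        )

update = {"version": "v2.2.0", "affects_high_risk_behavior": True,
          "compliance_review_passed": False}
try:
    deployment_gate(update)
except RuntimeError as exc:
    print(exc)  # pipeline stops here until the review is done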
(h) Affix CE Marking — Art.48
High-risk AI systems must bear the CE marking before being placed on the EU market. CE marking certifies that the system conforms to all applicable requirements — not just the EU AI Act, but all EU legislation that applies to the product.
For AI systems embedded in Annex I products (machinery, medical devices, vehicles), the CE marking follows the existing product CE marking process, with the AI Act requirements added as an additional compliance layer.
CE marking must be affixed to the AI system or its packaging and documentation in a visible, legible, and indelible manner. For software-only systems, CE marking appears in accompanying documentation and the instructions for use.
(i) Draw Up EU Declaration of Conformity — Art.47
The EU declaration of conformity (DoC) is a formal document in which the provider declares that the high-risk AI system conforms to all applicable requirements under the EU AI Act. It must be signed before the system is placed on the market.
Required DoC contents:
- Provider name and address
- System name and version
- Statement of conformity with all relevant legislation
- Standards and harmonized specifications applied
- Notified body identification number (where applicable)
- Date and place of issue
- Provider representative signature
The DoC must be drawn up in one of the official EU languages. It must be kept for 10 years from market placement and made available to national authorities on request.
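A hedged sketch of a pre-signature completeness check over those contents; the keys are illustrative labels for the list above, not field names from the Act:

REQUIRED_DOC_FIELDS = [
    "provider_name_and_address",
    "system_name_and_version",
    "statement_of_conformity",
    "standards_applied",
    "date_and_place_of_issue",
    "signatory",
]

def doc_gaps(doc: dict, notified_body_involved: bool = False) -> list:
    """Return missing declaration-of-conformity fields before signature."""
    required = list(REQUIRED_DOC_FIELDS)
    if notified_body_involved:
        required.append("notified_body_number")  # only where a notified body took part
    return [f for f in required if not doc.get(f)]

draft = {"provider_name_and_address": "Acme AI GmbH, Berlin",
         "system_name_and_version": "CreditScoreAI v2.1"}
print(doc_gaps(draft))  # lists the fields still to complete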
Pre-Market vs Post-Market: The Two-Phase Obligation Split
Understanding which Article 16 obligations are pre-market prerequisites versus ongoing post-market duties is critical for planning compliance timelines.
Pre-Market (must complete before first sale or deployment)
| Obligation | Article | Action Required |
|---|---|---|
| Technical requirements | Art.9–15 | Design and validate system against all 7 requirements |
| QMS established | Art.17 | Document and implement quality management system |
| Technical documentation | Art.11 + Annex IV | Complete all 9 Annex IV sections |
| Conformity assessment | Art.43 | Internal control or third-party assessment |
| EU declaration of conformity | Art.47 | Sign and date declaration |
| CE marking | Art.48 | Affix to system/documentation |
| EU database registration | Art.49 | Register before market placement |
If any of these is incomplete, market placement is unlawful.
Post-Market (ongoing obligations after deployment)
| Obligation | Article | Action Required |
|---|---|---|
| Log retention | Art.19 | Retain auto-generated logs (min. 6 months) |
| Post-market monitoring | Art.72 | Implement monitoring plan |
| Corrective action | Art.20 | Identify and address non-conformity |
| Serious incident reporting | Art.73 | Report to authorities within 15 days |
| Cooperation with authorities | Art.21 | Provide documentation and access on request |
Who Is a "Provider" Under Art.16?
The definition of provider in Art.3(3) is broader than most developers assume:
"'provider' means a natural or legal person, public authority, agency or other body that develops an AI system or a general-purpose AI model or that has an AI system or a general-purpose AI model developed and places it on the market or puts the AI system into service under its own name or trademark"
This creates three distinct provider categories:
1. The developer who builds and sells. The company that creates a high-risk AI product and sells it to other businesses. Clear provider status.
2. The integrator who fine-tunes. An enterprise that takes a foundation model (GPT-4, Claude, Gemini) and fine-tunes it for a high-risk use case — HR screening, credit scoring, medical triage — is a provider for the resulting system. The base model vendor's compliance does not transfer.
3. The deployer who becomes a provider. When a deployer makes a "substantial modification" to a high-risk AI system, they become the provider of the modified system (Art.25(1)(b)). A substantial modification includes changes to intended purpose, changes that affect risk level, or changes requiring new conformity assessment.
What substantial modification is not: routine parameter updates within declared performance envelopes, bug fixes that do not affect system behavior, minor UI changes not affecting AI decision logic.
Non-EU Providers: The Authorized Representative Requirement
Providers established outside the EU must appoint an EU-based authorized representative before any market activity in the EU (Art.22). The authorized representative:
- Registers the AI system in the EU database on the provider's behalf
- Is the contact point for EU market surveillance authorities
- Must have a written mandate from the provider specifying the scope of their authority
- Bears joint liability for non-conforming systems in some circumstances
Practically: if you are a US, UK, or Canadian AI provider selling into the EU market, you need an EU legal entity or a formal authorized representative arrangement before your system can lawfully enter the EU market.
The authorized representative is not a compliance consultant. They must be empowered to make corrective action decisions and communicate with regulators — and they need the technical documentation to do so.
Supply Chain Liability: When Component Providers Become Obligated
Art.16 applies to the provider of the high-risk AI system. But the supply chain creates secondary obligations through Art.25:
Scenario 1 — Known high-risk use: You sell an AI component (image classification model, NLP pipeline, decision engine). Your customer integrates it into a high-risk application. You knew — or reasonably should have known — about the high-risk intended use. Art.25 analysis begins.
Scenario 2 — Contractual obligation shift: Providers and upstream vendors can contractually agree to shift certain Art.16 obligations upward. The Commission is expected to publish template contractual clauses for these arrangements. Until then, bespoke contractual structures are used.
Scenario 3 — White-label arrangements: If you develop a high-risk AI system that a customer places on the market under their own name, the customer is the provider under Art.3(3). But if you have actual control over the system design and the customer is purely distributing, regulators may look through the arrangement.
Developer implication: Document intended use in every B2B contract. If your component could reasonably be used in a high-risk application, either restrict the use contractually or ensure your technical documentation supports integration into a high-risk system.
Python Compliance Tooling
The following implementation demonstrates a provider obligation tracker:
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class ObligationPhase(Enum):
    PRE_MARKET = "pre_market"
    POST_MARKET = "post_market"
    BOTH = "both"


class ObligationStatus(Enum):
    NOT_STARTED = "not_started"
    IN_PROGRESS = "in_progress"
    COMPLETE = "complete"
    BLOCKED = "blocked"


@dataclass
class ProviderObligation:
    name: str
    article: str
    phase: ObligationPhase
    status: ObligationStatus = ObligationStatus.NOT_STARTED
    evidence: List[str] = field(default_factory=list)
    notes: str = ""

    def is_blocking_market_placement(self) -> bool:
        # Any pre-market (or dual-phase) obligation that is not complete
        # blocks lawful market placement.
        return (
            self.phase in (ObligationPhase.PRE_MARKET, ObligationPhase.BOTH)
            and self.status != ObligationStatus.COMPLETE
        )


class Article16ComplianceTracker:
    """Tracks provider compliance with all Art.16 obligations."""

    def __init__(self, provider_name: str, system_name: str):
        self.provider_name = provider_name
        self.system_name = system_name
        self.obligations: List[ProviderObligation] = self._initialize_obligations()

    def _initialize_obligations(self) -> List[ProviderObligation]:
        return [
            ProviderObligation(
                "Technical Requirements (Arts 9-15)",
                "Art.9-15",
                ObligationPhase.PRE_MARKET,
            ),
            ProviderObligation(
                "Quality Management System",
                "Art.17",
                ObligationPhase.PRE_MARKET,
            ),
            ProviderObligation(
                "Technical Documentation (Annex IV)",
                "Art.11",
                ObligationPhase.BOTH,
            ),
            ProviderObligation(
                "Automatic Logging Capability",
                "Art.12",
                ObligationPhase.PRE_MARKET,
            ),
            ProviderObligation(
                "Conformity Assessment",
                "Art.43",
                ObligationPhase.PRE_MARKET,
            ),
            ProviderObligation(
                "EU Declaration of Conformity",
                "Art.47",
                ObligationPhase.PRE_MARKET,
            ),
            ProviderObligation(
                "CE Marking",
                "Art.48",
                ObligationPhase.PRE_MARKET,
            ),
            ProviderObligation(
                "EU Database Registration",
                "Art.49",
                ObligationPhase.PRE_MARKET,
            ),
            ProviderObligation(
                "Post-Market Monitoring Plan",
                "Art.72",
                ObligationPhase.POST_MARKET,
            ),
            ProviderObligation(
                "Log Retention (min. 6 months)",
                "Art.19",
                ObligationPhase.POST_MARKET,
            ),
            ProviderObligation(
                "Corrective Action Procedures",
                "Art.20",
                ObligationPhase.BOTH,
            ),
        ]

    def market_placement_cleared(self) -> bool:
        """Returns True only when all pre-market obligations are complete."""
        return not any(
            o.is_blocking_market_placement() for o in self.obligations
        )

    def blocking_obligations(self) -> List[ProviderObligation]:
        return [o for o in self.obligations if o.is_blocking_market_placement()]

    def update_status(
        self,
        obligation_name: str,
        status: ObligationStatus,
        evidence: Optional[List[str]] = None,
        notes: str = "",
    ) -> None:
        for ob in self.obligations:
            if ob.name == obligation_name:
                ob.status = status
                if evidence:
                    ob.evidence.extend(evidence)
                ob.notes = notes
                return
        raise ValueError(f"Obligation '{obligation_name}' not found")

    def compliance_report(self) -> dict:
        total = len(self.obligations)
        complete = sum(
            1 for o in self.obligations if o.status == ObligationStatus.COMPLETE
        )
        blocking = self.blocking_obligations()
        return {
            "provider": self.provider_name,
            "system": self.system_name,
            "total_obligations": total,
            "complete": complete,
            "completion_rate": f"{(complete / total) * 100:.0f}%",
            "market_placement_cleared": self.market_placement_cleared(),
            "blocking_count": len(blocking),
            "blocking_obligations": [
                {"name": o.name, "article": o.article}
                for o in blocking
            ],
        }


# Usage
tracker = Article16ComplianceTracker(
    provider_name="Acme AI GmbH",
    system_name="CreditScoreAI v2.1",
)
tracker.update_status(
    "Technical Requirements (Arts 9-15)",
    ObligationStatus.COMPLETE,
    evidence=["risk_mgmt_system_v2.pdf", "data_governance_policy.pdf"],
    notes="Completed by compliance team 2026-07-01. Formal verification via TLA+ for Art.9.",
)
tracker.update_status(
    "Quality Management System",
    ObligationStatus.IN_PROGRESS,
    notes="ISO 9001 baseline being extended to cover AI Act Art.17 requirements.",
)

report = tracker.compliance_report()
print(f"Market placement cleared: {report['market_placement_cleared']}")
print(f"Blocking obligations: {report['blocking_count']}")
for b in report["blocking_obligations"]:
    print(f"  - {b['name']} ({b['article']})")
30-Item Provider Readiness Checklist
Section 1: Provider Identity and Classification (Items 1–6)
- 1. Confirmed legal entity status as a "provider" under Art.3(3) (developer placing system on market OR under own brand)
- 2. Completed high-risk classification analysis under Art.6 — documented classification pathway (6(1) or 6(2)) and exclusion analysis (6(3))
- 3. If non-EU provider: appointed EU authorized representative with written mandate covering Art.22 scope
- 4. If non-EU provider: authorized representative has access to full technical documentation
- 5. Identified all downstream deployers who will use the system in high-risk applications
- 6. Reviewed supply chain for upstream components — assessed whether any upstream vendor has Art.25 obligations
Section 2: Technical Requirements (Art.9–15) (Items 7–12)
- 7. Risk management system operational per Art.9 — iterative, documented, lifecycle-integrated
- 8. Training, validation, and testing data governance documented per Art.10 — bias detection evidence
- 9. Technical documentation complete per Art.11 + Annex IV — all 9 sections drafted and reviewed
- 10. Automatic logging capability implemented per Art.12 — log format, retention, access controls defined
- 11. Instructions for use drafted per Art.13 — deployer-facing documentation with all required information
- 12. Human oversight capability technically implemented per Art.14 — override, pause, and monitor functions
Section 3: Quality Management and Documentation (Items 13–18)
- 13. QMS established per Art.17 — scope, procedures, and ownership documented
- 14. QMS proportionality assessed — whether SME guidance (Art.96) applies
- 15. Technical documentation review cycle established — who updates, when, upon what triggers
- 16. Change management process documented — what triggers re-assessment under Art.43(4)
- 17. Technical documentation storage assessed for CLOUD Act exposure — EU-native storage confirmed or risk accepted
- 18. 10-year documentation retention plan established per Art.18
Section 4: Conformity Assessment and Marking (Items 19–24)
- 19. Conformity assessment pathway determined — internal control (Annex VI) or third-party (Annex VII; mandatory for Annex III point 1 biometrics where harmonized standards are not applied in full)
- 20. Conformity assessment procedure completed and documented
- 21. EU declaration of conformity drafted, signed, and dated per Art.47
- 22. CE marking affixed to system/documentation in correct format
- 23. EU database registration submitted and confirmation received per Art.49
- 24. Registration data accuracy verified — intended purpose, Annex III category, deployment countries
Section 5: Post-Market and Corrective Action (Items 25–30)
- 25. Post-market monitoring plan established per Art.72 — metrics, thresholds, escalation paths
- 26. Log retention process operational — 6-month minimum, extended for regulated sectors
- 27. Corrective action procedures documented — decision tree for non-conformity identification and response
- 28. Serious incident reporting workflow established per Art.73 — 15-day notification timeline
- 29. Deployer notification process established — how to communicate non-conformity and corrective measures
- 30. Authority cooperation protocol established — who responds to MSA requests, documentation access procedure
Five Mistakes Providers Make with Article 16
Mistake 1: Reading Art.16 as the Requirement
Art.16 lists obligations but defines none of them. Providers who read only Art.16 and conclude they "understand" the requirements have missed the substance. Every Art.16 item is a pointer — you must read the referenced article to understand what is actually required.
Mistake 2: Sequential Execution of Pre-Market Obligations
The seven pre-market obligations are interdependent, not sequential. The conformity assessment (Art.43) validates that Arts 9-15 requirements are met — which means Arts 9-15 work must be complete before Art.43 can begin. Technical documentation (Art.11) records the output of Arts 9-15 work. The correct execution order is: Arts 9-15 in parallel → technical documentation capturing outputs → conformity assessment → declaration → CE marking → registration.
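That ordering can be expressed as a simple dependency walk. A sketch in which the step keys are shorthand for the articles named above:

from typing import Optional

PRE_MARKET_ORDER = [
    "section_2_requirements",     # Arts 9-15, parallelizable internally
    "technical_documentation",    # Art.11 + Annex IV, captures Section 2 outputs
    "conformity_assessment",      # Art.43, validates the above
    "declaration_of_conformity",  # Art.47
    "ce_marking",                 # Art.48
    "eu_database_registration",   # Art.49
]

def next_pre_market_step(completed: set) -> Optional[str]:
    """Return the earliest pre-market step whose predecessors are done."""
    for step in PRE_MARKET_ORDER:
        if step not in completed:
            return step
    return None  # all pre-market steps complete; placement cleared

done = {"section_2_requirements", "technical_documentation"}
print(next_pre_market_step(done))  # conformity_assessment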
Mistake 3: Treating EU Database Registration as Post-Market
Art.49 registration is a pre-market prerequisite. The system cannot lawfully be placed on the EU market before registration. Providers who plan for post-market registration are planning to be non-compliant at launch.
Mistake 4: Neglecting Substantial Modification Triggers
A model update that changes output accuracy by 5%, shifts the confidence distribution, or extends the system's intended purpose to a new Annex III category may constitute a "substantial modification" requiring a new conformity assessment. Without a formal change control process that evaluates this trigger, providers are exposed to cumulative drift into non-conformity.
Mistake 5: Assuming Deployer Obligations Transfer Compliance
When you supply a high-risk AI system to a deployer, the deployer takes on obligations under Art.26. This does not reduce your Art.16 obligations. Both provider and deployer have separate, concurrent obligations. A deployer's failure to comply with Art.26 does not create a defense for a provider's failure to comply with Art.16.
Key Dates
- August 2, 2026: Chapter III, Section 2 obligations activate for new high-risk AI systems. All Art.16 pre-market requirements must be satisfied before this date for any system intending EU market entry after this date.
- August 2, 2027: Transitional period ends for high-risk AI systems that are components of Annex I regulated products (machinery, medical devices, vehicles, etc.).
- 2028+: Commission evaluates Annex III categories for potential expansion via delegated acts (Art.7). Additional categories could add new providers to the Art.16 obligation scope.
Summary
Article 16 is not where you find the requirements — it is where you find the map. Nine obligations, each pointing to a separate article, together covering the complete lifecycle of a high-risk AI system from design through post-market monitoring.
The critical insight: all pre-market obligations are conditions precedent to lawful market placement. There is no grace period, no provisional compliance, and no soft launch provision. A high-risk AI system that enters the EU market before completing all pre-market Art.16 obligations is unlawfully placed on the market — regardless of how close to compliant it is.
For non-EU providers, the authorized representative requirement adds an additional structural prerequisite. For systems with biometric capabilities in Annex III point 1, third-party conformity assessment under Annex VII applies wherever harmonized standards have not been applied in full; those systems cannot simply be self-assessed.
The Python implementation above provides a foundation for building Art.16 compliance tracking into your development workflow. Treat the 30-item checklist as a gate — not a guide.
Related articles: Art.5 Prohibited Practices · Art.6 High-Risk Classification · Art.9 Risk Management · Art.17 Quality Management System · Art.43 Conformity Assessment