EU AI Act Art.17 Quality Management System: Developer Guide (QMS for High-Risk AI 2026)
EU AI Act Article 17 is the organisational backbone of the high-risk AI compliance framework: every provider of a high-risk AI system must establish, implement, document, and maintain a Quality Management System. Unlike Arts.9–15, which specify technical obligations at the system level, Art.17 operates at the organisation level — it governs the processes, governance structures, and documentation systems that ensure compliance is systematic rather than incidental.
This guide covers:
- Art.17(1)–(2) QMS scope and the 8 mandatory elements
- ISO/IEC 42001 (AI Management System) mapping as the leading standard for QMS evidence
- ISO 9001 integration paths
- the SMB proportionality principle
- the QMS × Art.9 risk management intersection and QMS × Art.72 post-market monitoring
- CLOUD Act jurisdiction risk for 10-year QMS documentation retention
- Python implementation: QMSComplianceChecker, QMSDocumentRegister, and Art17AuditTrail
- the 40-item Art.17 compliance checklist
Art.17 in the High-Risk AI Compliance Chain
Art.17 is the process governance layer that sits above the technical obligations of Arts.9–15:
| Article | Obligation | Art.17 Interface |
|---|---|---|
| Art.9 | Risk Management System | Art.17(1)(b) requires a risk management plan; Art.9 is the technical standard for that plan |
| Art.10 | Training Data Governance | Art.17(1)(c) design verification covers training data quality gates |
| Art.11 | Technical Documentation | Art.17(2) QMS documentation constitutes part of the Annex IV technical dossier |
| Art.12 | Logging & Record-Keeping | Art.17 QMS processes generate Art.12 mandatory log events; QMS records are Art.12 records |
| Art.13 | Transparency & IFU | Art.17(1)(f) communication procedures produce the Art.13 instructions-for-use |
| Art.14 | Human Oversight | Art.17(1)(e) post-market monitoring feeds back into Art.14 oversight design updates |
| Art.15 | Accuracy, Robustness, Cybersecurity | Art.17(1)(h) recalibration procedure addresses Art.15 performance degradation |
| Art.17 | Quality Management System | This article |
| Art.72 | Post-Market Monitoring | Art.17(1)(e) explicitly includes PMM as a QMS element |
Art.17 is unique in the high-risk AI compliance arc because it is meta-compliance: it governs how all other compliance obligations are managed, evidenced, and maintained over time.
Art.17(1) — Scope: All High-Risk AI Providers
"Providers of high-risk AI systems shall put in place a quality management system that ensures compliance with this Regulation. That system shall be documented in a systematic and orderly manner in the form of written policies, procedures and instructions, and shall encompass at least the following aspects..."
Who must comply:
- All providers placing high-risk AI systems on the EU market (Annex III categories)
- Natural persons, legal persons, public authorities, and agencies
- Third-country providers whose systems are used in the EU (Art.2(1)(c))
Timeline:
- GPAI models: August 2025
- Annex III High-Risk AI systems: August 2026 (general)
- Annex I (safety components of regulated products): aligned with product regulation timelines
Scope of "documented": Art.17(1) specifies "written policies, procedures and instructions" — verbal or informal QMS does not satisfy the requirement. Documentation must be systematic and orderly, meaning a discoverable document management system, not ad-hoc files.
Art.17(1)(a)–(h) — The 8 Mandatory QMS Elements
Art.17(1)(a)–(h) specifies the minimum elements. Each is a mandatory component of the written QMS.
Element (a) — Compliance Strategy
Requirement: A strategy for regulatory compliance including compliance with conformity assessment procedures and procedures for managing modifications to the high-risk AI system.
Implementation:
- Documented compliance roadmap covering all applicable Annex III obligations
- Change management procedure specifying when a system modification triggers re-assessment under Art.43
- Role assignments: who owns compliance decisions, who approves changes
- EU-native hosting angle: strategy must address hosting jurisdiction (EU-native = single GDPR/AI Act regime)
Documentation artefacts:
- Compliance strategy document (version-controlled)
- Change management SOP with AI Act modification triggers
- Responsible AI officer designation record
Element (b) — Risk Management Plan
Requirement: Techniques and procedures for the design of the high-risk AI system and the examination of the design of such systems.
Implementation:
- Art.9 risk management system is the technical instantiation of this element
- QMS element (b) is the procedural wrapper: who runs the risk assessment, when, how often, and how findings are escalated
- Design examination procedure: who reviews architectural decisions for compliance risk
- Risk acceptance criteria and escalation thresholds
Art.17 × Art.9 relationship: Art.9 specifies what risks must be managed (foreseeable misuse, reasonably foreseeable risks). Art.17(1)(b) specifies the organisational procedure for conducting that management. Both must be documented; Art.9 compliance without an Art.17(1)(b) procedure document is incomplete.
Element (c) — Design Verification and Validation
Requirement: Examination of the design of the high-risk AI system.
Implementation:
- Design review gates at key development milestones (architecture, implementation, pre-deployment)
- Validation test plans covering functional requirements and AI Act performance obligations (Art.15)
- Traceability matrix: requirements → implementation → test coverage
- EU formal methods angle: Frama-C, Isabelle, Astrée verification artefacts satisfy Art.17(1)(c) at the highest assurance level
Documentation artefacts:
- Design review records (who reviewed, when, findings, sign-off)
- Test plans and test results (linked to Art.15 accuracy declarations)
- Verification and validation summary report
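The traceability matrix can be enforced programmatically at each design review gate. A minimal sketch, with hypothetical requirement and test identifiers (`Requirement` and `untraced_requirements` are illustrative names, not from the Act):

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str        # hypothetical ID scheme, e.g. "ART15-ACC-01"
    description: str
    test_ids: list[str] = field(default_factory=list)  # linked test evidence

def untraced_requirements(requirements: list[Requirement]) -> list[str]:
    """Art.17(1)(c) design review gate: every requirement needs linked test evidence."""
    return [r.req_id for r in requirements if not r.test_ids]

reqs = [
    Requirement("ART15-ACC-01", "Accuracy >= declared baseline", ["T-001", "T-014"]),
    Requirement("ART15-ROB-02", "Robust to input perturbation", []),
]
print(untraced_requirements(reqs))  # ['ART15-ROB-02'] would block sign-off
```

A non-empty result is exactly the kind of finding a design review record should capture: which requirements lack evidence, and who signed off on closing the gap.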
Element (d) — Resource and Competence Management
Requirement: Examination and testing of the high-risk AI system and related procedures, including the examination of technical solutions adopted to ensure compliance with this Regulation.
Practical interpretation (Recital 74 context): Art.17(1)(d) is understood to require resource allocation and competence management — ensuring that persons responsible for compliance have the knowledge, training, and tools to discharge their obligations.
Implementation:
- Competence matrix: roles × AI Act knowledge requirements
- Training records for compliance-related staff
- Tool inventory: what compliance tooling (logging frameworks, testing tools) is in place
- Supplier competence: contractual AI Act obligations flowing down to data suppliers and sub-processors
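The competence matrix itself can be kept as data and queried during audits. A sketch with hypothetical role and knowledge-area identifiers (the filled-in matrix is the Art.17(1)(d) documentation artefact; the code is only a convenience):

```python
# Hypothetical role and knowledge-area identifiers.
COMPETENCE_MATRIX: dict[str, set[str]] = {
    "ml_engineer": {"art10_data_governance", "art15_testing"},
    "compliance_officer": {"art43_conformity", "art73_reporting", "art12_logging"},
}

def competence_gaps(role: str, required: set[str]) -> set[str]:
    """Knowledge areas a role must cover but has no training record for."""
    return required - COMPETENCE_MATRIX.get(role, set())

print(competence_gaps("ml_engineer", {"art10_data_governance", "art9_risk"}))
# {'art9_risk'} -> schedule training before the next design review gate
```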
Element (e) — Post-Market Monitoring Plan
Requirement: A post-market monitoring system, as referred to in Article 72.
Art.17 × Art.72 relationship: Art.72 requires an active post-market monitoring plan. Art.17(1)(e) integrates this into the QMS so that PMM is not an afterthought but a governed, funded, and staffed programme.
Implementation:
- PMM plan document (deployed users, feedback channels, monitoring frequency)
- Serious incident reporting procedure (Art.73 timelines: 15 days for a serious incident, 10 days in the event of death, 2 days for a widespread infringement)
- Performance drift detection: statistical monitoring of model accuracy against Art.15 declared levels
- Feedback loop: PMM findings → Art.9 risk register → Art.17(1)(h) recalibration
```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from enum import Enum
from typing import Optional

class IncidentSeverity(Enum):
    DEATH = "death"                 # Art.73(4): 10-day report
    SERIOUS_HARM = "serious_harm"   # Art.73(2): 15-day report
    WIDESPREAD = "widespread"       # Art.73(3): 2-day report (widespread infringement)
    MINOR = "minor"                 # Internal tracking only

# Art.73 reporting clocks in days, keyed by severity.
ART73_DEADLINES_DAYS = {
    IncidentSeverity.DEATH: 10,
    IncidentSeverity.SERIOUS_HARM: 15,
    IncidentSeverity.WIDESPREAD: 2,
}

@dataclass
class PostMarketIncident:
    incident_id: str
    system_id: str
    detected_at: datetime
    severity: IncidentSeverity
    description: str
    affected_persons: int
    reported_to_authority: bool = False
    authority_report_deadline: Optional[datetime] = None

    def __post_init__(self):
        days = ART73_DEADLINES_DAYS.get(self.severity)
        if days is not None:
            self.authority_report_deadline = self.detected_at + timedelta(days=days)

@dataclass
class PostMarketMonitoringPlan:
    system_id: str
    monitoring_frequency_days: int          # How often metrics are reviewed
    accuracy_baseline: float                # Art.15 declared accuracy level
    accuracy_drift_threshold: float = 0.05  # 5% drop triggers recalibration
    incidents: list[PostMarketIncident] = field(default_factory=list)

    def check_reporting_deadlines(self) -> list[PostMarketIncident]:
        """Returns incidents with overdue or imminent (within 2 days) reporting deadlines."""
        overdue = []
        now = datetime.utcnow()
        for incident in self.incidents:
            if (incident.authority_report_deadline
                    and not incident.reported_to_authority
                    and now >= incident.authority_report_deadline - timedelta(days=2)):
                overdue.append(incident)
        return overdue
```
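The accuracy drift threshold above can also be operationalised as a standalone recalibration trigger. A sketch; the 5% relative-drop default is the PMM plan's own parameter, not a figure from the Act, and `drift_exceeds_threshold` is a hypothetical helper name:

```python
def drift_exceeds_threshold(baseline: float, observed: float,
                            threshold: float = 0.05) -> bool:
    """Art.17(1)(h) recalibration trigger: relative accuracy drop
    against the Art.15 declared baseline exceeds the plan's threshold."""
    if baseline <= 0:
        raise ValueError("baseline accuracy must be positive")
    return (baseline - observed) / baseline > threshold

print(drift_exceeds_threshold(0.90, 0.84))  # True: ~6.7% relative drop
print(drift_exceeds_threshold(0.90, 0.87))  # False: ~3.3% relative drop
```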
Element (f) — Communication Procedures
Requirement: Communication procedures to inform the competent national authorities and other relevant bodies in case of a serious incident or noncompliance.
Implementation:
- Incident communication SOP: who contacts which authority, within what timeframe
- Internal escalation chain: development → compliance officer → legal → CEO (for material incidents)
- Authority contact registry: relevant national market surveillance authority (MSA) per Member State of operation
- Art.73 reporting templates: structured reports meeting MSA requirements
- Art.86 right to explanation: procedure for responding to individual explanation requests
Dual reporting complexity (Art.17 × Art.73 × NIS2 Art.23):
For AI systems deployed in critical infrastructure, a single incident may trigger:
- AI Act Art.73: 15 days to MSA (serious incident)
- NIS2 Art.23: 24-hour early warning + 72-hour incident report to CSIRT/NCA
- GDPR Art.33: 72 hours to DPA (if personal data involved)
Art.17(1)(f) communication procedure must cover all three simultaneously.
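The three clocks can be computed from a single detection timestamp so the communication SOP always surfaces the earliest deadline first. A simplified sketch: each regime actually starts its clock at "awareness", which it defines slightly differently, and `reporting_deadlines` is a hypothetical helper:

```python
from datetime import datetime, timedelta

def reporting_deadlines(detected_at: datetime) -> dict[str, datetime]:
    """Worst-case notification clocks for one incident hitting all three regimes."""
    return {
        "nis2_early_warning_art23": detected_at + timedelta(hours=24),
        "nis2_incident_report_art23": detected_at + timedelta(hours=72),
        "gdpr_breach_art33": detected_at + timedelta(hours=72),
        "ai_act_serious_incident_art73": detected_at + timedelta(days=15),
    }

t0 = datetime(2026, 9, 1, 9, 0)
for name, deadline in sorted(reporting_deadlines(t0).items(), key=lambda kv: kv[1]):
    print(f"{deadline:%Y-%m-%d %H:%M}  {name}")
```

Sorting by deadline makes the NIS2 24-hour early warning the first action item, which is the practical point: the AI Act clock is never the binding constraint when NIS2 applies.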
Element (g) — Incident Reporting System
Requirement: Incident recording and reporting procedures.
Implementation:
- Incident register: structured log of all incidents, near-misses, and deviations
- Art.12-compliant logging: incident log entries must satisfy Art.12 technical requirements (system_id, operator_id, event_type, timestamp_utc, input_hash, output_reference)
- Root cause analysis (RCA) procedure
- Corrective and preventive action (CAPA) tracking
```python
from dataclasses import dataclass, field
from datetime import datetime
from hashlib import sha256
from enum import Enum
from typing import Optional
import json

class Art17EventType(Enum):
    PERFORMANCE_DEGRADATION = "performance_degradation"
    HUMAN_OVERSIGHT_OVERRIDE = "human_oversight_override"  # Art.14
    SERIOUS_INCIDENT = "serious_incident"                  # Art.73
    NEAR_MISS = "near_miss"
    NONCOMPLIANCE_DETECTED = "noncompliance_detected"
    CORRECTIVE_ACTION = "corrective_action"
    RECALIBRATION = "recalibration"                        # Art.17(1)(h)

@dataclass
class Art17AuditTrail:
    """Art.17(1)(g) incident register entry — Art.12-compliant."""
    system_id: str
    operator_id: str
    event_type: Art17EventType
    description: str
    input_data_sample: Optional[str] = None  # Hashed per Art.12 GDPR intersection
    output_reference: Optional[str] = None
    rca_summary: Optional[str] = None
    capa_action: Optional[str] = None
    timestamp_utc: datetime = field(default_factory=datetime.utcnow)

    def to_log_entry(self) -> dict:
        """Produces an Art.12-compliant log record."""
        input_hash = (
            sha256(self.input_data_sample.encode()).hexdigest()
            if self.input_data_sample else None
        )
        return {
            "system_id": self.system_id,
            "operator_id": self.operator_id,
            "event_type": self.event_type.value,
            "timestamp_utc": self.timestamp_utc.isoformat() + "Z",
            "input_hash": input_hash,
            "output_reference": self.output_reference,
            "description": self.description,
            "rca_summary": self.rca_summary,
            "capa_action": self.capa_action,
        }

    def compute_chain_hash(self, previous_hash: str) -> str:
        """Tamper-evident hash chain for the Art.12(3) append-only requirement."""
        entry = json.dumps(self.to_log_entry(), sort_keys=True)
        combined = previous_hash + entry
        return sha256(combined.encode()).hexdigest()
```
Element (h) — Recalibration Procedure
Requirement: Procedures for the recalibration, updating and continuous improvement of the high-risk AI system.
Implementation:
- Recalibration trigger conditions: performance drift below Art.15 baseline, new risk information from PMM, regulatory guidance updates
- Change classification: minor update (no re-assessment) vs substantial modification (re-assessment required per Art.43)
- Validation gate before re-deployment: recalibrated model must pass Art.15 accuracy and robustness tests
- Lifecycle documentation: recalibration events are Art.11 technical documentation entries
Art.17(1)(h) × Art.43 modification trigger:
Not every recalibration triggers a new conformity assessment. The threshold is a "substantial modification" as defined in Art.3(23): a change not foreseen in the initial conformity assessment that affects compliance with the essential requirements or modifies the intended purpose. The Art.17(1)(h) procedure must operationalise this distinction:
| Modification Type | Art.17 Action | Art.43 Trigger |
|---|---|---|
| Hyperparameter tuning (same architecture) | Document + validate | No |
| New training data (same distribution) | Document + validate | No |
| New training data (significantly different distribution) | Full Art.9 re-assessment | Yes |
| Architecture change | Full technical documentation update | Yes |
| New intended purpose | New high-risk classification check | Yes |
| Security patch (no functional change) | Document + log | No |
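The table above can be encoded directly so the change management procedure applies it consistently. A sketch; the enum and function names are illustrative, and borderline cases (e.g. how "significantly different distribution" is measured) still need human review:

```python
from enum import Enum

class ModificationType(Enum):
    HYPERPARAMETER_TUNING = "hyperparameter_tuning"
    NEW_DATA_SAME_DISTRIBUTION = "new_data_same_distribution"
    NEW_DATA_SHIFTED_DISTRIBUTION = "new_data_shifted_distribution"
    ARCHITECTURE_CHANGE = "architecture_change"
    NEW_INTENDED_PURPOSE = "new_intended_purpose"
    SECURITY_PATCH = "security_patch"

# Direct encoding of the modification table.
ART43_REASSESSMENT_REQUIRED = {
    ModificationType.NEW_DATA_SHIFTED_DISTRIBUTION,
    ModificationType.ARCHITECTURE_CHANGE,
    ModificationType.NEW_INTENDED_PURPOSE,
}

def triggers_art43(modification: ModificationType) -> bool:
    """Art.17(1)(h) gate: does this change count as a substantial modification?"""
    return modification in ART43_REASSESSMENT_REQUIRED

print(triggers_art43(ModificationType.SECURITY_PATCH))       # False
print(triggers_art43(ModificationType.ARCHITECTURE_CHANGE))  # True
```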
ISO/IEC 42001 — The Leading AI QMS Standard
ISO/IEC 42001:2023 is the international standard for AI Management Systems. It has not yet been cited as a harmonised standard under Art.40, but it is currently the most widely used framework for structuring and evidencing Art.17 compliance while AI Act harmonised standards remain in development.
ISO/IEC 42001 × Art.17 mapping:
| ISO/IEC 42001 Clause | Art.17 Element | Description |
|---|---|---|
| Clause 4 (Context) | (a) Compliance strategy | Organisational context and stakeholder requirements |
| Clause 5 (Leadership) | (a), (d) | Leadership commitment, responsible AI policy |
| Clause 6 (Planning) | (b) Risk plan | Risk and opportunity management, AI objectives |
| Clause 7 (Support) | (d) Resources | Resources, competence, awareness, communication |
| Clause 8 (Operation) | (b), (c), (e) | Operational planning, design control, PMM |
| Clause 9 (Performance evaluation) | (e), (g) | Monitoring, internal audit, management review |
| Clause 10 (Improvement) | (h) Recalibration | Nonconformity, corrective action, continual improvement |
| Annex A (Controls) | All elements | 38 AI-specific controls (data governance, transparency, etc.) |
Certification path: ISO/IEC 42001 certification by an accredited certification body can serve as evidence for Art.17 compliance in a conformity assessment under Art.43. For Annex III systems going through the Art.43(1) internal control path, ISO/IEC 42001 provides a structured evidence trail.
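The clause mapping can be condensed into data to estimate which Art.17 elements a given ISO/IEC 42001 audit scope would evidence. A sketch mirroring the table above (Annex A controls omitted; the clause keys and helper name are my own shorthand):

```python
# Condensed form of the mapping table: clause -> Art.17 elements.
ISO42001_TO_ART17: dict[str, set[str]] = {
    "clause_4": {"a"},
    "clause_5": {"a", "d"},
    "clause_6": {"b"},
    "clause_7": {"d"},
    "clause_8": {"b", "c", "e"},
    "clause_9": {"e", "g"},
    "clause_10": {"h"},
}

def art17_elements_evidenced(certified_clauses: set[str]) -> set[str]:
    """Art.17 elements an ISO/IEC 42001 audit scope would provide evidence for."""
    covered: set[str] = set()
    for clause in certified_clauses:
        covered |= ISO42001_TO_ART17.get(clause, set())
    return covered

scope = {"clause_4", "clause_6", "clause_8", "clause_10"}
print(sorted(art17_elements_evidenced(scope)))  # ['a', 'b', 'c', 'e', 'h']
```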
ISO 9001 Integration
Most enterprise providers already have an ISO 9001 Quality Management System. Art.17 does not require abandoning ISO 9001 — it requires extending it with AI-specific elements.
ISO 9001 × Art.17 gap analysis:
| ISO 9001 Element | Art.17 Coverage | Gap |
|---|---|---|
| Clause 6.1 (Risk management) | (b) partial | ISO 9001 lacks AI-specific risk taxonomy (foreseeable misuse, fundamental rights impact) |
| Clause 7.5 (Documented information) | (a), (c), (g) partial | No AI documentation requirements (Annex IV) |
| Clause 8.3 (Design development) | (c) partial | No AI validation requirements (Art.15 accuracy/robustness) |
| Clause 9.1 (Monitoring) | (e) partial | No PMM plan format or incident reporting timelines |
| Clause 10.2 (Nonconformity) | (g), (h) partial | No Art.73 serious incident reporting procedure |
Integration approach: Extend ISO 9001 Clause 6.1 with Art.9 risk taxonomy, add AI-specific design review gates to Clause 8.3, extend Clause 9.1 monitoring with PMM plan, and add Art.73 reporting procedure to Clause 10.2. Document this as an "AI Act Extension" to the existing QMS scope statement.
SMB Proportionality
Art.17(2) explicitly addresses proportionality for SMBs: implementation of the QMS aspects must be proportionate to the size of the provider's organisation, while still respecting the degree of rigour and the level of protection required for high-risk AI systems to comply with the Regulation.
Proportionality in practice:
| Provider Size | QMS Approach | Documentation Minimum |
|---|---|---|
| Solo developer / micro-enterprise | Lightweight documented procedures; one person can hold multiple roles | 8-element checklist + incident log |
| SMB (10-250 persons) | Dedicated compliance role; integrated into existing project management | Full Art.17 procedure documentation + ISO/IEC 42001 self-assessment |
| Large enterprise | Formal AI governance function; ISO/IEC 42001 certification | Full ISO/IEC 42001 certification + conformity assessment body audit |
What proportionality does NOT permit:
- Omitting any of the 8 mandatory elements (all 8 are required regardless of size)
- Informal/verbal QMS (written documentation required by Art.17(1))
- Shorter retention periods than Art.18(1) mandates (10 years)
QMS Documentation and the 10-Year Retention Requirement
Art.18(1) requires providers to keep QMS documentation at the disposal of the national competent authorities for 10 years after the high-risk AI system is placed on the market or put into service. This retention obligation directly intersects with CLOUD Act jurisdiction.
CLOUD Act × Art.17 documentation risk:
If QMS documentation (compliance strategy, risk plans, incident records, recalibration logs) is stored on a US-headquartered cloud provider, it is potentially subject to CLOUD Act compelled disclosure to US authorities — regardless of where the data physically resides.
Art.17 documentation categories subject to 10-year retention:
- Compliance strategy documents
- Risk management plans and risk assessment records
- Design verification and validation reports
- Training and competence records
- Post-market monitoring plans and incident registers
- Communication records with competent authorities
- Recalibration decisions and change records
EU-native storage argument: Storing 10 years of QMS documentation on an EU-headquartered provider (operating exclusively under GDPR and AI Act jurisdiction, without CLOUD Act exposure) eliminates the dual-compellability risk and satisfies Art.17 documentation requirements under a single regulatory regime.
QMS × Post-Market Monitoring (Art.72) Deep Dive
Art.17(1)(e) explicitly includes Art.72 PMM as a QMS element, creating a direct governance link.
Art.72 PMM obligations that Art.17 QMS must govern:
| Art.72 Requirement | Art.17 QMS Governance |
|---|---|
| Active PMM system proportionate to risk | PMM plan document with monitoring frequency and scope |
| Continuous performance monitoring | Metrics collection procedure + accuracy drift monitoring |
| Reporting to distributors and importers | Communication SOP (Art.17(1)(f)) |
| Serious incident detection | Incident register (Art.17(1)(g)) + reporting thresholds |
| Corrective action upon PMM finding | Recalibration procedure (Art.17(1)(h)) |
```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional
from enum import Enum

class QMSDocumentStatus(Enum):
    DRAFT = "draft"
    UNDER_REVIEW = "under_review"
    APPROVED = "approved"
    SUPERSEDED = "superseded"

@dataclass
class QMSDocumentRegister:
    """Art.17 QMS Document Register — tracks all 8-element documentation."""
    document_id: str
    title: str
    art17_element: str  # "a" through "h"
    version: str
    status: QMSDocumentStatus
    owner: str
    created_at: datetime
    last_reviewed: datetime
    next_review_due: datetime
    storage_location: str  # Should be EU-jurisdictioned storage
    retention_until: Optional[datetime] = None  # Art.18(1): 10 years from market placement

    def is_eu_jurisdictioned(self) -> bool:
        """Checks if the storage location is in EU-native infrastructure."""
        eu_indicators = ["sota.io", "eu-west", "eu-central", "de-", "fr-", "nl-"]
        return any(indicator in self.storage_location.lower() for indicator in eu_indicators)

    def is_retention_compliant(self, system_market_date: datetime) -> bool:
        """Art.18(1): 10-year retention from the market placement date."""
        # Naive year arithmetic; a Feb 29 market date would need special handling.
        required_retention = system_market_date.replace(
            year=system_market_date.year + 10
        )
        if self.retention_until is None:
            return False
        return self.retention_until >= required_retention

class QMSComplianceChecker:
    """Art.17 QMS compliance checker — validates all 8 mandatory elements."""

    REQUIRED_ELEMENTS = {
        "a": "Compliance strategy including modification management",
        "b": "Risk management plan and design examination procedures",
        "c": "Design verification and validation procedures",
        "d": "Resource and competence management procedures",
        "e": "Post-market monitoring plan (Art.72)",
        "f": "Communication procedures for incidents and noncompliance",
        "g": "Incident recording and reporting system",
        "h": "Recalibration, update, and improvement procedures",
    }

    def __init__(self, document_register: list[QMSDocumentRegister]):
        self.documents = document_register

    def check_element_coverage(self) -> dict[str, bool]:
        """Returns coverage status for each of the 8 mandatory elements."""
        covered = {element: False for element in self.REQUIRED_ELEMENTS}
        for doc in self.documents:
            if (doc.art17_element in covered and
                    doc.status == QMSDocumentStatus.APPROVED):
                covered[doc.art17_element] = True
        return covered

    def check_documentation_completeness(self) -> dict:
        """Full Art.17 compliance audit result."""
        element_coverage = self.check_element_coverage()
        missing_elements = [
            f"({e}) {desc}"
            for e, desc in self.REQUIRED_ELEMENTS.items()
            if not element_coverage[e]
        ]
        eu_docs = sum(1 for d in self.documents if d.is_eu_jurisdictioned())
        return {
            "elements_covered": sum(element_coverage.values()),
            "elements_required": 8,
            "missing_elements": missing_elements,
            "is_compliant": len(missing_elements) == 0,
            "eu_jurisdictioned_documents": eu_docs,
            "total_documents": len(self.documents),
            "cloud_act_risk": eu_docs < len(self.documents),
        }

    def generate_compliance_report(self) -> str:
        """Produces a human-readable Art.17 compliance status report."""
        result = self.check_documentation_completeness()
        lines = [
            "Art.17 QMS Compliance Report",
            f"Elements covered: {result['elements_covered']}/8",
            f"Overall compliant: {result['is_compliant']}",
        ]
        if result['missing_elements']:
            lines.append("Missing elements:")
            for m in result['missing_elements']:
                lines.append(f"  - {m}")
        if result['cloud_act_risk']:
            lines.append(
                f"WARNING: {result['total_documents'] - result['eu_jurisdictioned_documents']} "
                f"document(s) stored outside EU jurisdiction (CLOUD Act risk)"
            )
        return "\n".join(lines)
```
Art.17 Compliance Checklist (40 Items)
Scope and Organisation (10 items)
- ☐ Written QMS policy document exists and is approved by senior management
- ☐ QMS scope statement identifies all in-scope high-risk AI systems
- ☐ QMS owner (responsible AI officer or equivalent) is designated
- ☐ QMS is documented "in systematic and orderly manner" (not ad-hoc files)
- ☐ QMS covers all Annex III categories the organisation operates in
- ☐ SMB proportionality assessment documented where applicable
- ☐ Third-country providers: QMS covers EU obligations under Art.2(1)(c)
- ☐ Authorised representative (Art.22) notified of QMS existence
- ☐ QMS version control in place (document management system)
- ☐ QMS review cycle defined (minimum annual review recommended)
Element (a) Compliance Strategy (5 items)
- ☐ Compliance strategy document covers all applicable Annex III obligations
- ☐ Change management procedure defines "substantial modification" triggers for Art.43 re-assessment
- ☐ Role assignments for compliance decisions documented
- ☐ Hosting jurisdiction strategy documented (EU-native = single regime)
- ☐ GPAI model obligations addressed if organisation is a GPAI provider (Art.51-56)
Element (b) Risk Management (5 items)
- ☐ Risk management procedure aligned with Art.9 requirements
- ☐ Design examination procedure specifies who reviews architectural decisions
- ☐ Risk assessment schedule defined (frequency, triggers for ad-hoc assessment)
- ☐ Fundamental rights impact assessment procedure exists
- ☐ Risk acceptance criteria and escalation thresholds documented
Element (c) Design Verification (4 items)
- ☐ Design review gates at defined development milestones
- ☐ Validation test plans cover Art.15 accuracy and robustness requirements
- ☐ Traceability matrix: requirements → implementation → test evidence
- ☐ Verification artefacts (test reports, formal proofs) stored and version-controlled
Element (d) Resource Management (3 items)
- ☐ Competence matrix: roles × AI Act knowledge requirements
- ☐ Training records for compliance-relevant staff maintained
- ☐ Supplier/sub-processor AI Act obligations documented contractually
Element (e) Post-Market Monitoring (4 items)
- ☐ PMM plan document aligned with Art.72 requirements
- ☐ Performance drift monitoring defined (accuracy vs Art.15 baseline)
- ☐ Feedback channels from deployers and end-users documented
- ☐ PMM findings feed into Art.9 risk register (documented loop)
Element (f)-(g) Communication and Incident Reporting (5 items)
- ☐ Art.73 reporting procedure with timelines (15 days serious incident / 10 days death / 2 days widespread infringement) documented
- ☐ Internal escalation chain for serious incidents documented
- ☐ National MSA contact registry maintained per Member State of operation
- ☐ Incident register (Art.12-compliant) in place and operational
- ☐ CAPA procedure documented and linked to incident register
Element (h) Recalibration (2 items)
- ☐ Recalibration trigger conditions documented (performance drift thresholds)
- ☐ Validation gate procedure before re-deployment after recalibration
Documentation Retention (2 items)
- ☐ QMS documentation retention period set to minimum 10 years (Art.18(1))
- ☐ Documentation stored in EU-jurisdictioned infrastructure (CLOUD Act risk mitigation)
Art.17 × Cross-Article Compliance Matrix
| Compliance Action | Art.17 Element | Supporting Article |
|---|---|---|
| Compliance strategy | (a) | Art.43 (conformity assessment), Art.6 (modification triggers) |
| Risk management plan | (b) | Art.9 (risk management system) |
| Design verification | (c) | Art.15 (accuracy/robustness), Art.10 (training data) |
| Resource management | (d) | Art.26 (deployer obligations), Art.23 (importer obligations) |
| Post-market monitoring | (e) | Art.72 (PMM), Art.73 (serious incident reporting) |
| Communication procedures | (f) | Art.73, Art.86 (right to explanation), NIS2 Art.23 |
| Incident recording | (g) | Art.12 (logging), Art.73, GDPR Art.33 |
| Recalibration | (h) | Art.15 (ongoing accuracy obligation), Art.43 (re-assessment) |
| QMS documentation | All | Art.11/Annex IV (technical documentation), Art.18(1) (10-year retention) |
| EU-native hosting | All | CLOUD Act risk mitigation, GDPR Art.46 |
Enforcement Exposure
Art.17 violations are sanctionable under Art.99 of the AI Act:
- Art.99(4): Non-compliance with provider obligations (including the Art.16/Art.17 QMS duty): up to €15 million or 3% of global annual turnover (whichever is higher)
- Art.99(5): Supply of incorrect, incomplete or misleading information to notified bodies or national competent authorities: up to €7.5 million or 1% of global annual turnover (whichever is higher)
Notably, Art.17 non-compliance sits in the €15M/3% tier, the same tier as accuracy obligations (Art.15) and logging obligations (Art.12). This reflects the EU legislator's view that process governance failures are as serious as technical compliance failures.
Enforcement pattern: Market surveillance authorities conducting conformity assessments under Art.74 will typically request the Art.17 QMS documentation as primary evidence. A missing or incomplete QMS is likely to trigger a full investigation regardless of whether the AI system itself is technically compliant.
What to Do Now
If you're a high-risk AI provider (August 2026 deadline):
- Gap assessment: Map your existing QMS (or project management processes) against the 8 mandatory Art.17 elements. Use the 40-item checklist above.
- ISO/IEC 42001 alignment: Download the standard and map Art.17 requirements to the clause structure. Annex A controls provide 38 AI-specific implementation patterns.
- Document the 8 elements: Even if processes exist informally, Art.17 requires written documentation. Start with elements (a) compliance strategy and (g) incident register — highest enforcement priority.
- Jurisdiction audit: Inventory where QMS documentation is stored. If any artefacts are on US-headquartered providers, assess CLOUD Act risk under the 10-year retention requirement.
- PMM plan: Draft your Art.72 post-market monitoring plan and integrate it into element (e) of the QMS.
If you're a deployer (Art.26 obligations):
Deployers have separate Art.17-adjacent obligations — specifically, they must implement the Art.14 human oversight measures and follow the Art.13 instructions for use. If you are also a provider (e.g., you fine-tune a foundation model for a specific Annex III use case and place it on the market under your own name, per the Art.25 value-chain rules), Art.17 applies to you as provider.
See Also
- EU AI Act Art.9 Formal Verification: Risk Management for High-Risk AI
- EU AI Act Art.11 Technical Documentation: Annex IV Deep Dive
- EU AI Act Art.12 Logging & Record-Keeping: Developer Guide
- EU AI Act Art.15 Accuracy, Robustness & Cybersecurity
- EU AI Act 2026 Conformity Assessment & PaaS Developer Guide