EU AI Act Art.100: Penalties for Union Institutions — EDPS Enforcement, Fine Structure, and Procurement Developer Guide (2026)
The EU AI Act establishes three distinct enforcement tracks for administrative penalties: Article 99 for private-sector operators supervised by national market surveillance authorities, Article 101 for GPAI model providers supervised by the AI Office, and Article 100 for EU institutions, bodies, offices, and agencies supervised by the European Data Protection Supervisor (EDPS).
Art.100 matters for two categories of stakeholders. First, it directly affects the 50+ EU institutions and agencies — the Commission, Parliament, Council, Court of Justice, EMA, ECB, FRONTEX, Europol, Eurojust, and dozens more — that operate or deploy AI systems in their daily functions. Second, it structurally affects private-sector AI vendors and developers who sell or deploy AI systems to those institutions, because the EU institution is the deployer subject to EDPS supervision and the vendor's technical choices determine whether that deployer can demonstrate compliance.
This guide covers Art.100's enforcement architecture, fine structure, EDPS powers, procurement implications for AI vendors, the interaction with Art.110's transitional provisions, and how CLOUD Act exposure arises even in purely EU-institutional AI deployments.
What Article 100 Actually Says
Article 100 is structurally concise because it primarily allocates enforcement competence rather than creating new substantive obligations. The substantive obligations (high-risk AI requirements, transparency, human oversight, etc.) are the same as for private operators — the difference is who enforces them.
Art.100(1) — EDPS Fining Power:
Article 100 empowers the European Data Protection Supervisor to impose administrative fines on EU institutions, bodies, offices, and agencies within the AI Act's scope. The underlying supervisory competence comes from Art.74(9), which designates the EDPS as the market surveillance authority for Union institutions. This mirrors the EDPS's existing mandate under Regulation (EU) 2018/1725 (the EU's "GDPR equivalent" for Union institutions), extended to AI supervision.
The EDPS competence covers EU institutions acting as:
- Deployers of high-risk AI systems (the most common role)
- Providers of AI systems developed internally or commissioned exclusively for EU institution use
- Operators of prohibited AI practices under Art.5
Where an EU institution procures an off-the-shelf AI system from a private-sector provider, that private provider remains subject to Art.99 NCA enforcement for provider obligations. The EU institution deployer is subject to Art.100 EDPS enforcement for deployer obligations.
Art.100(2)–(3) — Fine Structure:
Unlike Art.99, the Art.100 ceilings are fixed amounts rather than turnover-linked:
| Tier | Violation | Maximum Fine |
|---|---|---|
| Tier 1 | Art.5 prohibited practices (Art.100(2)) | €1,500,000 |
| Tier 2 | Any other AI Act obligation (Art.100(3)) | €750,000 |
There is no percentage-based construct and no separate tier for supplying misleading information to the supervisor. EU institutions operate on budgetary appropriations rather than commercial revenue, and a turnover-style fine would largely shuffle public money between Union budget lines; fines collected under Art.100 flow back into the general budget of the Union. The ceilings are therefore far below the €35,000,000 / 7% exposure private operators face under Art.99. For institutions, the sharper deterrents are corrective orders, publication of adverse decisions, and reputational cost. Art.100(1) also lists the factors the EDPS must weigh when setting an amount, including the nature, gravity, and duration of the infringement and the institution's cooperation with the EDPS.
Cooperation with the AI Board and National Authorities:
The EDPS participates in the European AI Board (Art.65) and coordinates with national competent authorities in cases involving AI systems that EU institutions use jointly with private operators or that have cross-institutional implications. This cooperation mechanism prevents regulatory gaps when an AI system spans both EU-institutional and private-sector deployment.
Judicial Review — CJEU Jurisdiction:
EDPS decisions imposing fines are subject to review by the Court of Justice of the EU under the Treaties (Art.263 TFEU), the same jurisdictional framework as other EDPS enforcement actions. Unlike Art.99 fines (which go through national administrative and judicial procedures with country-by-country variation), fines on EU institutions have a single, unified appellate path.
The Art.100 vs Art.99 vs Art.101 Enforcement Triangle
Understanding which article applies requires mapping the operator type and AI system category:
┌─────────────────────────────────────────────────────────────────┐
│ WHO IS BEING FINED? │
├─────────────────────────────────────────────────────────────────┤
│ Private-sector operator (deployer or provider) → Art.99 NCA │
│ EU institution / body / office / agency → Art.100 EDPS │
│ GPAI model provider (any entity) → Art.101 AI Office │
└─────────────────────────────────────────────────────────────────┘
The practical consequences of this split:
For private AI providers selling to EU institutions: The provider's obligations (Art.9 risk management, Art.11 technical documentation, Art.13 transparency, Art.17 QMS) are enforced by the NCA of the provider's establishment (Art.99). The EU institution deployer's obligations (Art.26 deployer duties, Art.27 fundamental rights impact assessment for Annex III deployments) are enforced by the EDPS (Art.100).
For GPAI model providers whose models EU institutions use: The GPAI provider faces Art.101 AI Office enforcement for Chapter V violations regardless of the downstream deployer's identity. If the EU institution deploys a GPAI-based system as a high-risk system, the institution faces EDPS enforcement for Annex III deployer obligations, while the GPAI foundation model provider faces AI Office enforcement for Art.53–55 obligations.
For EU institutions building AI internally: If an EU institution builds and deploys an AI system for internal or inter-institutional use without placing it on the market, the institution is both provider and deployer — all obligations (both provider and deployer tracks) are enforceable by the EDPS alone.
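The routing logic above can be sketched as a small helper. This is a minimal illustration; the enum and the route strings are our labels, not terms from the Act:

```python
from enum import Enum

class OperatorKind(Enum):
    PRIVATE = "private_operator"
    UNION_INSTITUTION = "union_institution"

def enforcement_route(operator: OperatorKind, is_gpai_provider: bool = False) -> list[str]:
    """Map an operator to the enforcement track(s) it faces under Arts. 99-101."""
    routes = []
    if is_gpai_provider:
        # Chapter V model obligations are enforced centrally, whoever deploys downstream
        routes.append("Art.101 - AI Office (GPAI model obligations)")
    if operator is OperatorKind.UNION_INSTITUTION:
        routes.append("Art.100 - EDPS (institutional provider/deployer obligations)")
    else:
        routes.append("Art.99 - national market surveillance authority")
    return routes
```

Note that a GPAI provider whose model an EU institution deploys yields two routes: the model provider answers to the AI Office, while the institutional deployer answers to the EDPS.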
Which EU Institutions Are Covered
Art.100 applies to all Union institutions, bodies, offices, and agencies within the meaning of the Treaties and EU law. The primary entities include:
Core Treaty Institutions:
- European Commission (including Directorate-Generals and Executive Agencies)
- European Parliament
- Council of the European Union
- Court of Justice of the EU (CJEU)
- European Central Bank (ECB) — has significant AI use in supervisory functions
- European Court of Auditors
- European External Action Service (EEAS)
Regulatory and Supervisory Agencies (EU Agencies):
- European Banking Authority (EBA) — regulates AI in banking, also an Art.100 subject
- European Securities and Markets Authority (ESMA) — AI use in market surveillance
- European Medicines Agency (EMA) — AI in regulatory review and pharmacovigilance
- European Union Aviation Safety Agency (EASA) — AI in airworthiness certification
- European Agency for Fundamental Rights (FRA)
- Europol — AI in law enforcement analytics (intersection with Art.5 biometric rules)
- Eurojust
- FRONTEX — AI in border management (intersection with Art.5 real-time RBI rules)
- European Data Protection Board (EDPB) — subject to EDPS on AI matters
- EUIPO, EFSA, ERA, ACER, and 40+ other agencies
Interinstitutional Bodies:
- Publications Office
- European Personnel Selection Office (EPSO) — AI in recruitment processes
- European School of Administration (EUSA)
The breadth is important: agencies that are themselves regulators of AI or other technologies are also subjects of Art.100 enforcement when they deploy AI in their own operations.
EDPS Enforcement: Powers and Procedure
The EDPS's AI Act enforcement powers under Art.100 parallel the NCA powers under Art.58 and Art.64, adapted for the interinstitutional context:
Investigation Powers:
- Access to AI system technical documentation, logs, and source code
- On-site inspections of EU institution premises
- Information requests to institution staff and contractors
- Access to training data documentation and quality management records
Corrective Measures:
- Orders to bring AI systems into compliance within specified timeframes
- Temporary or permanent restrictions on AI system use
- Orders to withdraw AI systems from service
- Imposition of administrative fines up to the Art.100(2)–(3) ceilings
- Publication of enforcement decisions (reputational consequences)
Procedural Protections:
- Right to be heard before any enforcement decision
- Access to EDPS investigation file (subject to confidentiality limitations)
- CJEU review of all EDPS decisions imposing fines or corrective measures
EDPS Enforcement Style in Practice:
The EDPS has historically taken a more advisory and less aggressive enforcement posture compared to some national DPAs — focusing on guidance, recommendations, and prior consultation rather than fines. However, the AI Act creates formal fine-imposition powers that are new for the EDPS, and EU institutions — particularly those deploying high-risk AI systems affecting individuals — should not assume EDPS enforcement will remain light-touch indefinitely.
The EDPS issued its opinion on the AI Act proposal in 2021, advocating for stronger restrictions, and has been actively building AI supervisory capacity since the regulation's entry into force.
EU Procurement Implications for AI Vendors
The single most important Art.100 consequence for private-sector AI developers is that EU institutions deploying your AI system are subject to EDPS enforcement, not NCA enforcement — and that shifts what they need from you contractually and technically.
What EU Institution Deployers Need From AI Providers
EU institutions acting as deployers of high-risk AI systems under Annex III must fulfill Art.26 deployer obligations. Meeting those obligations requires specific technical deliverables from the provider:
Art.26(1) — Appropriate use: EU institutions must use your AI system only as specified in the instructions for use. Your instructions for use documentation must be precise, complete, and scoped to the permitted deployment contexts.
Art.26(2) — Human oversight: EU institutions must implement the human oversight measures you specified in technical documentation. Your system must make those oversight measures technically implementable — not just documented on paper.
Art.26(5) — Monitoring: EU institutions must monitor AI system operation on the basis of the instructions for use, detect anomalies, and report serious incidents (Art.73). Your system needs logging and monitoring interfaces that the institution can use.
Art.27 — Fundamental Rights Impact Assessment (FRIA): EU institutions, as bodies governed by public law, must conduct a FRIA before first deploying most Annex III high-risk systems. Your technical documentation must provide enough information about data used, potential biases, and known limitations to enable a meaningful FRIA.
Art.26(6) — Record-keeping: EU institutions must retain the logs automatically generated by high-risk AI systems. Your system must be designed to produce auditable logs at the required level of detail.
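One way a vendor might track these deliverables internally is a simple duty-to-deliverable map. The duty and deliverable names below are illustrative labels, not terms from the Act:

```python
# Illustrative mapping of Art.26 deployer duties to the provider deliverables
# that make them satisfiable. All identifiers are hypothetical labels.
ART26_PROVIDER_DELIVERABLES = {
    "appropriate_use": ["instructions_for_use"],
    "human_oversight": ["oversight_measures_spec", "override_interface_docs"],
    "monitoring": ["logging_api_docs", "incident_notification_procedure"],
    "fria_inputs": ["training_data_summary", "known_limitations", "bias_analysis"],
    "record_of_use": ["audit_log_schema"],
}

def deliverable_gaps(received: set[str]) -> dict[str, list[str]]:
    """Return, per deployer duty, the provider deliverables still missing."""
    gaps = {}
    for duty, needed in ART26_PROVIDER_DELIVERABLES.items():
        missing = [d for d in needed if d not in received]
        if missing:
            gaps[duty] = missing
    return gaps
```

A deployer-side compliance team can run this against each vendor's documentation package to see which Art.26 duties it cannot yet discharge.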
Contractual Architecture for EU Institution AI Procurement
The EDPS is unlikely to accept "the vendor didn't give us what we needed" as a defense in an enforcement proceeding. EU institutions will increasingly require AI vendors to contractually guarantee:
- Technical documentation completeness: Conformance with Annex IV (and Annex XI for GPAI components)
- Ongoing documentation updates: Provider obligations to deliver updated documentation after significant changes
- EDPS audit support: Cooperation obligations if the EDPS initiates an investigation into the deployed system
- Incident notification: Provider-to-deployer notification procedures that meet Art.73 timelines
- FRIA information package: Pre-structured information sets enabling the institution to conduct its Art.27 fundamental rights impact assessment
AI vendors targeting the EU institutional market should build these deliverables into their standard enterprise documentation packages — institutions that cannot demonstrate compliance will not be able to renew contracts.
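A procurement team could screen draft contracts against this clause set with a few lines of Python; the clause identifiers are hypothetical labels for the guarantees listed above:

```python
# Hypothetical clause identifiers for the contractual guarantees listed above.
REQUIRED_CLAUSES = {
    "annex_iv_documentation_guarantee",
    "documentation_update_obligation",
    "edps_audit_cooperation",
    "art73_incident_notification_sla",
    "fria_information_package",
}

def contract_readiness(clauses: set[str]) -> tuple[bool, set[str]]:
    """Return (ready, missing): ready is True only if every required clause is present."""
    missing = REQUIRED_CLAUSES - clauses
    return (not missing, missing)
```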
Art.100 and the Art.111(2) Transitional Period
Art.111(2) governs pre-existing systems: high-risk AI systems placed on the market or put into service before the Regulation's full application date of August 2, 2026 fall within its scope only once they undergo a significant change in their design. For high-risk systems intended to be used by public authorities, including EU institutions, bodies, offices, and agencies, Art.111(2) adds a backstop: providers and deployers must take the necessary steps to comply by August 2, 2030.
There is no institution-specific transitional article. EU institutions rely on the same Art.111(2) regime as other public authorities; the long runway to 2030 reflects the length of interinstitutional procurement and budgeting cycles.
What the Transition Does and Does Not Cover:
The Art.111(2) transition delays compliance obligations for existing systems — it does not exempt EU institutions from:
- Art.5 prohibited practice prohibitions (in force February 2, 2025)
- EDPS oversight and investigative powers during the transition period
- Documentation obligations for new AI system procurements and deployments after August 2, 2026
If an EU institution initiates a new AI procurement after August 2, 2026, the legacy transition does not apply — the procured system must comply with all applicable requirements from day one.
Significant Design Changes During the Transition:
The legacy grace ends when an AI system undergoes a significant change in its design — a change to design, purpose, or behavior that goes beyond ordinary maintenance and updates. EU institutions need clear internal governance to distinguish routine maintenance from design changes throughout the transition window.
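The maintenance-versus-modification distinction above can be sketched as a small decision helper. The function is illustrative: the compliance deadline is passed in rather than hard-coded, since the applicable end date depends on which transitional provision governs the system:

```python
from datetime import date
from typing import Optional

def legacy_grace_applies(
    deployment_date: date,
    significantly_modified: bool,
    compliance_deadline: date,
    application_date: date = date(2026, 8, 2),  # full application of the AI Act
    today: Optional[date] = None,
) -> bool:
    """True while a pre-existing system still benefits from the legacy
    transitional grace: it predates full application, has not been
    significantly modified, and the compliance deadline has not yet passed."""
    today = today or date.today()
    return (
        deployment_date < application_date
        and not significantly_modified
        and today < compliance_deadline
    )
```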
CLOUD Act Exposure in EU Institutional AI Deployments
EU institutions themselves are EU legal entities operating under EU law — they are not directly subject to US jurisdiction claims under the CLOUD Act. However, the infrastructure contractors, cloud providers, and AI platform vendors serving EU institutions frequently are.
The Contractor Exposure Chain:
When an EU institution deploys an AI system hosted on infrastructure provided by a US-incorporated cloud vendor (AWS, Azure, Google Cloud), that vendor is subject to US government access requests under the CLOUD Act for data processed on their infrastructure — including logs, model outputs, training data derivatives, and system documentation generated during EU institution AI deployments.
The EDPS has long taken the position that EU institutions must ensure their data processing meets the data protection requirements of Regulation 2018/1725. AI Act compliance adds a parallel dimension: can an EU institution demonstrate Art.11 technical documentation integrity and Art.12 logging accuracy when the underlying infrastructure is subject to undisclosed US government access requests?
The EU-Native Infrastructure Advantage for EU Institution Procurement:
For AI systems processing sensitive data — Europol law enforcement data, EMA clinical trial information, FRONTEX border management data, ECB supervisory data — EU institutions face heightened scrutiny on infrastructure sovereignty. AI vendors who can demonstrate EU-incorporated infrastructure without US parent entities (and therefore without CLOUD Act exposure) have a compliance architecture advantage in EU institutional procurement:
- No foreign intelligence access risk to AI system logs and outputs
- No jurisdiction conflict between EDPS supervision requirements and US government access rights
- Simplified FRIA: fundamental rights impact assessments do not need to account for foreign surveillance exposure
- Stronger EDPS audit readiness: technical documentation integrity not subject to foreign-law interference
For AI vendors building EU institutional market strategies, EU-native deployment infrastructure is increasingly a procurement requirement, not just a differentiator.
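The contractor exposure chain can be screened with a simplified heuristic: incorporation jurisdictions of each vendor and its ultimate parent are the inputs. This is a sketch for inventory purposes, not a legal analysis of CLOUD Act reach:

```python
from dataclasses import dataclass

@dataclass
class InfraVendor:
    name: str
    incorporation: str    # jurisdiction of the contracting entity, e.g. "EU", "US"
    ultimate_parent: str  # jurisdiction of the ultimate parent entity

def cloud_act_exposed(vendor: InfraVendor) -> bool:
    """Simplified screen: a vendor is treated as within CLOUD Act reach if it,
    or its ultimate parent, is US-incorporated."""
    return "US" in (vendor.incorporation, vendor.ultimate_parent)

def chain_exposure(chain: list[InfraVendor]) -> list[str]:
    """Names of every vendor in the hosting chain flagged for CLOUD Act exposure."""
    return [v.name for v in chain if cloud_act_exposed(v)]
```

An EU subsidiary of a US parent is flagged here, which matches the concern in the text: corporate structure, not server location, drives exposure.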
Python Tooling: Art100RiskAssessment
from dataclasses import dataclass
from enum import Enum
from typing import Optional
from datetime import date


class InstitutionType(Enum):
    TREATY_INSTITUTION = "treaty_institution"
    EU_AGENCY = "eu_agency"
    INTERINSTITUTIONAL_BODY = "interinstitutional_body"


class AISystemRole(Enum):
    DEPLOYER = "deployer"                    # procured from an external provider
    PROVIDER_DEPLOYER = "provider_deployer"  # built and deployed internally
    PROVIDER_ONLY = "provider_only"          # built for other institutions


class ComplianceStatus(Enum):
    COMPLIANT = "compliant"
    IN_PROGRESS = "in_progress"
    NON_COMPLIANT = "non_compliant"
    TRANSITION_PERIOD = "transition_period"  # Art.111(2) legacy grace
    NOT_APPLICABLE = "not_applicable"


# Art.100 fixed fine ceilings for Union institutions (not turnover-linked)
MAX_FINE_PROHIBITED = 1_500_000   # Art.100(2): Art.5 prohibited practices
MAX_FINE_OTHER = 750_000          # Art.100(3): any other AI Act obligation

AI_ACT_FULL_APPLICATION = date(2026, 8, 2)
LEGACY_COMPLIANCE_DEADLINE = date(2030, 8, 2)  # Art.111(2) public-authority backstop


@dataclass
class Art100RiskAssessment:
    institution_name: str
    institution_type: InstitutionType
    ai_system_name: str
    role: AISystemRole
    is_high_risk_annex_iii: bool
    deployment_date: Optional[date]
    is_prohibited_practice: bool = False
    # Compliance dimensions
    technical_documentation: ComplianceStatus = ComplianceStatus.IN_PROGRESS
    human_oversight: ComplianceStatus = ComplianceStatus.IN_PROGRESS
    fria_completed: ComplianceStatus = ComplianceStatus.IN_PROGRESS
    logging_monitoring: ComplianceStatus = ComplianceStatus.IN_PROGRESS
    eu_database_registered: ComplianceStatus = ComplianceStatus.IN_PROGRESS
    incident_reporting_ready: ComplianceStatus = ComplianceStatus.IN_PROGRESS
    cloud_jurisdiction: str = "EU-native"

    def is_in_legacy_transition(self) -> bool:
        """Art.111(2): a system in service before full application keeps its
        grace until the public-authority compliance deadline."""
        if self.deployment_date is None:
            return False
        return (
            self.deployment_date < AI_ACT_FULL_APPLICATION
            and date.today() < LEGACY_COMPLIANCE_DEADLINE
        )

    def days_until_transition_end(self) -> int:
        return (LEGACY_COMPLIANCE_DEADLINE - date.today()).days

    def max_fine(self) -> int:
        """Art.100(2)-(3): fixed ceilings, not a percentage of budget."""
        return MAX_FINE_PROHIBITED if self.is_prohibited_practice else MAX_FINE_OTHER

    def enforcement_authority(self) -> str:
        return "EDPS (European Data Protection Supervisor)"

    def appeals_jurisdiction(self) -> str:
        return "Court of Justice of the EU (CJEU)"

    def cloud_act_risk(self) -> str:
        if "US" in self.cloud_jurisdiction or "us-" in self.cloud_jurisdiction.lower():
            return "HIGH — infrastructure subject to US CLOUD Act access requests"
        if "EU-native" in self.cloud_jurisdiction:
            return "LOW — EU-incorporated infrastructure, no CLOUD Act exposure"
        return "MEDIUM — review cloud provider corporate structure"

    def compliance_score(self) -> dict:
        dimensions = {
            "technical_documentation": self.technical_documentation,
            "human_oversight": self.human_oversight,
            "fria_completed": self.fria_completed,
            "logging_monitoring": self.logging_monitoring,
            "eu_database_registered": self.eu_database_registered,
            "incident_reporting_ready": self.incident_reporting_ready,
        }
        compliant = sum(1 for s in dimensions.values() if s == ComplianceStatus.COMPLIANT)
        return {
            "score": f"{compliant}/{len(dimensions)}",
            "percentage": round(compliant / len(dimensions) * 100),
            # Report which dimensions have gaps, not just their status values
            "gaps": [name for name, s in dimensions.items() if s != ComplianceStatus.COMPLIANT],
        }

    def generate_edps_readiness_report(self) -> str:
        score = self.compliance_score()
        transition_note = ""
        if self.is_in_legacy_transition():
            transition_note = (
                f"\n⏳ Art.111(2) TRANSITION: {self.days_until_transition_end()} days "
                f"remaining (ends {LEGACY_COMPLIANCE_DEADLINE})"
            )
        return f"""
=== Art.100 EDPS Readiness Report: {self.institution_name} ===
AI System: {self.ai_system_name}
Role: {self.role.value}
Enforcement Authority: {self.enforcement_authority()}
Appeals: {self.appeals_jurisdiction()}
COMPLIANCE SCORE: {score['score']} ({score['percentage']}%)
Gaps: {', '.join(score['gaps']) if score['gaps'] else 'None'}
{transition_note}
FINE EXPOSURE (Art.100 fixed ceilings):
  Prohibited practices (Art.100(2)): up to €{MAX_FINE_PROHIBITED:,}
  Other obligations (Art.100(3)):    up to €{MAX_FINE_OTHER:,}
  This system's applicable ceiling:  €{self.max_fine():,}
CLOUD ACT RISK: {self.cloud_act_risk()}
FRIA Required: {'YES (Annex III)' if self.is_high_risk_annex_iii else 'NO'}
"""


def assess_eu_institution_portfolio(systems: list[Art100RiskAssessment]) -> str:
    """Portfolio-level Art.100 readiness summary for an EU institution."""
    high_risk = [s for s in systems if s.is_high_risk_annex_iii]
    prohibited_risk = [s for s in systems if s.is_prohibited_practice]
    transition = [s for s in systems if s.is_in_legacy_transition()]
    cloud_risk = [s for s in systems if s.cloud_act_risk().startswith("HIGH")]
    return f"""
=== EU Institution AI Portfolio: Art.100 Assessment ===
Total AI Systems: {len(systems)}
High-Risk (Annex III): {len(high_risk)}
Prohibited Practice Risk: {len(prohibited_risk)}
In Art.111(2) Transition: {len(transition)}
High CLOUD Act Risk: {len(cloud_risk)}
Max Fine Per Infringement:
  Art.5 prohibited practices: €{MAX_FINE_PROHIBITED:,}
  Other obligations: €{MAX_FINE_OTHER:,}
Enforcement Authority: EDPS
Appeals: CJEU
"""
Art.100 vs Art.99: Key Operational Differences
| Dimension | Art.99 (Private Operators) | Art.100 (EU Institutions) |
|---|---|---|
| Competent authority | National Market Surveillance Authority | European Data Protection Supervisor (EDPS) |
| Fine ceilings | €35M / 7% worldwide turnover (whichever higher) | Fixed caps: €1.5M / €750,000 |
| Appeal path | National courts → CJEU | CJEU directly |
| Transitional regime | Art.111(2): legacy systems in scope only after significant design changes | Art.111(2): legacy public-authority systems must comply by August 2, 2030 |
| Supervisory style | Varies by Member State | Centralized, EDPS institutional culture |
| Interinstitutional coordination | Via AI Board | Via AI Board + Interinstitutional AI Committee |
| GPAI provider overlap | GPAI provider → Art.101 AI Office | GPAI deployer → Art.100 EDPS |
25-Item EU Institutional AI Compliance Checklist
EDPS Supervisory Readiness:
- 1. AI system inventory completed: all AI systems in use catalogued with Annex III classification status
- 2. High-risk systems registered in EU AI Database (Art.71) before deployment
- 3. Art.5 prohibited practice screening completed for all AI systems
- 4. Art.100 enforcement point of contact designated within institution
Art.26 Deployer Obligations (for externally procured systems):
- 5. Instructions for use reviewed and deployment scoped to intended purpose
- 6. Human oversight measures implemented as specified by provider documentation
- 7. Logging and monitoring infrastructure in place for high-risk systems
- 8. Fundamental Rights Impact Assessment (FRIA) completed for Annex III systems
- 9. Record of high-risk AI system use maintained (Art.26(9))
- 10. Deployer obligations for GPAI-based systems documented (Annex III + Chapter V)
Art.26(7) FRIA Documentation:
- 11. FRIA scope defined: fundamental rights potentially affected identified
- 12. Provider technical documentation sufficiency assessed for FRIA inputs
- 13. FRIA completed and signed off before system go-live
- 14. FRIA stored and available for EDPS review
Art.110 Transition Management:
- 15. Pre-existing systems (deployed before August 2, 2026) inventoried for legacy-transition eligibility (Art.111(2))
- 16. Significant-design-change policy adopted — a significant change ends the legacy grace
- 17. Transition timeline tracked: legacy public-authority systems must comply by August 2, 2030
- 18. New procurements after August 2, 2026 identified as not eligible for the legacy transition
Procurement and Vendor Management:
- 19. AI procurement contracts include technical documentation completeness guarantees
- 20. EDPS audit cooperation clauses included in vendor contracts
- 21. Incident notification SLAs from vendor meet Art.73 timelines (no later than 15 days after awareness; shorter deadlines for deaths and widespread incidents)
- 22. FRIA information package contractually required from AI providers
Infrastructure and CLOUD Act:
- 23. Cloud infrastructure jurisdiction assessed for each high-risk AI system
- 24. US-incorporated infrastructure providers identified — CLOUD Act exposure documented
- 25. EU-native infrastructure procurement preference policy in place for sensitive deployments
See Also
- EU AI Act Art.99: Administrative Fines — €35M/7% Three-Tier Structure for Operators
- EU AI Act Art.101: Administrative Fines for GPAI Providers — AI Office Enforcement
- EU AI Act Art.26: Deployer Obligations — Instructions for Use, Monitoring, Incident Reporting
- EU AI Act Art.58: NCA Powers — Market Surveillance, Investigation, and Sanctions
- EU AI Act Art.111: AI Systems Already Placed on the Market — Transitional Provisions