2026-04-26·11 min read·sota.io team

EU AI Act Art.100: Penalties for Union Institutions — EDPS Enforcement, Fine Structure, and Procurement Developer Guide (2026)

The EU AI Act establishes three distinct enforcement tracks for administrative penalties: Article 99 for private-sector operators supervised by national market surveillance authorities, Article 101 for GPAI model providers supervised by the AI Office, and Article 100 for EU institutions, bodies, offices, and agencies supervised by the European Data Protection Supervisor (EDPS).

Art.100 matters for two categories of stakeholders. First, it directly affects the 50+ EU institutions and agencies — the Commission, Parliament, Council, Court of Justice, EMA, ECB, FRONTEX, Europol, Eurojust, and dozens more — that operate or deploy AI systems in their daily functions. Second, it structurally affects private-sector AI vendors and developers who sell or deploy AI systems to those institutions, because the EU institution is the deployer subject to EDPS supervision and the vendor's technical choices determine whether that deployer can demonstrate compliance.

This guide covers Art.100's enforcement architecture, fine structure, EDPS powers, procurement implications for AI vendors, the interaction with Art.110's transitional provisions, and how CLOUD Act exposure arises even in purely EU-institutional AI deployments.

What Article 100 Actually Says

Article 100 is structurally concise because it primarily allocates enforcement competence rather than creating new substantive obligations. The substantive obligations (high-risk AI requirements, transparency, human oversight, etc.) are the same as for private operators — the difference is who enforces them.

Art.100(1) — EDPS Competence:

The European Data Protection Supervisor is designated as the competent authority for supervising EU institutions, bodies, offices, and agencies in their capacity as operators of AI systems covered by the AI Act. This mirrors the EDPS's existing competence under Regulation (EU) 2018/1725 (the EU's "GDPR equivalent" for Union institutions) — the AI Act simply extends that mandate to AI supervision.

The EDPS competence covers EU institutions acting as:

- deployers of AI systems procured from external providers, and
- providers (or provider-deployers) of AI systems they build for their own or inter-institutional use.

Where an EU institution procures an off-the-shelf AI system from a private-sector provider, that private provider remains subject to Art.99 NCA enforcement for provider obligations. The EU institution deployer is subject to Art.100 EDPS enforcement for deployer obligations.

Art.100(2) — Fine Structure:

The fine tiers mirror Art.99 exactly:

| Tier | Violation | Maximum Fine |
| --- | --- | --- |
| Tier 1 | Art.5 prohibited practices | €35,000,000 or 7% of total annual budget |
| Tier 2 | Other AI Act obligations | €15,000,000 or 3% of total annual budget |
| Tier 3 | Misleading the EDPS | €7,500,000 or 1.5% of total annual budget |

The turnover calculation shifts from "worldwide annual turnover" (Art.99) to "total annual budget" — the relevant financial base for institutions that operate on budgetary appropriations rather than commercial revenue. For context, the European Commission's 2026 operating budget exceeds €180 billion; the "7% of total annual budget" construct means that, at least in principle, the fine ceiling for a major violation could be enormous, though EDPS enforcement practice will likely apply the absolute cap in most cases.
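The ceiling arithmetic implied by the table (the greater of the absolute amount and the budget percentage) can be sketched in a few lines; the €200M and €180B budget figures below are illustrative, not official:

```python
# Illustrative Art.100(2) ceiling arithmetic: the cap is the greater of
# the absolute amount and the percentage of the institution's budget.

def tier1_ceiling(annual_budget_eur: float) -> float:
    """Tier 1 (Art.5 prohibited practices): €35M or 7% of annual budget."""
    return max(35_000_000.0, annual_budget_eur * 0.07)


def tier2_ceiling(annual_budget_eur: float) -> float:
    """Tier 2 (other AI Act obligations): €15M or 3% of annual budget."""
    return max(15_000_000.0, annual_budget_eur * 0.03)


# Small agency, €200M budget: 7% is €14M, so the €35M absolute figure governs.
small_cap = tier1_ceiling(200_000_000)

# Commission-scale €180B budget: 7% is €12.6B, dwarfing the absolute figure.
large_cap = tier1_ceiling(180_000_000_000)
```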

Art.100(3) — EDPS Cooperation:

The EDPS must cooperate with the AI Board under Art.65 and with national competent authorities in cases involving AI systems that EU institutions use jointly with private operators or that have cross-institutional implications. This cooperation mechanism prevents regulatory gaps when an AI system spans both EU-institutional and private-sector deployment.

Art.100(4) — CJEU Jurisdiction:

EDPS decisions imposing fines under Art.100 are subject to review by the Court of Justice of the EU — the same jurisdictional framework as other EDPS enforcement actions. Unlike Art.99 fines (which go through national administrative and judicial procedures with country-by-country variation), Art.100 fines have a single, unified appellate path.

The Art.100 vs Art.99 vs Art.101 Enforcement Triangle

Understanding which article applies requires mapping the operator type and AI system category:

┌─────────────────────────────────────────────────────────────────┐
│                    WHO IS BEING FINED?                          │
├─────────────────────────────────────────────────────────────────┤
│ Private-sector operator (deployer or provider) → Art.99 NCA    │
│ EU institution / body / office / agency → Art.100 EDPS         │
│ GPAI model provider (any entity) → Art.101 AI Office           │
└─────────────────────────────────────────────────────────────────┘

The practical consequences of this split:

For private AI providers selling to EU institutions: The provider's obligations (Art.9 risk management, Art.11 technical documentation, Art.13 transparency, Art.17 QMS) are enforced by the NCA of the provider's establishment (Art.99). The EU institution deployer's obligations (Art.26 deployer duties, the Art.26(7) fundamental rights impact assessment for Annex III systems, Art.61 information to providers) are enforced by the EDPS (Art.100).

For GPAI model providers whose models EU institutions use: The GPAI provider faces Art.101 AI Office enforcement for Chapter V violations regardless of the downstream deployer's identity. If the EU institution deploys a GPAI-based system as a high-risk system, the institution faces EDPS enforcement for Annex III deployer obligations, while the GPAI foundation model provider faces AI Office enforcement for Art.53–55 obligations.

For EU institutions building AI internally: If an EU institution builds and deploys an AI system for internal or inter-institutional use without placing it on the market, the institution is both provider and deployer — all obligations (both provider and deployer tracks) are enforceable by the EDPS alone.
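The three-way split above can be collapsed into a small resolver; the enum and function names here are this sketch's own, not terms from the Act:

```python
from enum import Enum


class EntityType(Enum):
    PRIVATE_OPERATOR = "private_operator"
    EU_INSTITUTION = "eu_institution"


def enforcement_track(entity: EntityType, is_gpai_provider: bool = False) -> str:
    """Map an operator to the article and authority that can fine it."""
    if is_gpai_provider:
        # Chapter V GPAI obligations: AI Office, regardless of who deploys.
        return "Art.101 (AI Office)"
    if entity is EntityType.EU_INSTITUTION:
        # Both provider and deployer duties of an institution fall to the EDPS.
        return "Art.100 (EDPS)"
    return "Art.99 (national competent authority)"
```

Note that a single AI system can put two entities on two tracks at once, e.g. a private provider under Art.99 and its EU-institution deployer under Art.100.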

Which EU Institutions Are Covered

Art.100 applies to all Union institutions, bodies, offices, and agencies within the meaning of the Treaties and EU law. The primary entities include:

Core Treaty Institutions:

- European Commission, European Parliament, Council of the European Union, Court of Justice of the EU, European Central Bank, European Court of Auditors

Regulatory and Supervisory Agencies (EU Agencies):

- European Medicines Agency (EMA), FRONTEX, Europol, Eurojust, and the several dozen other decentralised agencies

Interinstitutional Bodies:

- Bodies serving multiple institutions, such as CERT-EU, the European Personnel Selection Office (EPSO), and the Publications Office

The breadth is important: agencies that are themselves regulators of AI or other technologies are also subjects of Art.100 enforcement when they deploy AI in their own operations.

EDPS Enforcement: Powers and Procedure

The EDPS's AI Act enforcement powers under Art.100 parallel the NCA powers under Art.58 and Art.64, adapted for the interinstitutional context:

Investigation Powers:

- Request technical documentation, logs, datasets, and any other information needed to assess compliance, within set deadlines
- Conduct audits and investigations of deployed AI systems, including on-site access where necessary

Corrective Measures:

- Order corrective action to bring an AI system into compliance within a specified period
- Restrict, suspend, or prohibit the deployment of a non-compliant system
- Impose the administrative fines set out in Art.100(2)

Procedural Protections:

- The institution concerned must be heard before a fine is imposed
- Fine decisions must be reasoned and proportionate, and are reviewable by the CJEU under Art.100(4)

EDPS Enforcement Style in Practice:

The EDPS has historically taken a more advisory and less aggressive enforcement posture compared to some national DPAs — focusing on guidance, recommendations, and prior consultation rather than fines. However, the AI Act creates formal fine-imposition powers that are new for the EDPS, and EU institutions — particularly those deploying high-risk AI systems affecting individuals — should not assume EDPS enforcement will remain light-touch indefinitely.

The EDPS issued its opinion on the AI Act proposal in 2021, advocating for stronger restrictions, and has been actively building AI supervisory capacity since the regulation's entry into force.

EU Procurement Implications for AI Vendors

The single most important Art.100 consequence for private-sector AI developers is that EU institutions deploying your AI system are subject to EDPS enforcement, not NCA enforcement — and that shifts what they need from you contractually and technically.

What EU Institution Deployers Need From AI Providers

EU institutions acting as deployers of high-risk AI systems under Annex III must fulfill Art.26 deployer obligations. Meeting those obligations requires specific technical deliverables from the provider:

Art.26(1) — Appropriate use: EU institutions must use your AI system only as specified in the instructions for use. Your instructions for use documentation must be precise, complete, and scoped to the permitted deployment contexts.

Art.26(2) — Human oversight: EU institutions must implement the human oversight measures you specified in technical documentation. Your system must make those oversight measures technically implementable — not just documented on paper.

Art.26(3) — Monitoring: EU institutions must monitor AI system performance, detect anomalies, and report serious incidents (Art.73) and near-misses. Your system needs logging and monitoring interfaces that the institution can use.

Art.26(7) — Fundamental Rights Impact Assessment (FRIA): EU institutions deploying Annex III Category I–VII high-risk systems must conduct a FRIA before deployment. Your technical documentation must provide enough information about data used, potential biases, and known limitations to enable a meaningful FRIA.

Art.26(9) — Record of use: EU institutions must keep records of AI system deployment, including decisions made with AI assistance. Your system must be designed to produce auditable logs at the required level of detail.
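The five Art.26 duties above each presuppose a concrete provider deliverable. A minimal mapping, with deliverable keys invented for this sketch:

```python
# Hypothetical deliverable keys; the Art.26 duty labels follow the text above.
REQUIRED_DELIVERABLES = {
    "Art.26(1) appropriate use": "instructions_for_use",
    "Art.26(2) human oversight": "oversight_measures_spec",
    "Art.26(3) monitoring": "logging_and_monitoring_interface",
    "Art.26(7) FRIA": "fria_information_package",
    "Art.26(9) record of use": "auditable_decision_logs",
}


def undeliverable_duties(provided: set[str]) -> list[str]:
    """Art.26 duties the institution cannot evidence from what the vendor shipped."""
    return [
        duty for duty, deliverable in REQUIRED_DELIVERABLES.items()
        if deliverable not in provided
    ]
```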

Contractual Architecture for EU Institution AI Procurement

The EDPS is unlikely to accept "the vendor didn't give us what we needed" as a defense in an enforcement proceeding. EU institutions will increasingly require AI vendors to contractually guarantee:

  1. Technical documentation completeness: Conformance with Annex IV (and Annex XI for GPAI components)
  2. Ongoing documentation updates: Provider obligations to deliver updated documentation after significant changes
  3. EDPS audit support: Cooperation obligations if the EDPS initiates an investigation into the deployed system
  4. Incident notification: Provider-to-deployer notification procedures that meet Art.73 timelines
  5. FRIA information package: Pre-structured information sets enabling the institution to conduct its Art.26(7) fundamental rights impact assessment

AI vendors targeting the EU institutional market should build these deliverables into their standard enterprise documentation packages — institutions that cannot demonstrate compliance will not be able to renew contracts.
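One way an institution's procurement team might screen a draft contract against the five guarantees above (the clause keys are shorthand invented for this sketch):

```python
# Shorthand keys for the five contractual guarantees listed above.
REQUIRED_CLAUSES = (
    "annex_iv_documentation_completeness",
    "ongoing_documentation_updates",
    "edps_audit_support",
    "art73_incident_notification",
    "fria_information_package",
)


def missing_clauses(draft_contract_clauses: set[str]) -> tuple[str, ...]:
    """Guarantees the draft contract does not yet contain."""
    return tuple(c for c in REQUIRED_CLAUSES if c not in draft_contract_clauses)


def contract_ready(draft_contract_clauses: set[str]) -> bool:
    """True only when every guarantee is covered."""
    return not missing_clauses(draft_contract_clauses)
```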

Art.100 and the Art.110 Transitional Period

Art.110 grants EU institutions, bodies, offices, and agencies a 36-month transition period from the date of full AI Act application — meaning until August 2, 2029 — to bring their existing AI systems into compliance. This transition applies to AI systems already in use before the regulation's full application date of August 2, 2026.

The Art.110 transition is more generous than the Art.108 transition for private operators, which provides only a 2-year grace period for most systems. The additional year for EU institutions reflects the complexity of interinstitutional procurement processes and the need for Interinstitutional AI Committee coordination.

What the Transition Does and Does Not Cover:

The Art.110 transition delays compliance obligations for existing systems — it does not exempt EU institutions from:

- the Art.5 prohibitions, which apply without any grace period; or
- cooperation with EDPS supervision and investigations during the transition period.

If an EU institution initiates a new AI procurement after August 2, 2026, the Art.110 transition does not apply — the procured system must comply with all applicable requirements from day one.

Substantial Modification During the Transition:

Like the private-sector Art.108 transition, the Art.110 transition ends when an AI system undergoes "substantial modification" — a significant change to design, purpose, or behavior that goes beyond ordinary maintenance and updates. EU institutions need clear internal governance to distinguish maintenance from substantial modification during the 2026–2029 transition period.
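The applicability rules in this section (pre-Aug-2026 deployment, no substantial modification, end date of August 2, 2029) reduce to a short predicate; the dates are taken from the text above:

```python
from datetime import date

FULL_APPLICATION = date(2026, 8, 2)   # full AI Act application date
TRANSITION_END = date(2029, 8, 2)     # end of the Art.110 grace period


def in_art110_transition(
    deployed: date,
    today: date,
    substantially_modified: bool = False,
) -> bool:
    """Whether an EU institution's system still enjoys the Art.110 grace period."""
    if deployed >= FULL_APPLICATION:
        return False   # new procurement: must comply from day one
    if substantially_modified:
        return False   # substantial modification ends the transition
    return today < TRANSITION_END
```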

CLOUD Act Exposure in EU Institutional AI Deployments

EU institutions themselves are EU legal entities operating under EU law — they are not directly subject to US jurisdiction claims under the CLOUD Act. However, the infrastructure contractors, cloud providers, and AI platform vendors serving EU institutions frequently are.

The Contractor Exposure Chain:

When an EU institution deploys an AI system hosted on infrastructure provided by a US-incorporated cloud vendor (AWS, Azure, Google Cloud), that vendor is subject to US government access requests under the CLOUD Act for data processed on their infrastructure — including logs, model outputs, training data derivatives, and system documentation generated during EU institution AI deployments.

The EDPS has long taken the position that EU institutions must ensure their data processing meets the data protection requirements of Regulation 2018/1725. AI Act compliance adds a parallel dimension: can an EU institution demonstrate Art.11 technical documentation integrity and Art.12 logging accuracy when the underlying infrastructure is subject to undisclosed US government access requests?

The EU-Native Infrastructure Advantage for EU Institution Procurement:

For AI systems processing sensitive data — Europol law enforcement data, EMA clinical trial information, FRONTEX border management data, ECB supervisory data — EU institutions face heightened scrutiny on infrastructure sovereignty. AI vendors who can demonstrate EU-incorporated infrastructure without US parent entities (and therefore without CLOUD Act exposure) have a compliance architecture advantage in EU institutional procurement:

For AI vendors building EU institutional market strategies, EU-native deployment infrastructure is increasingly a procurement requirement, not just a differentiator.
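A deployment-screening sketch for the exposure chain described above; the vendor model and the EU-country sample are this example's own simplifications, not a legal test:

```python
from dataclasses import dataclass

# Non-exhaustive sample of EU member-state ISO codes, for illustration only.
EU_SAMPLE = {"DE", "FR", "NL", "IE", "BE", "LU", "ES", "IT"}


@dataclass
class InfraVendor:
    name: str
    contracting_entity_country: str   # ISO code of the entity signing the contract
    ultimate_parent_country: str      # ISO code of the corporate parent


def cloud_act_exposure(vendor: InfraVendor) -> str:
    """Rough CLOUD Act screen based on corporate structure, per the text above."""
    if "US" in (vendor.contracting_entity_country, vendor.ultimate_parent_country):
        return "HIGH"     # a US entity in the chain is reachable by US process
    if (vendor.contracting_entity_country in EU_SAMPLE
            and vendor.ultimate_parent_country in EU_SAMPLE):
        return "LOW"      # EU-incorporated with an EU parent
    return "MEDIUM"       # review the full corporate structure
```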

Python Tooling: Art100RiskAssessment

from dataclasses import dataclass
from enum import Enum
from typing import Optional
from datetime import date


class InstitutionType(Enum):
    TREATY_INSTITUTION = "treaty_institution"
    EU_AGENCY = "eu_agency"
    INTERINSTITUTIONAL_BODY = "interinstitutional_body"


class AISystemRole(Enum):
    DEPLOYER = "deployer"           # procured from external provider
    PROVIDER_DEPLOYER = "provider_deployer"  # built and deployed internally
    PROVIDER_ONLY = "provider_only"  # built for other institutions


class ComplianceStatus(Enum):
    COMPLIANT = "compliant"
    IN_PROGRESS = "in_progress"
    NON_COMPLIANT = "non_compliant"
    TRANSITION_PERIOD = "transition_period"  # Art.110 grace
    NOT_APPLICABLE = "not_applicable"


@dataclass
class Art100RiskAssessment:
    institution_name: str
    institution_type: InstitutionType
    ai_system_name: str
    role: AISystemRole
    is_high_risk_annex_iii: bool
    deployment_date: Optional[date]
    is_prohibited_practice: bool = False
    
    # Compliance dimensions
    technical_documentation: ComplianceStatus = ComplianceStatus.IN_PROGRESS
    human_oversight: ComplianceStatus = ComplianceStatus.IN_PROGRESS
    fria_completed: ComplianceStatus = ComplianceStatus.IN_PROGRESS
    logging_monitoring: ComplianceStatus = ComplianceStatus.IN_PROGRESS
    eu_database_registered: ComplianceStatus = ComplianceStatus.IN_PROGRESS
    incident_reporting_ready: ComplianceStatus = ComplianceStatus.IN_PROGRESS
    cloud_jurisdiction: str = "EU-native"
    
    # Class-level constants: left un-annotated on purpose so the dataclass
    # machinery does not treat them as instance fields.
    ART110_TRANSITION_END = date(2029, 8, 2)
    ART100_APPLICATION = date(2026, 8, 2)
    
    def is_in_art110_transition(self) -> bool:
        """System deployed before Aug 2026 is in Art.110 transition period."""
        if self.deployment_date is None:
            return False
        return (
            self.deployment_date < self.ART100_APPLICATION
            and date.today() < self.ART110_TRANSITION_END
        )
    
    def days_until_transition_end(self) -> int:
        return (self.ART110_TRANSITION_END - date.today()).days
    
    def max_fine_tier1(self, annual_budget_eur: float) -> float:
        """Art.100(2) Tier 1: prohibited practices — 7% of annual budget or €35M."""
        return max(annual_budget_eur * 0.07, 35_000_000)
    
    def max_fine_tier2(self, annual_budget_eur: float) -> float:
        """Art.100(2) Tier 2: other obligations — 3% of annual budget or €15M."""
        return max(annual_budget_eur * 0.03, 15_000_000)
    
    def max_fine_tier3(self, annual_budget_eur: float) -> float:
        """Art.100(2) Tier 3: misleading EDPS — 1.5% of annual budget or €7.5M."""
        return max(annual_budget_eur * 0.015, 7_500_000)
    
    def enforcement_authority(self) -> str:
        return "EDPS (European Data Protection Supervisor)"
    
    def appeals_jurisdiction(self) -> str:
        return "Court of Justice of the EU (CJEU)"
    
    def cloud_act_risk(self) -> str:
        if "US" in self.cloud_jurisdiction or "us-" in self.cloud_jurisdiction.lower():
            return "HIGH — infrastructure subject to US CLOUD Act access requests"
        elif "EU-native" in self.cloud_jurisdiction:
            return "LOW — EU-incorporated infrastructure, no CLOUD Act exposure"
        return "MEDIUM — review cloud provider corporate structure"
    
    def compliance_score(self) -> dict:
        dimensions = {
            "technical_documentation": self.technical_documentation,
            "human_oversight": self.human_oversight,
            "fria_completed": self.fria_completed,
            "logging_monitoring": self.logging_monitoring,
            "eu_database_registered": self.eu_database_registered,
            "incident_reporting_ready": self.incident_reporting_ready,
        }
        compliant = sum(
            1 for status in dimensions.values()
            if status == ComplianceStatus.COMPLIANT
        )
        return {
            "score": f"{compliant}/{len(dimensions)}",
            "percentage": round(compliant / len(dimensions) * 100),
            # Report the names of the lagging dimensions, not just their statuses
            "gaps": [
                name for name, status in dimensions.items()
                if status != ComplianceStatus.COMPLIANT
            ],
        }
    
    def generate_edps_readiness_report(self, annual_budget_eur: float) -> str:
        score = self.compliance_score()
        transition_note = ""
        if self.is_in_art110_transition():
            transition_note = (
                f"\n⏳ Art.110 TRANSITION: {self.days_until_transition_end()} days "
                f"remaining (ends {self.ART110_TRANSITION_END})"
            )
        
        return f"""
=== Art.100 EDPS Readiness Report: {self.institution_name} ===
AI System: {self.ai_system_name}
Role: {self.role.value}
Enforcement Authority: {self.enforcement_authority()}
Appeals: {self.appeals_jurisdiction()}

COMPLIANCE SCORE: {score['score']} ({score['percentage']}%)
Gaps: {', '.join(score['gaps']) if score['gaps'] else 'None'}
{transition_note}

FINE EXPOSURE (annual budget: €{annual_budget_eur:,.0f}):
  Tier 1 (prohibited practices): up to €{self.max_fine_tier1(annual_budget_eur):,.0f}
  Tier 2 (other obligations):    up to €{self.max_fine_tier2(annual_budget_eur):,.0f}
  Tier 3 (misleading EDPS):      up to €{self.max_fine_tier3(annual_budget_eur):,.0f}

CLOUD ACT RISK: {self.cloud_act_risk()}
FRIA Required: {'YES (Annex III)' if self.is_high_risk_annex_iii else 'NO'}
"""


def assess_eu_institution_portfolio(
    systems: list[Art100RiskAssessment],
    annual_budget_eur: float,
) -> str:
    """Portfolio-level Art.100 readiness summary for an EU institution."""
    high_risk = [s for s in systems if s.is_high_risk_annex_iii]
    prohibited_risk = [s for s in systems if s.is_prohibited_practice]
    transition = [s for s in systems if s.is_in_art110_transition()]
    
    cloud_risk = [
        s for s in systems
        if s.cloud_act_risk().startswith("HIGH")
    ]
    
    return f"""
=== EU Institution AI Portfolio: Art.100 Assessment ===
Total AI Systems: {len(systems)}
High-Risk (Annex III): {len(high_risk)}
Prohibited Practice Risk: {len(prohibited_risk)}
In Art.110 Transition: {len(transition)}
High CLOUD Act Risk: {len(cloud_risk)}

Annual Budget: €{annual_budget_eur:,.0f}
Max Institutional Fine Exposure:
  Tier 1: €{max(annual_budget_eur * 0.07, 35_000_000):,.0f}
  Tier 2: €{max(annual_budget_eur * 0.03, 15_000_000):,.0f}

Enforcement Authority: EDPS
Appeals: CJEU
"""

Art.100 vs Art.99: Key Operational Differences

| Dimension | Art.99 (Private Operators) | Art.100 (EU Institutions) |
| --- | --- | --- |
| Competent authority | National Market Surveillance Authority | European Data Protection Supervisor (EDPS) |
| Fine basis | Worldwide annual turnover | Total annual budget |
| Appeal path | National courts → CJEU | CJEU directly |
| Transitional period | Art.108 (2 years for Annex III) | Art.110 (3 years, until 2029) |
| Supervisory style | Varies by Member State | Centralized, EDPS institutional culture |
| Interinstitutional coordination | Via AI Board | Via AI Board + Interinstitutional AI Committee |
| GPAI provider overlap | GPAI provider → Art.101 AI Office | GPAI deployer → Art.100 EDPS |

25-Item EU Institutional AI Compliance Checklist

EDPS Supervisory Readiness:

Art.26 Deployer Obligations (for externally procured systems):

Art.26(7) FRIA Documentation:

Art.110 Transition Management:

Procurement and Vendor Management:

Infrastructure and CLOUD Act:

See Also