2026-04-14·12 min read·sota.io team

EU AI Act Art.100: Administrative Fines for Union Institutions — EDPS Enforcement Developer Guide (2026)

The EU AI Act's enforcement architecture is built around three parallel tracks. Article 99 covers private sector entities — high-risk AI providers, deployers, importers, and distributors — with national market surveillance authorities as the enforcement arm. Article 101 covers general-purpose AI model providers specifically, with the AI Office as the enforcement arm. And Article 100 covers a third category that could easily be overlooked: EU institutions, bodies, offices, and agencies themselves.

When the European Commission deploys a high-risk AI system for internal use, when Frontex uses an AI system for border surveillance, when the European Banking Authority builds AI tools for market monitoring, or when Europol uses AI for law enforcement support — those deployments are not exempt from the AI Act. Article 100 ensures that EU institutions face the same compliance obligations as the private sector, and designates the European Data Protection Supervisor (EDPS) as the competent supervisory authority in place of national MSAs.

For developers selling AI systems to EU institutions, Art.100 is directly commercially relevant. Your EU institutional customer faces EDPS enforcement for AI Act violations, which means their procurement requirements and compliance due diligence will reflect those obligations. Understanding Art.100 is understanding the compliance framework your government customers operate under.

What Article 100 Actually Says

Article 100 is structurally economical — rather than duplicating the full Art.99 penalty schedule, it designates a supervisor and sets out a compact fine regime of its own. The core mechanism operates in three layers:

Layer 1 — EDPS as Competent Authority: For Union institutions, bodies, offices, and agencies falling within the scope of the AI Act, the EDPS is designated as the competent supervisory authority. The EDPS exercises the supervisory and enforcement functions that national market surveillance authorities exercise under Art.74 for the private sector.

Layer 2 — A Dedicated Fine Framework: The EDPS is empowered to impose administrative fines on Union institutions under conditions modeled on Art.99 — the same style of calculation factors, mitigating and aggravating circumstances, and procedural protections — but with fine ceilings set directly in Art.100 itself rather than imported from Art.99.

Layer 3 — Coordination with National Authorities: Where enforcement requires coordination between the EDPS and national competent authorities — for example, when an EU agency deploys an AI system that also has Member State operational impact — the EDPS coordinates with relevant national authorities using the frameworks established in Chapter VII.
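The three-track routing described above can be captured in a small lookup — a sketch with illustrative names, not terminology from the Act:

```python
from enum import Enum

class EntityKind(Enum):
    """Illustrative entity categories for enforcement routing."""
    PRIVATE_OPERATOR = "private_operator"   # providers, deployers, importers, distributors
    EU_INSTITUTION = "eu_institution"       # Union institutions, bodies, offices, agencies
    GPAI_PROVIDER = "gpai_provider"         # general-purpose AI model providers

# Entity kind -> (fine article, enforcement authority), per the three parallel tracks
ENFORCEMENT_TRACK = {
    EntityKind.PRIVATE_OPERATOR: ("Art.99", "National market surveillance authority"),
    EntityKind.EU_INSTITUTION: ("Art.100", "EDPS"),
    EntityKind.GPAI_PROVIDER: ("Art.101", "AI Office"),
}

def enforcement_track(kind: EntityKind) -> tuple[str, str]:
    """Return the (fine article, authority) pair that applies to an entity kind."""
    return ENFORCEMENT_TRACK[kind]
```

For example, `enforcement_track(EntityKind.EU_INSTITUTION)` returns `("Art.100", "EDPS")`.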

Which EU Institutions Are Covered

Article 100 applies to the full range of Union institutions, bodies, offices, and agencies. This is a broad category that encompasses:

EU Institutions (Treaty-Based): the European Commission, the European Parliament, and the Council of the European Union, for example.

EU Bodies and Offices: the European Investment Bank, the European Central Bank, the Fundamental Rights Agency, the EUIPO, and the EPPO, for example.

EU Agencies with High-Risk AI Exposure: Frontex, Europol, the EU Agency for Asylum (EUAA), the European Medicines Agency (EMA), and the European Union Aviation Safety Agency (EASA).

Why Agencies Matter Most: The operational agencies — Frontex, Europol, EUAA — are the EU institutions with the highest likelihood of deploying AI systems that fall into Annex III high-risk categories. Frontex's border management tools, Europol's law enforcement analytics, and EUAA's asylum processing tools all touch Annex III categories 6 (law enforcement), 7 (migration and asylum), and 1 (biometric identification). These agencies are the primary targets of Art.100 enforcement in practice.

The EDPS as Art.100 Supervisory Authority

The European Data Protection Supervisor is not a new institution created by the AI Act. The EDPS has supervised personal data processing by EU institutions since 2004 — originally under Regulation (EC) No 45/2001 and today under Regulation (EU) 2018/1725, the EU institutions' equivalent of the GDPR. Designating the EDPS as the AI Act supervisory authority for EU institutions is architecturally coherent: the EDPS already has supervisory access to EU institutional operations, audit powers, and institutional relationships that national MSAs do not.

EDPS Enforcement Powers Under Art.100:

The EDPS exercises supervisory powers equivalent to those national market surveillance authorities hold under Chapter IX (Art.74 and following): requesting documentation and access to data, conducting audits and investigations, ordering corrective measures, restricting or prohibiting the use of a non-compliant system, and imposing administrative fines.

EDPS Supervision vs GDPR Supervision:

A critical practical point for developers: the EDPS exercises two distinct supervisory roles simultaneously for EU institutions. Under Regulation 2018/1725, the EDPS supervises personal data processing. Under Art.100 of the AI Act, the EDPS supervises AI system compliance. For high-risk AI systems that process personal data — which is most of them — both regulatory frameworks apply simultaneously. An EDPS audit of an EU institutional AI system will assess both AI Act compliance and Regulation 2018/1725 compliance in the same review.

This dual supervision means that EU institutional AI deployments face a higher compliance documentation burden than comparable private sector deployments, because they must satisfy both the AI Act's technical documentation requirements (Annex IV) and the EDPS's data protection impact assessment requirements under Regulation 2018/1725.
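The dual documentation burden can be sketched as the union of two requirement sets — the document labels here are illustrative shorthand, not official filing names:

```python
# Illustrative document sets for a dual EDPS review; labels are shorthand, not exhaustive.
AI_ACT_DOCS = {
    "annex_iv_technical_documentation",
    "conformity_assessment_record",
    "post_market_monitoring_plan",
}
REG_2018_1725_DOCS = {
    "dpia",
    "records_of_processing",
    "legal_basis_analysis",
}

def edps_review_scope(processes_personal_data: bool) -> set[str]:
    """Documents in scope for an EDPS review: AI Act documentation always applies;
    Regulation 2018/1725 documentation applies when personal data is processed."""
    scope = set(AI_ACT_DOCS)
    if processes_personal_data:
        scope |= REG_2018_1725_DOCS
    return scope
```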

Fine Structure Under Art.100

The Art.100 fine framework mirrors Art.99, with the EDPS exercising the same discretion as national MSAs in the private sector enforcement context.

Fine Tiers (Art.100):

Violation Category | Maximum Fine
Prohibited practices (Article 5) | €1,500,000
Any other requirement or obligation under the AI Act | €750,000

Flat Caps, Not Turnover or Budget Percentages: Unlike Art.99, which caps fines at the higher of a fixed amount or a percentage of worldwide annual turnover, Art.100 sets flat ceilings: €1,500,000 for prohibited practices and €750,000 for any other non-compliance. The institution's annual budget is not a cap multiplier — it is one of the calculation factors the EDPS weighs under Art.100(1). Fines at these levels are material but not existential for an institution the size of the Commission; the more consequential enforcement outcomes in practice are corrective orders, operational restrictions, and the reputational and compliance pressure that a published EDPS fine creates.

Art.100 Calculation Factors: Art.100(1) lists the circumstances the EDPS must weigh, closely mirroring the Art.99 factors: the nature, gravity, and duration of the infringement and its consequences; the degree of responsibility of the institution, taking into account the technical and organisational measures it implemented; any action taken to mitigate the damage; the degree of cooperation with the EDPS; any similar previous infringements; the manner in which the infringement became known to the EDPS; and the annual budget of the institution concerned.
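As a toy illustration of how mitigating and aggravating factors pull in opposite directions, here is a naive scoring sketch — the weights and factor names are invented for illustration and are not an EDPS methodology:

```python
# Invented weights: positive = aggravating, negative = mitigating. Illustrative only.
FACTOR_WEIGHTS = {
    "grave_or_prolonged_infringement": +3,
    "similar_previous_infringement": +2,
    "infringement_concealed": +2,
    "cooperation_with_edps": -2,
    "prompt_mitigation_action": -2,
    "self_reported_to_edps": -1,
}

def fine_pressure(present_factors: set[str]) -> int:
    """Sum the weights of the factors present; higher = more enforcement pressure."""
    unknown = present_factors - FACTOR_WEIGHTS.keys()
    if unknown:
        raise ValueError(f"unknown factors: {sorted(unknown)}")
    return sum(FACTOR_WEIGHTS[f] for f in present_factors)
```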

Art.100 vs Art.99 vs Art.101: The Complete Enforcement Matrix

Dimension | Art.99 | Art.100 | Art.101
Enforcement authority | National MSA | EDPS | AI Office
Subject entities | Private sector providers, deployers, importers | EU institutions, bodies, agencies | GPAI model providers
Primary obligations | Arts. 9–15 (high-risk AI requirements) | Arts. 9–15 (same, but for institutional use) | Art.53 (GPAI transparency), Art.55 (systemic risk)
Fine ceilings | €35M or 7% of turnover | €1.5M / €750k flat caps | €15M or 3% of turnover
Appeal path | National courts | Court of Justice of the EU | General Court of the EU
Coordination mechanism | Joint MSA coordination (Art.74) | EDPS + national authorities | AI Office + national authorities (Art.88)
Mitigation pathway | Art.94 commitments | EDPS corrective orders + compliance plans | Art.94 commitments

The Institutional Gap That Art.100 Closes:

Without Art.100, EU institutions using AI systems would face a perverse asymmetry: private companies deploying the same AI would be subject to national MSA enforcement, but EU institutions would face no equivalent enforcement mechanism. Art.100 eliminates this gap explicitly. An EU institution cannot invoke institutional status, sovereign immunity, or inter-institutional comity to avoid AI Act compliance. The EDPS is empowered to enforce the same compliance requirements with the same investigative tools and the same ultimate penalty authority.

What Art.100 Means If You Sell AI to EU Institutions

For developers and technology companies that sell AI systems to EU institutions, Art.100 creates a specific commercial and compliance dynamic that differs from selling to private sector customers.

Procurement Due Diligence: EU institutions subject to Art.100 enforcement will require their AI suppliers to demonstrate AI Act compliance as part of procurement. This means technical documentation in Annex IV format, conformity assessments for Annex III systems, EU declarations of conformity, and clear evidence of post-market monitoring obligations being met. Government AI procurement contracts will increasingly include AI Act compliance warranties, audit rights, and liability clauses tied to Art.100 enforcement exposure.

Provider vs Deployer Obligations: When you sell an AI system to an EU institution, the institution typically acts as the deployer under Art.3(4). As provider, you retain the obligations in Art.16 — including technical documentation, conformity assessment, CE marking, and the EU database registration for high-risk systems. The EU institution as deployer has the Art.26 obligations — appropriate use, human oversight, monitoring, data governance. Art.100 enforcement by the EDPS focuses on the institution's deployer obligations, but your provider obligations remain subject to national MSA enforcement (or AI Office enforcement if you're also a GPAI model provider).
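A minimal sketch of that split — which obligations sit with which party, and which authority pursues a breach — using shorthand labels that are illustrative rather than quotations from the Act:

```python
# Illustrative, non-exhaustive obligation split between provider and deployer.
PROVIDER_OBLIGATIONS = {       # yours, as the system provider (Art.16)
    "annex_iv_technical_documentation",
    "conformity_assessment",
    "ce_marking",
    "eu_database_registration",
}
DEPLOYER_OBLIGATIONS = {       # the EU institution's, as deployer (Art.26)
    "use_per_instructions",
    "human_oversight",
    "operation_monitoring",
    "input_data_governance",
}

def enforcing_authority(role: str) -> str:
    """Which authority pursues a breach, per the split described above."""
    if role == "deployer":     # EU institution -> EDPS under Art.100
        return "EDPS"
    if role == "provider":     # external supplier -> national MSA (AI Office if GPAI)
        return "National MSA"
    raise ValueError(f"unknown role: {role!r}")
```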

Contractual Risk Allocation: EU institutional procurement contracts increasingly include compliance representations tied to AI Act obligations. If the EU institution faces an EDPS investigation triggered in part by your failure to provide adequate technical documentation or post-market monitoring support as the system provider, you may face contractual liability even though Art.100 directly applies to the institution rather than to you.

EDPS As Technical Interlocutor: Unlike many national MSAs that are primarily legal and regulatory bodies, the EDPS has technical expertise in data protection by design and privacy engineering that extends into AI systems. EDPS investigations under Art.100 are likely to be technically substantive — reviewing model documentation, validation methodologies, bias testing, and data governance — rather than primarily procedural. Developers selling to EU institutions should expect technically rigorous compliance assessments.

The CLOUD Act Intersection for EU Institutional AI

EU institutions using US-hosted cloud infrastructure for their AI systems face the same CLOUD Act conflict as private sector entities — but with an additional institutional dimension.

The EDPS Position on US Cloud Services: The EDPS has historically taken a strong position on EU institution use of US cloud services, particularly following the Schrems II invalidation of Privacy Shield and ongoing uncertainty around the EU-US Data Privacy Framework. Under Art.100 enforcement, the EDPS is likely to scrutinize AI systems hosted on US infrastructure for both AI Act compliance documentation integrity and data protection compliance — creating compounded exposure.

CLOUD Act → Art.100 Enforcement Chain:

  1. EU institution deploys high-risk AI system with training data and model weights on US cloud infrastructure
  2. EDPS issues Art.100 investigation requesting technical documentation and access to model records
  3. Simultaneously, US government issues CLOUD Act order to the US cloud provider
  4. EU institution faces conflicting obligations: comply with the EDPS request (EU law) vs. comply with the CLOUD Act order (US extraterritorial jurisdiction)
  5. Any inability to provide the EDPS with full access to the requested documentation may itself be treated as non-compliance enforceable under Art.100

EU-sovereign infrastructure eliminates this conflict. For EU institutions deploying AI systems, using infrastructure that is not reachable by CLOUD Act orders — such as EU-only infrastructure with no US parent company — eliminates the dual-jurisdiction problem and simplifies Art.100 compliance documentation maintenance.
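The jurisdiction test at the heart of that chain can be sketched as follows — the category strings mirror the `cloud_jurisdiction` values used in the tooling section below, and the reachability logic is a deliberate simplification of the CLOUD Act's "possession, custody, or control" analysis:

```python
def cloud_act_reachable(jurisdiction: str) -> bool:
    """True if a CLOUD Act order can plausibly reach the hosting provider.

    Simplified: any provider with a US parent is treated as reachable, since the
    CLOUD Act turns on the provider's possession, custody, or control of data,
    not on where the data is physically stored.
    """
    return jurisdiction in {"us_hosted", "us_hyperscaler_eu_region"}

def dual_obligation_conflict(jurisdiction: str, edps_request_open: bool) -> bool:
    """A dual-jurisdiction conflict arises only when an EDPS documentation
    request is open AND the infrastructure is CLOUD Act-reachable."""
    return edps_request_open and cloud_act_reachable(jurisdiction)
```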

Python Tooling for Art.100 Compliance Tracking

from dataclasses import dataclass, field
from typing import Optional
from enum import Enum
from datetime import date

class EUInstitutionType(Enum):
    COMMISSION = "commission"
    PARLIAMENT = "parliament"
    COUNCIL = "council"
    AGENCY = "agency"          # Frontex, Europol, EMA, EASA, etc.
    BODY = "body"              # EIB, ECB, FRA, etc.
    OFFICE = "office"          # EUIPO, EPPO, etc.

class Art100EnforcementTrack(Enum):
    NOT_APPLICABLE = "not_applicable"   # System not in scope of the AI Act
    STANDARD = "standard"              # High-risk AI obligations — EUR 750k cap tier
    PROHIBITED = "prohibited"          # Article 5 prohibited practice — EUR 1.5M cap tier
    INFORMATION = "information"        # Incorrect/misleading information to EDPS — EUR 750k cap tier

@dataclass
class Art100ComplianceProfile:
    """Model Art.100 compliance status for an EU institutional AI deployment."""
    
    institution_name: str
    institution_type: EUInstitutionType
    ai_system_name: str
    annex_iii_category: Optional[str]  # e.g., "6a - law enforcement biometric"
    is_high_risk: bool
    
    # Infrastructure
    cloud_provider: str
    cloud_jurisdiction: str  # "eu_only", "us_hyperscaler_eu_region", "us_hosted"
    
    # Documentation status
    annex_iv_documentation_complete: bool = False
    conformity_assessment_complete: bool = False
    eu_declaration_of_conformity: bool = False
    eu_database_registered: bool = False
    
    # Operational status
    human_oversight_implemented: bool = False
    post_market_monitoring_active: bool = False
    edps_notification_filed: bool = False
    
    # DPIA under Regulation 2018/1725
    dpia_completed: bool = False
    dpia_date: Optional[date] = None
    
    def annual_budget_reference(self) -> str:
        """Annual budget — an Art.100(1) fine calculation factor, not a cap metric."""
        budget_references = {
            EUInstitutionType.COMMISSION: "EUR ~185B (EU Commission annual budget)",
            EUInstitutionType.PARLIAMENT: "EUR ~2.3B (Parliament annual budget)",
            EUInstitutionType.AGENCY: "EUR varies (agency-specific operating budget)",
            EUInstitutionType.BODY: "EUR varies (body-specific budget)",
            EUInstitutionType.COUNCIL: "EUR ~600M (Council Secretariat budget)",
            EUInstitutionType.OFFICE: "EUR varies (office-specific budget)",
        }
        return budget_references.get(self.institution_type, "EUR varies")
    
    def cloud_act_risk(self) -> str:
        """Assess CLOUD Act conflict risk for Art.100 documentation access."""
        if self.cloud_jurisdiction == "eu_only":
            return "LOW — EU-only infrastructure, no CLOUD Act reach"
        elif self.cloud_jurisdiction == "us_hyperscaler_eu_region":
            return "ELEVATED — US parent company, CLOUD Act potentially reaches EU-region data"
        else:
            return "HIGH — US-hosted infrastructure, CLOUD Act directly applicable"
    
    def compliance_gaps(self) -> list[str]:
        """Identify Art.100 compliance gaps for this institutional deployment."""
        gaps = []
        if self.is_high_risk:
            if not self.annex_iv_documentation_complete:
                gaps.append("Annex IV technical documentation incomplete — Art.16(a) violation risk")
            if not self.conformity_assessment_complete:
                gaps.append("Conformity assessment not completed — Art.16(d) violation risk")
            if not self.eu_declaration_of_conformity:
                gaps.append("EU Declaration of Conformity not issued — Art.48 violation risk")
            if not self.eu_database_registered:
                gaps.append("EU database registration missing — Art.49 violation risk")
            if not self.human_oversight_implemented:
                gaps.append("Human oversight not configured — Art.14 violation risk")
            if not self.post_market_monitoring_active:
                gaps.append("Post-market monitoring not active — Art.72 violation risk")
        if not self.dpia_completed and self.is_high_risk:
            gaps.append("DPIA under Regulation 2018/1725 not completed — compounded EDPS exposure")
        if self.cloud_jurisdiction != "eu_only":
            gaps.append(f"Cloud jurisdiction: {self.cloud_act_risk()}")
        return gaps
    
    def edps_enforcement_risk(self) -> str:
        """Overall Art.100 enforcement risk assessment."""
        gap_count = len(self.compliance_gaps())
        if gap_count == 0:
            return "LOW — Full Art.100 compliance"
        elif gap_count <= 2:
            return f"MEDIUM — {gap_count} gap(s) identified, correctable"
        else:
            return f"HIGH — {gap_count} gap(s) identified, material EDPS enforcement exposure"


@dataclass
class Art100InstitutionalAudit:
    """Run Art.100 compliance audit across multiple EU institutional AI deployments."""
    
    institution: str
    audit_date: date
    systems: list[Art100ComplianceProfile] = field(default_factory=list)
    
    def high_risk_systems(self) -> list[Art100ComplianceProfile]:
        return [s for s in self.systems if s.is_high_risk]
    
    def systems_with_gaps(self) -> list[Art100ComplianceProfile]:
        return [s for s in self.systems if s.compliance_gaps()]
    
    def cloud_act_exposed_systems(self) -> list[Art100ComplianceProfile]:
        return [s for s in self.systems 
                if s.cloud_jurisdiction != "eu_only"]
    
    def audit_summary(self) -> dict:
        return {
            "institution": self.institution,
            "audit_date": str(self.audit_date),
            "total_systems": len(self.systems),
            "high_risk_count": len(self.high_risk_systems()),
            "systems_with_gaps": len(self.systems_with_gaps()),
            "cloud_act_exposed": len(self.cloud_act_exposed_systems()),
            "overall_risk": "HIGH" if len(self.systems_with_gaps()) > len(self.systems) // 2 else "MEDIUM" if self.systems_with_gaps() else "LOW"
        }

Art.100 in the Context of Art.27 — Fundamental Rights Impact

Article 27 requires deployers of high-risk AI systems to conduct a fundamental rights impact assessment before deployment in certain public sector contexts. For EU institutions deploying Annex III AI systems — particularly Frontex (border management), Europol (law enforcement), and EUAA (asylum processing) — Art.27 obligations compound Art.100 enforcement exposure.

An EU institution that deploys a high-risk AI system without completing the Art.27 fundamental rights impact assessment is simultaneously:

  1. In breach of its deployer obligations under the AI Act, creating direct EDPS enforcement exposure under Art.100
  2. Where the system processes personal data, exposed to parallel EDPS scrutiny under Regulation 2018/1725, since the DPIA and the fundamental rights assessment draw on overlapping documentation

For developers selling AI to Frontex, Europol, or EUAA, the Art.27 fundamental rights assessment requirement means your technical documentation must be sufficiently detailed to support your institutional customer's impact assessment. Incomplete Annex IV documentation that prevents the institution from completing its Art.27 assessment is a supply chain problem that flows back to your provider obligations.

The 30-Item Art.100 Institutional AI Compliance Checklist

EDPS Supervisory Scope (Items 1–5)

  1. Confirm your EU institution or agency falls within Art.100 scope (virtually all EUIs do)
  2. Identify all AI systems currently deployed or planned for deployment — classify each against Annex III
  3. Determine which systems are high-risk — document the classification rationale in writing
  4. Assign a designated AI Act compliance officer with direct EDPS liaison responsibility
  5. Establish a register of all AI systems in use (pre-condition for EDPS audit readiness)

Provider Obligations — If Procuring External AI Systems (Items 6–10)

  6. Verify that AI system providers have completed Annex IV technical documentation
  7. Confirm conformity assessments are completed and EU declarations of conformity are available
  8. Check CE marking presence on all applicable high-risk AI systems
  9. Verify EU database registration for high-risk systems under Art.49
  10. Include AI Act compliance warranties and audit rights in procurement contracts

Deployer Obligations — Operational Compliance (Items 11–17)

  11. Implement human oversight measures per Art.14 for all high-risk AI systems
  12. Configure the input monitoring required by Art.26(5) — detect and flag out-of-scope inputs
  13. Establish post-market monitoring under Art.72 — define metrics, collection methods, review cycle
  14. Document the intended purpose of each AI system in alignment with the provider's instructions
  15. Train staff who operate or oversee high-risk AI systems (Art.26(6) training obligation)
  16. Implement logging and audit trail capabilities for all high-risk AI decision outputs
  17. Establish the serious incident notification pipeline to the EDPS

Fundamental Rights and DPIA (Items 18–21)

  18. Complete Art.27 fundamental rights impact assessment for applicable high-risk deployments
  19. Complete DPIA under Regulation 2018/1725 for AI systems processing personal data
  20. Document the legal basis for personal data processing in each AI system
  21. Ensure data subject rights (access, rectification, objection) are technically implementable

Infrastructure and CLOUD Act Risk (Items 22–25)

  22. Audit cloud infrastructure jurisdiction for all AI systems — identify US-hosted components
  23. For US-hosted AI infrastructure: assess CLOUD Act conflict risk with EDPS documentation requests
  24. Evaluate migration feasibility to EU-sovereign infrastructure for highest-risk systems
  25. Document infrastructure jurisdiction in technical documentation (Annex IV requirement)

EDPS Engagement and Audit Readiness (Items 26–30)

  26. Establish proactive EDPS engagement — consider prior consultation for novel or high-risk deployments
  27. Create an Art.100 investigation response playbook — who speaks to the EDPS, what documents are produced
  28. Review historical EDPS opinions on EU institutional AI use for compliance signals
  29. Implement internal compliance review cycle aligned with the EDPS supervisory calendar
  30. Monitor EDPS enforcement actions against other EU institutions — extract cross-institutional learnings
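The 30 items above can be tracked with a minimal completion ledger — a sketch, not an official template; the section names are abbreviated from the headings above:

```python
from dataclasses import dataclass, field

@dataclass
class ChecklistSection:
    """One section of the 30-item checklist, tracked by item number."""
    name: str
    items: range                     # e.g. range(1, 6) covers items 1-5
    done: set[int] = field(default_factory=set)

    def complete(self, item: int) -> None:
        if item not in self.items:
            raise ValueError(f"item {item} is not in section {self.name!r}")
        self.done.add(item)

    def progress(self) -> float:
        return len(self.done) / len(self.items)

SECTIONS = [
    ChecklistSection("EDPS supervisory scope", range(1, 6)),
    ChecklistSection("Provider obligations", range(6, 11)),
    ChecklistSection("Deployer obligations", range(11, 18)),
    ChecklistSection("Fundamental rights and DPIA", range(18, 22)),
    ChecklistSection("Infrastructure and CLOUD Act risk", range(22, 26)),
    ChecklistSection("EDPS engagement and audit readiness", range(26, 31)),
]
```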

Why Art.100 Matters Beyond EU Institutions

Article 100 matters to any developer working in the government AI space — not just those building specifically for EU institutions. The Art.100 enforcement framework shapes:

Procurement requirements cascade: EU institutional compliance requirements under Art.100 translate directly into supplier obligations. If Europol must demonstrate full AI Act compliance to the EDPS, Europol's AI suppliers must provide the documentation Europol needs to achieve that compliance.

Standards setting: EDPS enforcement opinions and decisions under Art.100 create interpretive precedents that influence how national MSAs enforce Art.99 for the private sector. EDPS guidance on what constitutes adequate technical documentation, valid fundamental rights assessments, and appropriate human oversight becomes de facto industry standard.

CLOUD Act policy formation: EDPS positions on EU institutional cloud use — developed in the context of Art.100 enforcement — feed directly into EU cloud policy, EU institutional procurement rules, and ultimately into the market incentives that make EU-sovereign infrastructure commercially significant.

Article 100 completes the EU AI Act's enforcement architecture. Art.99 covered private actors. Art.101 covered GPAI model providers. Art.100 closes the final gap by ensuring that the EU itself — its agencies, bodies, and institutions — faces the same compliance obligations as the entities it regulates.