2026-04-24·13 min read·sota.io team

EU AI Act Art.57: National Competent Authorities — Designation, Tasks, Independence, and Resources (2026)

EU AI Act Article 57 opens Chapter VI (Governance) by establishing how EU Member States must set up the National Competent Authorities (NCAs) responsible for supervising and enforcing the Regulation at national level. Where Chapters I–V define what the AI Act requires of providers, deployers, and importers, Chapter VI defines who enforces it and how the enforcement apparatus is structured.

For AI developers and infrastructure providers operating in the EU, Art.57 answers a foundational question: which authority supervises your AI system, and under which jurisdiction? The answer depends on whether your system is a general-purpose AI (GPAI) model, a high-risk AI system, or a prohibited-use system — and which Member State your principal place of establishment is in.

Art.57 became applicable on 2 August 2025 as part of the phased entry into force of Regulation (EU) 2024/1689. Member States were required to notify the Commission of their NCA designations by that same date.

For EU infrastructure providers and PaaS operators — including sota.io — Art.57 has direct relevance: operators with EU-based infrastructure operate under a single NCA jurisdiction, avoiding the cross-border enforcement complexity that US-incorporated cloud providers face when regulatory requests arrive from multiple national authorities simultaneously.


Art.57 in the Chapter VI Governance Architecture

Chapter VI establishes the full multi-level governance framework of the AI Act:

Article | Title | Function
Art.57 | National Competent Authorities | Designates who enforces the Act nationally
Art.58 | NCA powers | Defines what NCAs can do (access, inspect, sanction)
Art.59 | AI Board | Coordinates NCAs at EU level; issues guidelines
Art.60 | EU AI database | Public registry of high-risk AI systems
Art.61 | AI Office | Union-level enforcement for GPAI models
Art.62 | Scientific panel | Independent expert body supporting the AI Office
Art.63 | Advisory forum | Industry and civil society input to governance
Art.64 | Confidentiality | Information-handling obligations for authorities
Art.65 | Information sharing | NCA-to-NCA data exchange

Art.57 is the foundation of this architecture: without designated NCAs, the enforcement chain from Art.58 onwards cannot operate. It also defines the GPAI carve-out — the structural reason why the AI Office (Art.61) rather than national NCAs has primary supervisory responsibility for GPAI models.


Art.57(1): Mandatory NCA Designation

Art.57(1) requires each Member State to designate or establish at least one national competent authority for the purpose of ensuring the application and implementation of the AI Act. The designation must identify, at minimum, a notifying authority and a market surveillance authority.

These two functions are structurally distinct:

Function | Scope | Counterpart
Market surveillance | Post-market, in-service AI systems | Art.74-77 (market surveillance procedures)
Notifying authority | Pre-market, conformity assessment body (CAB) accreditation | Art.28-35 (notified bodies + conformity assessment)

Art.57(1) does not require these to be different authorities — it requires at least one NCA, which may hold both mandates.


Art.57(2): Single-Authority Designation

Art.57(2) explicitly permits a Member State to designate one single national competent authority to perform both functions — acting simultaneously as market surveillance authority and notifying authority. This provision reflects the legislative compromise between large Member States (who may prefer functional specialisation) and smaller ones (for whom administrative efficiency matters more).

The single-authority model has tradeoffs:

Factor | Single NCA | Dual NCA
Administrative efficiency | Higher | Lower
Specialisation | Lower | Higher
Cross-function conflicts of interest | Possible | Reduced
Resource requirements | Lower | Higher
Coordination overhead | Lower | Higher

As of August 2025, the majority of Member States opted for a single NCA or a lead-NCA model where one authority coordinates the other. Pure functional separation remains the exception.
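The designation options of Art.57(1)-(2) can be sketched in a few lines of Python. This is an illustrative model only — `NCADesignation` and its field names are hypothetical, and the dual-NCA example uses placeholder authority names:

```python
from dataclasses import dataclass

@dataclass
class NCADesignation:
    """Art.57(1)/(2) sketch: each Member State names at least one NCA;
    a single authority may hold both the notifying and the
    market-surveillance mandate (the single-NCA model)."""
    member_state: str
    notifying_authority: str
    market_surveillance_authority: str

    def is_single_nca(self) -> bool:
        # Art.57(2): both functions vested in one body
        return self.notifying_authority == self.market_surveillance_authority

single = NCADesignation("DE", "Bundesnetzagentur", "Bundesnetzagentur")
dual = NCADesignation("XX", "Authority A", "Authority B")  # hypothetical split model
```

A registry of such records is enough to answer the first due-diligence question — which model a given Member State chose.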


Art.57(3): Designation of Existing Authorities

Art.57(3) permits Member States to designate an existing authority as NCA — the AI Act does not require the creation of a new body. This provision enables Member States to leverage established regulatory institutions:

Data protection authorities (DPAs): Member States with strong DPA traditions (Germany: BfDI + Landesbehörden, Austria: DSB, Netherlands: AP) may assign AI Act market surveillance functions to existing DPAs, reflecting the deep overlap between GDPR compliance and AI system deployment.

Telecommunications regulators: Authorities like Germany's Bundesnetzagentur or France's Arcep have existing market surveillance competence in technology sectors and existing inspector workforces.

Financial regulators: For AI systems in the financial sector (credit scoring, algorithmic trading), Member States may co-designate or coordinate with BaFin (Germany), AMF (France), or FCA-equivalent bodies.

Consumer protection authorities: For AI systems in consumer-facing contexts, national consumer protection bodies may form part of the NCA structure.

Art.57(3) also contains a DPA coordination requirement: when the designated NCA is not a data protection authority, it must coordinate with national DPAs, particularly where AI systems process personal data. This creates a mandatory liaison channel between AI Act and GDPR enforcement.
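One reading of that liaison requirement, reduced to a predicate (the function name is ours, not the Regulation's):

```python
def dpa_coordination_required(nca_is_dpa: bool,
                              processes_personal_data: bool) -> bool:
    """Art.57(3) sketch: an NCA that is not itself a data protection
    authority must coordinate with the national DPA whenever the
    supervised AI system processes personal data."""
    return (not nca_is_dpa) and processes_personal_data
```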

For developers, this means an AI Act inquiry into a system that processes personal data can bring the national DPA into the same proceeding from the outset — the two enforcement tracks are formally linked.


Art.57(4): Independence Requirements

Art.57(4) mandates that NCAs exercise their powers independently, impartially, and without bias. The independence requirement has both structural and operational dimensions:

Structural independence: NCAs must not be subject to instructions from market participants or government departments with regulatory interests in AI. This is analogous to the GDPR requirement for DPA independence under Art.52 GDPR.

Operational independence: NCA decisions — enforcement actions, sanctions, access orders — must not be subject to prior approval from ministries or industry bodies.

Conflict-of-interest rules: NCA staff must refrain from actions incompatible with their duties. The Regulation references occupation restrictions to prevent regulatory capture — NCA personnel may not, during their term, hold positions that create conflicts with NCA oversight responsibilities.

The independence requirement directly shapes the NCA's enforcement credibility. An NCA subject to industry influence cannot credibly enforce the prohibition provisions of Art.5 or the conformity assessment requirements of Art.43.


Art.57(5): Resource Requirements

Art.57(5) requires Member States to provide NCAs with adequate human, technical, and financial resources and infrastructure to effectively fulfil their tasks. This provision was heavily negotiated — industry and civil society observers both flagged that under-resourced NCAs would make the AI Act unenforceable.

The resource requirement is self-updating: NCAs must regularly review and, where necessary, update the size and competences of their human resources to reflect changes and needs. As the AI system landscape evolves, NCAs are required to adapt.
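Art.57(5) requires "regular" review without fixing an interval, so any concrete cadence is a policy choice. A minimal staleness check, assuming an illustrative annual interval:

```python
import datetime

# Assumed cadence — Art.57(5) does not prescribe one; annual is illustrative.
REVIEW_INTERVAL = datetime.timedelta(days=365)

def resource_review_overdue(last_review: datetime.date,
                            today: datetime.date) -> bool:
    """Flag an NCA whose staffing/competence review is stale."""
    return today - last_review > REVIEW_INTERVAL
```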

Key resource dimensions:

Resource type | Requirement | Practical challenge
Human | AI expertise, legal + technical staff | Competitive salary pressure vs. private sector
Technical | Testing infrastructure, model evaluation tools | GPAI evaluation requires GPU access, benchmarking
Financial | Operating budget independent of industry fees | Fee-funded models risk regulatory capture
Infrastructure | Secure data handling, whistleblower channels | Sensitive technical documentation access

Art.57(5) specifically notes that national authorities responsible for data protection oversight — i.e., DPAs — must be provided the necessary resources to effectively carry out their tasks in relation to the AI Act. This is a direct acknowledgment that DPAs cannot simply absorb AI Act oversight within existing GDPR budgets; additional funding is expected.


Art.57(6): Notification to the Commission

Art.57(6) imposed a hard deadline: Member States were required to notify the Commission of their NCA designations by 2 August 2025. Subsequent changes to designations must be notified without undue delay.

The Commission maintains a public, regularly updated list of NCAs. As of 2025-2026, this list is the authoritative source for identifying which authority supervises operators in each Member State and for verifying that a designation has actually been notified.

For developers selecting EU deployment locations, the NCA list is a due-diligence input: early-designating Member States with adequately resourced NCAs signal stronger enforcement credibility, which matters both for compliance certainty and for competitive positioning in regulated sectors.
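The deadline check is mechanical enough to encode. A sketch (function name is ours; `None` stands for a missing notification):

```python
import datetime
from typing import Optional

ART_57_6_DEADLINE = datetime.date(2025, 8, 2)

def notified_on_time(designation_date: Optional[datetime.date]) -> bool:
    """Art.57(6) sketch: a designation is timely if the Commission was
    notified on or before 2 August 2025; None means not yet notified."""
    return designation_date is not None and designation_date <= ART_57_6_DEADLINE
```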


Art.57(7): Single Point of Contact

Art.57(7) requires each Member State to designate a single point of contact (SPoC) for communication with the Commission as regards the AI Act. The SPoC function is distinct from the NCA:

Role | Function
NCA | Enforces the AI Act against operators within its jurisdiction
SPoC | Channels information between the Commission and the Member State

The SPoC is particularly relevant when the Commission requests information about national implementation, statistics on notifications, or clarifications about national NCA decisions. For most operators, direct interaction with the SPoC is rare — it is primarily an inter-institutional channel.


Art.57(8): Cross-Border Coordination

Art.57(8) requires NCAs to regularly exchange information and best practices with each other and, where relevant, consult other national authorities with overlapping competence — data protection authorities, sectoral market surveillance bodies, financial regulators, and consumer protection authorities.

This coordination requirement reflects the structural reality of modern AI deployment: an AI system may simultaneously implicate GDPR (DPA), product safety law (market surveillance), financial services law (financial regulator), and consumer protection law (consumer authority). NCAs cannot enforce in isolation.

The coordination obligation does not create a supervisory hierarchy — each authority retains its own competence. It creates mandatory consultation channels that operators should account for: an enforcement action by the NCA may trigger parallel proceedings by a DPA or consumer authority.

For infrastructure providers, cross-border coordination means that a NCA investigation in one Member State may involve requests to NCAs in other Member States where data processing occurs. EU-incorporated operators with infrastructure in a single jurisdiction have clearer enforcement geography.
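The "enforcement geography" point can be made concrete. A sketch (hypothetical helper, not an official methodology) that collects the NCAs a single investigation could involve:

```python
def involved_authorities(deployment_states: list[str],
                         processing_states: list[str]) -> set[str]:
    """Art.57(8) sketch: one investigation can pull in the NCA of every
    Member State where the system is deployed or its data is processed."""
    return {f"NCA ({ms})" for ms in deployment_states + processing_states}
```

A single-jurisdiction operator yields a one-element set; multi-state deployment on distributed infrastructure yields one authority per state.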


Art.57(9): GPAI Carve-Out — AI Office as Supervisory Authority

Art.57(9) is the most consequential provision for GPAI model providers: it establishes the GPAI carve-out, removing GPAI models from national NCA supervision and assigning supervisory responsibility to the AI Office at Union level.

The carve-out logic: a GPAI model is placed on the market once, by a single provider, and then integrated into systems across all Member States. Supervising the same model through up to 27 national NCAs would fragment enforcement and invite inconsistent findings, so Art.57(9) centralises supervision of GPAI providers at Union level.

Practical implications for GPAI developers:

If your system is... | Primary supervisor | National NCA role
High-risk AI system (not GPAI) | National NCA | Primary
GPAI model (Art.51 threshold) | AI Office | Advisory/coordination
GPAI model + systemic risk | AI Office | Escalation only
High-risk AI built on top of GPAI | National NCA | Primary (for the downstream system)

The carve-out does not remove all national oversight: downstream high-risk AI systems that integrate a GPAI model remain under national NCA supervision. The NCA supervises the deployer's use of the GPAI model as a component; the AI Office supervises the GPAI model provider's compliance with Art.52-56.


Real-World NCA Designations (2025-2026)

As of the August 2025 deadline, Member States had varying levels of designation completeness:

Germany: The Bundesnetzagentur (Federal Network Agency) was designated as the primary market surveillance authority, reflecting its existing competence in technical product regulation. DPA coordination channels were established with the BfDI (Federal Commissioner for Data Protection). For sector-specific AI (financial services, healthcare), parallel competences remain with BaFin and the Gemeinsamer Bundesausschuss.

France: The Autorité de régulation des communications électroniques et des postes (Arcep) was co-designated alongside CNIL for AI systems involving personal data processing. The French government also maintained a coordination role through the Direction générale des entreprises.

Ireland: The Digital Services Coordinator function was extended to cover AI Act market surveillance, reflecting Ireland's existing role as a significant EU hub for technology sector operators (Meta, Google, and Apple maintain their European headquarters there). The Irish NCA's workload is disproportionately large relative to Ireland's population, given the concentration of US tech companies' EU establishments.

Netherlands: The Autoriteit Persoonsgegevens (AP, Dutch DPA) was extended to cover AI Act supervision for AI systems that process personal data. The Netherlands Consumer and Markets Authority (ACM) handles market surveillance for non-personal-data AI systems.

Sweden: The Swedish Post and Telecom Authority (PTS) was designated as the primary NCA, with coordination arrangements with the Swedish IMY (data protection authority).

For sota.io — incorporated in Germany with Frankfurt-based infrastructure — the Bundesnetzagentur is the primary NCA for AI Act market surveillance purposes. The EU infrastructure advantage here is clarity: there is no ambiguity about which authority has jurisdiction, no multi-NCA coordination complexity, and no risk of conflicting enforcement positions.


The GDPR-AI Act NCA Coordination Problem

One of the most complex practical issues emerging from Art.57(3)'s DPA coordination requirement is the jurisdictional overlap between GDPR and the AI Act for AI systems that process personal data — which, in practice, includes most commercially deployed AI systems.

The current allocation:

Issue | AI Act NCA | GDPR DPA
High-risk AI conformity assessment | Primary | Advisory
Biometric AI system (Art.5 prohibition) | Primary | Advisory
AI system processing health data | Primary | Co-competent
Automated decision-making (Art.22 GDPR) | Advisory | Primary
Training data (personal data in training set) | Advisory | Primary
Inference output (personal data in output) | Advisory | Primary

The coordination requirement of Art.57(3) creates a mandatory joint investigation protocol for cases involving both AI Act and GDPR violations. Practically, this means developers facing AI Act enforcement should expect parallel GDPR inquiries, and vice versa.
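The allocation table above can be encoded to flag when a case engages both tracks. The issue labels and function name here are our own illustrative taxonomy, not an official one:

```python
# Illustrative encoding of the allocation table; labels are hypothetical.
PRIMARY_AUTHORITY = {
    "high_risk_conformity": "ai_act_nca",
    "art5_biometric_prohibition": "ai_act_nca",
    "gdpr_art22_adm": "gdpr_dpa",
    "training_data_personal_data": "gdpr_dpa",
    "inference_output_personal_data": "gdpr_dpa",
}

def parallel_proceedings_expected(issues: list[str]) -> bool:
    """True when a case spans both AI Act and GDPR lead competences,
    engaging the Art.57(3) coordination channel."""
    leads = {PRIMARY_AUTHORITY[i] for i in issues}
    return {"ai_act_nca", "gdpr_dpa"} <= leads
```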

The risk for US-infrastructure operators: a US cloud provider hosting both the AI model weights and user personal data creates a single CLOUD Act target for both AI Act evidence requests and GDPR data protection enforcement requests. EU-based infrastructure separates these vectors — CLOUD Act extraterritorial access does not apply to data stored and processed by EU-incorporated entities without a US parent.


Infrastructure Jurisdiction and NCA Clarity

Art.57 reinforces the infrastructure jurisdiction argument that underlies sota.io's positioning. For AI system developers choosing deployment infrastructure, the NCA designation structure has direct compliance implications:

EU-incorporated PaaS (sota.io): one identifiable NCA (plus DPA coordination where personal data is processed), a single enforcement geography, and no CLOUD Act exposure.

US-incorporated cloud (AWS, Azure, GCP): NCA jurisdiction in each Member State of deployment, overlaid with US CLOUD Act extraterritorial access to data held by the US-incorporated provider — regardless of EU data residency.
For GPAI model developers specifically, the AI Office (Art.61) — not the national NCA — is the primary supervisor. But the documentation, evidence records, and model weights are subject to both AI Office oversight and the jurisdiction of whatever cloud provider holds the storage.


Python Implementation: NCA Jurisdiction Tracker

from dataclasses import dataclass
from enum import Enum
from typing import Optional
import datetime

class SystemCategory(Enum):
    HIGH_RISK = "high_risk"
    GPAI_MODEL = "gpai_model"
    GPAI_SYSTEMIC_RISK = "gpai_systemic_risk"
    LIMITED_RISK = "limited_risk"
    MINIMAL_RISK = "minimal_risk"
    PROHIBITED = "prohibited"

class NCAPrimaryAuthority(Enum):
    NATIONAL_NCA = "national_nca"
    AI_OFFICE = "ai_office"  # Art.57(9) GPAI carve-out
    DUAL_AUTHORITY = "dual_authority"  # High-risk AI built on GPAI

@dataclass
class MemberStateNCA:
    country_code: str
    country_name: str
    market_surveillance_authority: str
    notifying_authority: str
    data_protection_authority: str
    single_point_of_contact: str
    designation_date: Optional[datetime.date] = None
    single_nca: bool = False  # Art.57(2): one authority for both functions
    notes: str = ""

@dataclass
class AISystemJurisdictionProfile:
    system_name: str
    system_category: SystemCategory
    provider_incorporation_country: str
    infrastructure_country: str
    principal_deployment_country: str
    processes_personal_data: bool = True
    uses_gpai_model: bool = False

    def primary_supervisor(self) -> NCAPrimaryAuthority:
        """Art.57(9): GPAI models → AI Office. Others → national NCA."""
        if self.system_category in (
            SystemCategory.GPAI_MODEL,
            SystemCategory.GPAI_SYSTEMIC_RISK,
        ):
            return NCAPrimaryAuthority.AI_OFFICE
        if self.uses_gpai_model and self.system_category == SystemCategory.HIGH_RISK:
            # High-risk downstream system built on GPAI: NCA for the system,
            # AI Office for the GPAI component
            return NCAPrimaryAuthority.DUAL_AUTHORITY
        return NCAPrimaryAuthority.NATIONAL_NCA

    def gdpr_dpa_coordination_required(self) -> bool:
        """Art.57(3): NCA must coordinate with DPA if AI system processes personal data."""
        return self.processes_personal_data

    def cloud_act_risk(self) -> bool:
        """
        CLOUD Act risk exists if infrastructure_country is US or if
        provider is incorporated in US — even with EU data residency.
        """
        us_jurisdictions = {"US", "USA", "United States"}
        return (
            self.infrastructure_country in us_jurisdictions
            or self.provider_incorporation_country in us_jurisdictions
        )

    def enforcement_jurisdiction_complexity(self) -> str:
        supervisor = self.primary_supervisor()
        authorities = []

        if supervisor == NCAPrimaryAuthority.NATIONAL_NCA:
            authorities.append(f"NCA ({self.principal_deployment_country})")
        elif supervisor == NCAPrimaryAuthority.AI_OFFICE:
            authorities.append("AI Office (EU level)")
        else:
            authorities.append(f"NCA ({self.principal_deployment_country}) + AI Office")

        if self.gdpr_dpa_coordination_required():
            authorities.append(f"DPA ({self.principal_deployment_country})")

        if self.cloud_act_risk():
            authorities.append("US DoJ (CLOUD Act extraterritorial)")

        return " | ".join(authorities)

    def report(self) -> str:
        lines = [
            f"System: {self.system_name}",
            f"Category: {self.system_category.value}",
            f"Primary supervisor: {self.primary_supervisor().value}",
            f"DPA coordination required: {self.gdpr_dpa_coordination_required()}",
            f"CLOUD Act risk: {self.cloud_act_risk()}",
            f"Full enforcement jurisdiction: {self.enforcement_jurisdiction_complexity()}",
        ]
        return "\n".join(lines)


# EU NCA registry (Art.57(6) — Commission-maintained public list)
EU_NCA_REGISTRY: dict[str, MemberStateNCA] = {
    "DE": MemberStateNCA(
        country_code="DE",
        country_name="Germany",
        market_surveillance_authority="Bundesnetzagentur",
        notifying_authority="Bundesnetzagentur",
        data_protection_authority="BfDI + Landesbehörden",
        single_point_of_contact="Bundesnetzagentur (SPoC)",
        designation_date=datetime.date(2025, 8, 2),
        single_nca=True,
        notes="Financial AI: BaFin coordination. Healthcare AI: BfArM coordination.",
    ),
    "FR": MemberStateNCA(
        country_code="FR",
        country_name="France",
        market_surveillance_authority="Arcep",
        notifying_authority="Arcep",
        data_protection_authority="CNIL",
        single_point_of_contact="DGE (Direction générale des entreprises)",
        designation_date=datetime.date(2025, 8, 1),
        single_nca=False,
        notes="CNIL co-competent for AI systems processing personal data.",
    ),
    "IE": MemberStateNCA(
        country_code="IE",
        country_name="Ireland",
        market_surveillance_authority="CCPC / Digital Services Coordinator",
        notifying_authority="NSAI (National Standards Authority of Ireland)",
        data_protection_authority="DPC (Data Protection Commission)",
        single_point_of_contact="Department of Enterprise, Trade and Employment",
        designation_date=datetime.date(2025, 7, 31),
        single_nca=False,
        notes="DPC has GDPR lead supervisory role for EU-established US tech companies.",
    ),
}


# Example: sota.io — EU PaaS hosted in Frankfurt, incorporated in Germany
sota_profile = AISystemJurisdictionProfile(
    system_name="sota.io PaaS",
    system_category=SystemCategory.LIMITED_RISK,
    provider_incorporation_country="DE",
    infrastructure_country="DE",
    principal_deployment_country="DE",
    processes_personal_data=True,
    uses_gpai_model=False,
)
print(sota_profile.report())

# Example: US cloud-hosted high-risk AI
us_cloud_ai = AISystemJurisdictionProfile(
    system_name="High-Risk AI on US Cloud",
    system_category=SystemCategory.HIGH_RISK,
    provider_incorporation_country="US",
    infrastructure_country="US",
    principal_deployment_country="DE",
    processes_personal_data=True,
    uses_gpai_model=True,
)
print(us_cloud_ai.report())

Art.57 Compliance Checklist

For AI system operators establishing or reviewing their EU compliance posture, Art.57 creates the following due diligence obligations:

# | Action | Trigger | Status
1 | Identify your NCA | Principal place of establishment in EU | Required
2 | Confirm GPAI carve-out applicability | GPAI model provider → AI Office, not NCA | Required
3 | Map NCA + DPA dual-authority scope | AI system processes personal data | Required
4 | Check NCA designation in Commission public list | Before market entry in new MS | Due diligence
5 | Establish NCA contact channel | Before first market surveillance inquiry | Required
6 | Document SPoC identity for Commission communications | Commission/Art.74 inquiry | Required
7 | Assess infrastructure jurisdiction | EU vs. US cloud → CLOUD Act risk | Risk management
8 | Review cross-border NCA coordination scope | Multi-MS deployment | Required
9 | Align DPA notification channels with NCA | GDPR-AI Act overlap systems | Required
10 | Verify NCA has received designation notification | Art.57(6) deadline: 2 Aug 2025 | Verification
11 | Monitor NCA resource adequacy | Under-resourced NCAs = enforcement risk | Monitoring
12 | Update NCA mapping after MS designation changes | Commission list updates | Ongoing
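For teams tracking this list operationally, the checklist maps naturally onto a small data structure (a sketch with hypothetical names, not a compliance tool):

```python
from dataclasses import dataclass, field

@dataclass
class ChecklistItem:
    action: str
    trigger: str
    done: bool = False

@dataclass
class Art57Checklist:
    items: list[ChecklistItem] = field(default_factory=list)

    def open_items(self) -> list[str]:
        """Actions still outstanding."""
        return [i.action for i in self.items if not i.done]

checklist = Art57Checklist(items=[
    ChecklistItem("Identify your NCA", "EU establishment", done=True),
    ChecklistItem("Assess infrastructure jurisdiction", "EU vs. US cloud"),
])
```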

See Also