2026-04-22 · 15 min read

EU AI Act Art.27 Requirements Relating to Notified Bodies: Independence, Technical Competence, Quality Management, and the Notified Body Designation Framework (2026)

Article 27 of the EU AI Act is the entry point to the notified body framework. It defines what a conformity assessment body must be, how it must be structured, and what competencies, procedures, and guarantees it must demonstrate before any Member State authority can designate it as a notified body empowered to assess the conformity of high-risk AI systems.

Notified bodies occupy a structurally critical position in the EU AI Act compliance architecture. For high-risk AI systems in categories where third-party conformity assessment is mandatory — including AI systems intended to be used as safety components of products already covered by Union harmonisation legislation listed in Annex I — provider self-assessment is insufficient. The provider must engage a designated notified body, which independently examines technical documentation, tests the system, assesses the quality management system, and issues the certificate that enables CE marking and market placement.

The quality of notified body designations directly determines the credibility of the entire high-risk AI system conformity framework. Art.27 is the gatekeeping provision: it establishes the minimum threshold below which no body can be entrusted with this function.

The Position of Art.27 in the Notified Body Framework

The EU AI Act's notified body provisions span several articles, each handling a distinct phase of the notified body lifecycle:

Art.27 is exclusively about eligibility criteria. It answers the question: "What must a body be before it can be designated?" Art.31 answers the separate question: "What must a designated body do when it operates?" Both sets of requirements are permanent: a body that meets Art.27 requirements at the moment of designation must continue to meet them throughout its designation period.

The practical implication is that Art.27 compliance is not a one-time gate — it is an ongoing operating condition. National accreditation bodies (typically the same bodies that accredit ISO 17025 laboratories and ISO 17065 product certification bodies) perform periodic surveillance assessments to verify continued compliance.

The Conformity Assessment Context: When is a Notified Body Required?

Not all high-risk AI systems require third-party conformity assessment. The EU AI Act creates two tracks:

Track 1 — Internal control (Annex VII): Providers may self-assess conformity for most high-risk AI systems listed in Annex III. The provider generates and signs its technical documentation and declaration of conformity without external involvement. No notified body is required.

Track 2 — Third-party assessment (Annex VIII): Providers of high-risk AI systems that are safety components of products regulated under harmonisation legislation listed in Annex I — including medical devices, machinery, automotive systems, civil aviation, and rail systems — must involve a notified body. The conformity assessment for the AI component follows either the procedure based on quality management system assessment (Annex VIII, Module D) or the procedure based on technical documentation assessment (Annex VIII, Module H), depending on the applicable sectoral legislation.

Additionally, providers who voluntarily choose third-party assessment for Annex III systems must use a notified body that meets Art.27 requirements.

Art.27 requirements therefore apply to a body being notified under any of these tracks. The requirements are uniform: there is no reduced standard for bodies covering narrower scopes.
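The two-track logic above can be reduced to a small decision helper. This is a simplified sketch: the parameter names are illustrative, and it deliberately ignores the sector-specific procedure selection described for Track 2.

```python
def requires_notified_body(
    annex_i_safety_component: bool,
    voluntary_third_party_assessment: bool = False,
) -> bool:
    """Return True when third-party conformity assessment applies.

    Simplified model of the two tracks described above: AI safety
    components of Annex I products always require a notified body,
    while other high-risk systems need one only when the provider
    voluntarily opts for third-party assessment.
    """
    return annex_i_safety_component or voluntary_third_party_assessment
```

For example, an AI component of a medical device would return `True` regardless of the provider's preference, while a typical Annex III system under internal control would return `False`.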

Art.27(1): Legal Establishment and National Accreditation

Notified bodies must be established under national law and have legal personality. This is a threshold requirement: informal associations, unincorporated groups, and divisions of government agencies without independent legal personality cannot become notified bodies.

Beyond corporate form, bodies must demonstrate their technical competence, typically through national accreditation. The preferred route under the EU AI Act is accreditation by the national accreditation body designated under Regulation (EC) No 765/2008 — the same bodies that accredit conformity assessment bodies under the New Legislative Framework (NLF) for product legislation.

National accreditation against a relevant harmonised standard (EN ISO/IEC 17065 for product certification bodies, EN ISO/IEC 17020 for inspection bodies) provides a presumption of conformity with the Art.27 requirements. Accreditation is not legally mandatory if the applying body can demonstrate competence through other means, but accreditation substantially simplifies the designation process and provides ongoing external validation.

The accreditation scope must match the scope of the notification. A body accredited for medical device conformity assessment cannot be notified to assess AI systems in the biometric identification category without a scope extension covering AI-specific technical competence.

Art.27(2): Independence from Interested Parties

The independence requirement is the most operationally demanding aspect of Art.27. A notified body must be a genuine third party: it must not be the developer, manufacturer, supplier, provider, owner, user, importer, or distributor of the high-risk AI system it is assessing, nor have any organisational or financial link to any of these parties that could compromise impartiality.

The independence requirement operates at multiple levels:

Organisational independence: The notified body must not be under the control of any party with a financial interest in the outcome of the assessment. Subsidiary relationships, shared board members, and majority shareholding by a party in the AI supply chain are disqualifying. Bodies that are subsidiaries of system integrators, cloud providers, or AI platform companies face structural independence challenges.

Financial independence: The body's revenue structure must not create incentives to reach particular conformity assessment conclusions. A body that earns 80% of its revenue from a single AI provider, or whose fee structure rewards rapid positive assessments, does not meet the financial independence standard.

Personal independence: Management and assessment staff must not hold financial interests, current employment, or advisory roles at companies in the supply chain of systems they assess. Independence requirements typically extend to a look-back period (often two to five years) to prevent revolving-door conflicts.

Procedural independence: The body's procedures must be structured to prevent interested parties from influencing assessment conclusions. Internal review, escalation paths, and sign-off authorities must not route through personnel with supply-chain connections.

Documented independence management processes are essential. Bodies must maintain registers of potential conflicts, require disclosures from staff before assessment assignments, and have formal procedures for recusal when conflicts are identified. These registers and procedures are reviewed by national accreditation bodies as part of surveillance.
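The disclosure-and-recusal process described above can be modelled as a simple check against a conflict register. This is a sketch under stated assumptions: the three-year look-back window is one point inside the two-to-five-year range mentioned above, and the field names are illustrative.

```python
from dataclasses import dataclass


@dataclass
class ConflictDeclaration:
    staff_member: str
    related_party: str        # company in the AI supply chain
    relationship: str         # e.g. "former employer", "shareholding"
    years_since_ended: float  # 0.0 if the relationship is current


def must_recuse(
    declaration: ConflictDeclaration,
    assessed_provider: str,
    look_back_years: float = 3.0,  # assumed value within the 2-5 year range
) -> bool:
    """A staff member with a supply-chain link to the assessed provider
    inside the look-back window cannot be assigned to the assessment."""
    return (
        declaration.related_party == assessed_provider
        and declaration.years_since_ended < look_back_years
    )
```

In practice the register would hold many declarations per staff member, and assignment would be blocked if any of them triggers recusal.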

Art.27(3): Impartiality Obligations

Impartiality is distinct from independence. Independence is structural (absence of organisational or financial links to interested parties). Impartiality is behavioural (consistent, bias-free application of assessment criteria regardless of who is being assessed).

The EU AI Act requires that notified bodies, their senior management, and the staff responsible for carrying out conformity assessment activities demonstrate impartiality. This means applying the same assessment criteria to every client, basing conclusions on evidence rather than commercial relationships, and ensuring that staff remuneration does not depend on the number or outcome of the assessments performed.

Impartiality management includes training programmes for assessment staff, documented decision records that support later review, and mechanisms for clients to challenge assessment conclusions without penalty.

Art.27(4): Technical Competence and Personnel Qualifications

Notified bodies must have the technical competence to perform the conformity assessments within their designated scope. For AI systems, this is a multidimensional requirement covering:

AI domain expertise: Staff must understand the type of AI systems being assessed — their architecture, training methodology, data requirements, and known failure modes. A body notified to assess AI medical diagnostic systems must have staff with expertise in medical imaging, clinical decision support, and AI model validation for healthcare settings, not just general software testing competence.

Regulatory knowledge: Assessment staff must have thorough knowledge of the EU AI Act and its delegated acts, the harmonised standards applicable to the AI system (once published by CENELEC/CEN/ETSI), and the sectoral legislation relevant to the product in which the AI system functions.

Testing and audit methodology: Practical ability to conduct technical documentation review, quality management system audits, and system-level testing. For AI systems, this includes understanding of training/validation/test data set governance, performance metric interpretation, bias detection methods, and robustness testing frameworks.

Multidisciplinary coverage: AI systems often span multiple technical domains. A body may need expertise covering software engineering, statistical methods, domain-specific performance standards, and cybersecurity. These capabilities may be held internally or supplemented through subcontracting, but the body retains responsibility for subcontracted work quality (Art.27(9)).

Personnel qualification requirements must be documented. Bodies should maintain competency frameworks mapping required knowledge and skills for each assessment scope category, records showing how individual staff meet those requirements, and continuing development plans to keep pace with AI system evolution.
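A competency framework of the kind described above can be sketched as a scope-to-skills mapping with a gap check. The scope key and skill names here are illustrative assumptions, not categories from the Act.

```python
# Illustrative mapping from assessment scope to required competencies
REQUIRED_COMPETENCIES: dict[str, set[str]] = {
    "medical-ai": {
        "medical imaging",
        "clinical decision support",
        "ai model validation",
        "eu ai act regulatory knowledge",
    },
}


def competency_gaps(scope: str, team_skills: set[str]) -> set[str]:
    """Return the skills the scope requires that no team member covers."""
    return REQUIRED_COMPETENCIES.get(scope, set()) - team_skills
```

The same structure supports the documented evidence the text calls for: each required skill can be backed by staff records showing who holds it.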

Art.27(5): Financial Stability

Notified bodies must have the financial resources necessary to perform their conformity assessment activities. This covers:

Operational resources: Sufficient budget for staffing, test infrastructure, laboratory facilities, and ongoing training to maintain technical competence at the level the scope demands.

Independence protection: Financial stability that prevents dependence on any single client or small group of clients in a way that would compromise independent operation. A body facing insolvency if it loses a single major contract is structurally vulnerable to commercial pressure on assessment conclusions.

Continuity of function: Ability to complete assessments started if there is staff turnover or operational disruption. Bodies should have succession plans and financial reserves to absorb unexpected costs — including the cost of disputes, appeals, or regulatory investigations.

National accreditation bodies assess financial stability as part of accreditation surveillance. Evidence typically includes audited financial statements, information about revenue diversification, and confirmation of the body's ability to carry professional liability insurance at required coverage levels.
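The revenue-diversification concern above lends itself to a simple concentration check. The 25% single-client threshold is an assumed internal control value for illustration, not a figure from the Act or from accreditation rules.

```python
def revenue_concentration_flags(
    client_revenues: dict[str, float],
    single_client_threshold: float = 0.25,  # assumed illustrative threshold
) -> list[str]:
    """Flag clients whose share of total revenue exceeds the threshold,
    indicating dependence that could compromise independent operation."""
    total = sum(client_revenues.values())
    if total == 0:
        return []
    return [
        client
        for client, revenue in client_revenues.items()
        if revenue / total > single_client_threshold
    ]
```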

Art.27(6): Professional Liability Insurance

Notified bodies must hold professional liability insurance adequate for their activities unless liability is covered by the Member State under national law.

The insurance requirement addresses the risk that a notified body error — issuing a certificate for a non-compliant system, or failing to detect a critical defect — results in harm to third parties. In high-risk AI contexts (medical diagnosis, credit scoring, employment decisions, law enforcement support), system failures can cause significant financial or physical harm.

Insurance coverage must be proportionate to the nature and scale of the body's conformity assessment activities and to the risks associated with the AI systems it certifies, and it must cover claims arising from certificates the body has issued.

Bodies operating across multiple Member States should verify that their insurance covers liability arising from assessments conducted in or for other EU jurisdictions, not just their home Member State.

Art.27(7): Quality Management System

Notified bodies must have a documented and implemented quality management system covering all conformity assessment activities within their scope. The QMS is the operational backbone of the notified body function — it defines how the body actually performs assessments, controls documentation, manages nonconformities, handles complaints, and improves processes.

Key QMS elements for notified bodies assessing AI systems include:

Process documentation: Written procedures for every phase of conformity assessment — application intake, resource allocation, technical documentation review, quality management system audit, testing, decision, certification issuance, surveillance, and certificate withdrawal. Procedures must be specific enough that different assessors applying the same procedure to the same system would reach equivalent conclusions.

Document and record control: Systematic management of assessment records — what was reviewed, what criteria were applied, what conclusions were reached, and the evidentiary basis for each conclusion. Records must be retained for a defined period (typically ten years) to support market surveillance inspections and to enable reconstruction of assessment rationale if a certified system is later found non-compliant.

Competency management: Integration of personnel qualification requirements into QMS processes — tracking who is authorised to perform what types of assessment, managing training records, and controlling assignment of assessment tasks.

Internal audit: Periodic self-assessment of QMS effectiveness, identifying gaps between documented procedures and actual practice, and driving corrective actions.

Management review: Regular review by senior management of QMS performance metrics, complaints, audit findings, and market surveillance feedback, driving continuous improvement.

Nonconformity handling: Formal processes for identifying, documenting, and resolving instances where the body's own processes do not meet requirements, or where assessment conclusions are challenged.

EN ISO/IEC 17065 (product certification) and EN ISO/IEC 17020 (inspection) provide detailed requirements for QMS in conformity assessment bodies. Bodies accredited against these standards already have QMS frameworks compatible with Art.27 requirements.
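The record-retention element of the QMS can be sketched as a date check. The ten-year period mirrors the typical period noted above; the actual period is set by the body's QMS and applicable legislation, and this sketch ignores leap-day edge cases.

```python
from datetime import date


def record_must_be_retained(
    assessment_date: date,
    today: date,
    retention_years: int = 10,  # typical period noted in the text
) -> bool:
    """True while an assessment record is still inside its retention
    period and must not be disposed of."""
    expiry = assessment_date.replace(year=assessment_date.year + retention_years)
    return today < expiry
```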

Art.27(8): Confidentiality Obligations

Notified bodies and their staff must maintain confidentiality regarding all information obtained in the course of conformity assessment activities. This obligation persists after the assessment is complete and after employment ends.

Confidentiality protections cover:

Provider technical information: Design specifications, architecture, training data descriptions, validation results, and other technical documentation submitted as part of conformity assessment is commercially sensitive. Bodies must have information security controls preventing unauthorised access, transmission, or disclosure.

Assessment process information: The body's internal deliberations, draft assessment reports, and preliminary conclusions are confidential. Providers should not receive advance disclosure of negative findings in a way that allows them to influence assessment conclusions before the formal report is issued.

Certification decisions: While the existence of a certificate is typically public (and for high-risk AI systems must be registered in EU databases), the underlying technical rationale and any confidential commercial details within the technical documentation are not.

Limits to confidentiality: Notified bodies must cooperate with national authorities and the Commission on market surveillance and enforcement matters. Confidentiality obligations do not extend to withholding information from competent authorities performing their regulatory functions.

Bodies must document their information security procedures and demonstrate staff training on confidentiality obligations as part of accreditation assessment.

Art.27(9): Subcontracting and Use of External Experts

Notified bodies may subcontract specific conformity assessment tasks or use external experts, subject to restrictions designed to preserve independence and quality accountability.

Permitted subcontracting: Technical testing tasks, specialist domain expertise reviews, or inspection activities may be subcontracted to third parties when the notified body does not have sufficient in-house capability. The body retains full responsibility for the quality and correctness of subcontracted work.

Prohibited subcontracting: Assessment decisions — the final certification conclusion — cannot be subcontracted. The notified body's assessment decision must be made by the body's own authorised personnel, not delegated to a subcontractor.

Independence requirements extend to subcontractors: Subcontractors must meet the same independence and impartiality requirements as the notified body itself. A notified body cannot achieve independence by subcontracting to a party that has the conflicts the body avoids.

Subcontractor management: Bodies must maintain registers of approved subcontractors, verify their qualifications and independence, include subcontractor work in the body's QMS control, and ensure subcontractors are contractually bound to confidentiality and quality obligations equivalent to those applying to the body's own staff.

Client notification: In some procedures, the body must inform the applying provider of the intent to subcontract specific tasks, giving the provider an opportunity to object if the proposed subcontractor has a conflict.
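The subcontracting rules above amount to an assignment gate: the final decision is never delegated, and any other task requires a fully vetted subcontractor. A minimal sketch, with illustrative field names:

```python
from dataclasses import dataclass


@dataclass
class Subcontractor:
    name: str
    independence_verified: bool    # same independence checks as the body
    confidentiality_bound: bool    # contractual NDA in place
    qualifications_verified: bool  # competence evidence on file


def may_assign_task(sub: Subcontractor, task_is_final_decision: bool) -> bool:
    """Final certification decisions cannot be subcontracted; other
    tasks require a subcontractor that passes all vetting checks."""
    if task_is_final_decision:
        return False
    return (
        sub.independence_verified
        and sub.confidentiality_bound
        and sub.qualifications_verified
    )
```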

Art.27(10): Designated Scope and Limitations

Notified bodies are designated for specific product categories and conformity assessment procedures, not as unlimited AI conformity assessors. The scope of designation is defined in the notification submitted by the Member State authority to the Commission and published in the NANDO (New Approach Notified and Designated Organisations) database.

Scope boundary obligation: A body may only perform notified body functions within its designated scope. A body designated for AI systems used in medical devices under the MDR cannot issue conformity assessment certificates for AI systems used in employment screening under Annex III unless it has been separately notified for that scope.

Scope extension procedure: Adding new product categories or assessment procedures to an existing designation requires a formal scope extension, which typically triggers a new accreditation assessment covering the added scope and a new national notification procedure.

Scope restriction in case of partial non-compliance: If a body no longer meets Art.27 requirements in respect of part of its scope — for example, it loses key staff with specific AI domain expertise — the national authority may restrict the body's designation to the scope where it continues to meet requirements, rather than withdrawing the designation entirely. This is addressed in Art.29.

Limitations on marketing: The scope of designation also defines what the body may advertise as notified body services. Bodies sometimes offer conformity assessment services outside the notified body role (e.g., voluntary testing or certification against private standards). These services must be clearly distinguished from notified body activities and must not trade on the authority of the notification to market services it does not cover.
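The scope boundary obligation can be expressed as a lookup against the published designation. The NANDO-style scope entries below are illustrative assumptions, not real database records.

```python
# Illustrative (category, procedure) pairs from a published designation
DESIGNATED_SCOPE: set[tuple[str, str]] = {
    ("medical devices", "quality management system assessment"),
    ("medical devices", "technical documentation assessment"),
}


def within_designated_scope(category: str, procedure: str) -> bool:
    """A body may perform notified body functions only for
    category/procedure pairs covered by its designation."""
    return (category, procedure) in DESIGNATED_SCOPE
```

A body with this scope could assess AI components of medical devices but would need a scope extension before touching, say, employment-screening systems.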

Notified Body Designation Assessment Matrix

The following matrix summarises the key Art.27 requirement areas and the evidence typically required by national authorities during designation assessment:

| Requirement Area | Key Evidence | Assessment Focus |
| --- | --- | --- |
| Legal establishment | Certificate of incorporation, organisational chart | Corporate form, legal personality |
| National accreditation | Accreditation certificate + scope | Standard alignment (EN ISO/IEC 17065 or 17020) |
| Independence | Register of interests, shareholding structure, revenue diversification | Absence of supply-chain links |
| Impartiality | Impartiality policy, conflict management procedures, staff declarations | Equal treatment, bias prevention |
| Technical competence | CV database, competency framework, training records | Coverage of AI + regulatory + domain expertise |
| Financial stability | Audited accounts, insurance certificate, revenue analysis | Ability to operate independently |
| Quality management system | QMS manual, procedures, internal audit records | Process control, record retention |
| Confidentiality | Information security policy, NDAs, staff training records | Information protection controls |
| Complaints handling | Complaints log, appeal procedure, resolution records | Effective challenge mechanism |
| Subcontracting | Approved subcontractor register, subcontract agreements | Independence extension, quality accountability |

Python: Notified Body Eligibility Self-Assessment

The following Python class models the Art.27 requirement assessment process. It can be used by conformity assessment bodies considering a notified body designation to identify gaps before submitting a formal application.

from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class EligibilityStatus(Enum):
    MEETS_REQUIREMENT = "meets_requirement"
    PARTIAL = "partial_compliance"
    GAP_IDENTIFIED = "gap_identified"
    NOT_ASSESSED = "not_assessed"


@dataclass
class RequirementAssessment:
    article_ref: str
    requirement: str
    status: EligibilityStatus
    evidence: str
    gap_description: Optional[str] = None
    remediation_action: Optional[str] = None


@dataclass
class NotifiedBodyEligibilityAssessment:
    body_name: str
    proposed_scope: str
    assessment_date: str
    requirements: List[RequirementAssessment] = field(default_factory=list)

    def add_requirement(self, assessment: RequirementAssessment) -> None:
        self.requirements.append(assessment)

    def gaps(self) -> List[RequirementAssessment]:
        return [
            r for r in self.requirements
            if r.status == EligibilityStatus.GAP_IDENTIFIED
        ]

    def partial_compliance(self) -> List[RequirementAssessment]:
        return [
            r for r in self.requirements
            if r.status == EligibilityStatus.PARTIAL
        ]

    def eligibility_summary(self) -> dict:
        total = len(self.requirements)
        meets = sum(
            1 for r in self.requirements
            if r.status == EligibilityStatus.MEETS_REQUIREMENT
        )
        gaps = len(self.gaps())
        partial = len(self.partial_compliance())

        eligible = gaps == 0
        return {
            "body": self.body_name,
            "scope": self.proposed_scope,
            "eligible_for_designation": eligible,
            "requirements_met": f"{meets}/{total}",
            "gaps": gaps,
            "partial_compliance_items": partial,
            "recommendation": (
                "Eligible: proceed to national notification application"
                if eligible
                else f"Not eligible: {gaps} gap(s) require remediation before application"
            ),
        }

    def remediation_plan(self) -> List[dict]:
        return [
            {
                "article": r.article_ref,
                "requirement": r.requirement,
                "gap": r.gap_description,
                "action": r.remediation_action,
            }
            for r in self.requirements
            if r.status in (EligibilityStatus.GAP_IDENTIFIED, EligibilityStatus.PARTIAL)
            and r.remediation_action
        ]


def assess_notified_body_eligibility(
    body_name: str,
    has_legal_personality: bool,
    accreditation_scope_matches: bool,
    no_supply_chain_links: bool,
    impartiality_procedures_documented: bool,
    has_ai_domain_experts: bool,
    has_regulatory_knowledge: bool,
    financial_stability_demonstrated: bool,
    liability_insurance_in_place: bool,
    qms_documented_and_implemented: bool,
    confidentiality_controls_in_place: bool,
    complaints_procedure_documented: bool,
    subcontracting_controls_in_place: bool,
    proposed_scope: str = "High-risk AI systems (Annex III)",
    assessment_date: str = "2026-04-22",
) -> NotifiedBodyEligibilityAssessment:

    assessment = NotifiedBodyEligibilityAssessment(
        body_name=body_name,
        proposed_scope=proposed_scope,
        assessment_date=assessment_date,
    )

    def status(flag: bool) -> EligibilityStatus:
        return (
            EligibilityStatus.MEETS_REQUIREMENT
            if flag
            else EligibilityStatus.GAP_IDENTIFIED
        )

    assessment.add_requirement(RequirementAssessment(
        article_ref="Art.27(1)",
        requirement="Legal establishment and legal personality",
        status=status(has_legal_personality),
        evidence="Certificate of incorporation reviewed",
        gap_description=None if has_legal_personality else "Body lacks independent legal personality",
        remediation_action=None if has_legal_personality else "Incorporate as independent legal entity before application",
    ))

    assessment.add_requirement(RequirementAssessment(
        article_ref="Art.27(1)",
        requirement="National accreditation scope matches proposed designation scope",
        status=status(accreditation_scope_matches),
        evidence="Accreditation certificate scope reviewed",
        gap_description=None if accreditation_scope_matches else "Accreditation scope does not cover all proposed AI system categories",
        remediation_action=None if accreditation_scope_matches else "Apply for accreditation scope extension covering AI-specific categories",
    ))

    assessment.add_requirement(RequirementAssessment(
        article_ref="Art.27(2)",
        requirement="Independence from supply chain parties",
        status=status(no_supply_chain_links),
        evidence="Independence register and shareholding structure reviewed",
        gap_description=None if no_supply_chain_links else "Identified organisational or financial links to AI system supply chain parties",
        remediation_action=None if no_supply_chain_links else "Divest conflicting interests; establish Chinese walls; apply conflict management procedures",
    ))

    assessment.add_requirement(RequirementAssessment(
        article_ref="Art.27(3)",
        requirement="Impartiality procedures for management and staff",
        status=status(impartiality_procedures_documented),
        evidence="Impartiality policy and conflict declaration process reviewed",
        gap_description=None if impartiality_procedures_documented else "No documented impartiality management procedures",
        remediation_action=None if impartiality_procedures_documented else "Develop impartiality policy, conflict register, and recusal procedure",
    ))

    assessment.add_requirement(RequirementAssessment(
        article_ref="Art.27(4)",
        requirement="Technical AI domain expertise",
        status=status(has_ai_domain_experts),
        evidence="Staff competency matrix and CVs reviewed",
        gap_description=None if has_ai_domain_experts else "Insufficient AI-specific technical expertise for proposed scope",
        remediation_action=None if has_ai_domain_experts else "Recruit or contract AI domain experts; establish training programme",
    ))

    assessment.add_requirement(RequirementAssessment(
        article_ref="Art.27(4)",
        requirement="EU AI Act regulatory knowledge",
        status=status(has_regulatory_knowledge),
        evidence="Training records and regulatory knowledge assessments reviewed",
        gap_description=None if has_regulatory_knowledge else "Staff lack adequate knowledge of EU AI Act requirements",
        remediation_action=None if has_regulatory_knowledge else "Deliver EU AI Act regulatory training programme to all assessment staff",
    ))

    assessment.add_requirement(RequirementAssessment(
        article_ref="Art.27(5)",
        requirement="Financial stability",
        status=status(financial_stability_demonstrated),
        evidence="Audited financial statements and revenue diversification analysis reviewed",
        gap_description=None if financial_stability_demonstrated else "Revenue concentration or insufficient reserves indicate financial vulnerability",
        remediation_action=None if financial_stability_demonstrated else "Diversify revenue base; establish financial reserves; provide three-year financial projections",
    ))

    assessment.add_requirement(RequirementAssessment(
        article_ref="Art.27(6)",
        requirement="Professional liability insurance",
        status=status(liability_insurance_in_place),
        evidence="Insurance certificate reviewed",
        gap_description=None if liability_insurance_in_place else "No professional liability insurance or coverage inadequate for AI system scope",
        remediation_action=None if liability_insurance_in_place else "Obtain professional liability insurance appropriate for AI system assessment activities",
    ))

    assessment.add_requirement(RequirementAssessment(
        article_ref="Art.27(7)",
        requirement="Quality management system documented and implemented",
        status=status(qms_documented_and_implemented),
        evidence="QMS manual and procedure documentation reviewed; internal audit records examined",
        gap_description=None if qms_documented_and_implemented else "QMS not documented or not effectively implemented",
        remediation_action=None if qms_documented_and_implemented else "Develop QMS aligned to EN ISO/IEC 17065; conduct pre-accreditation internal audit",
    ))

    assessment.add_requirement(RequirementAssessment(
        article_ref="Art.27(8)",
        requirement="Confidentiality controls for assessment information",
        status=status(confidentiality_controls_in_place),
        evidence="Information security policy and staff training records reviewed",
        gap_description=None if confidentiality_controls_in_place else "Insufficient controls for protecting confidential provider technical information",
        remediation_action=None if confidentiality_controls_in_place else "Implement information security management aligned to ISO/IEC 27001; deploy NDAs and access controls",
    ))

    assessment.add_requirement(RequirementAssessment(
        article_ref="Art.27",
        requirement="Complaints and appeals procedure",
        status=status(complaints_procedure_documented),
        evidence="Complaints log and appeal procedure documentation reviewed",
        gap_description=None if complaints_procedure_documented else "No documented procedure for handling assessment complaints and appeals",
        remediation_action=None if complaints_procedure_documented else "Develop complaints and appeals procedure; establish tracking log",
    ))

    assessment.add_requirement(RequirementAssessment(
        article_ref="Art.27(9)",
        requirement="Subcontracting controls",
        status=status(subcontracting_controls_in_place),
        evidence="Approved subcontractor register and subcontract agreements reviewed",
        gap_description=None if subcontracting_controls_in_place else "No controls for managing subcontractor independence and quality",
        remediation_action=None if subcontracting_controls_in_place else "Develop subcontractor approval procedure; extend independence checks to all subcontractors",
    ))

    return assessment


# Example usage — assessment for a body seeking designation in AI medical devices
assessment = assess_notified_body_eligibility(
    body_name="ExampleCAB GmbH",
    has_legal_personality=True,
    accreditation_scope_matches=False,  # Gap: accreditation needs scope extension
    no_supply_chain_links=True,
    impartiality_procedures_documented=True,
    has_ai_domain_experts=False,  # Gap: need AI/ML domain experts for medical AI
    has_regulatory_knowledge=False,  # Gap: EU AI Act training not yet completed
    financial_stability_demonstrated=True,
    liability_insurance_in_place=True,
    qms_documented_and_implemented=True,
    confidentiality_controls_in_place=True,
    complaints_procedure_documented=True,
    subcontracting_controls_in_place=True,
    proposed_scope="High-risk AI systems — AI components in medical devices (MDR Annex I + EU AI Act Annex I section A)",
)

summary = assessment.eligibility_summary()
# summary["eligible_for_designation"] => False (3 gaps)
# summary["recommendation"] => "Not eligible: 3 gap(s) require remediation before application"

plan = assessment.remediation_plan()
# Returns list of {article, requirement, gap, action} for each gap

Art.27 × Art.28 Integration: From Requirements to Notification

Meeting Art.27 requirements is necessary but not sufficient for becoming a notified body. The formal designation requires a national notification procedure under Art.28:

  1. Application to national authority: The conformity assessment body applies to the designated notifying authority in its Member State, demonstrating Art.27 compliance through evidence including accreditation certificate, QMS documentation, personnel records, and financial statements.

  2. National assessment: The notifying authority assesses the application against Art.27 criteria. Where the body already holds accreditation from its national accreditation body, the assessment is simplified: accreditation provides a presumption of conformity with Art.27.

  3. Notification to Commission: Once the national authority is satisfied, it notifies the Commission via the NANDO database, providing the body's name, registration number, scope, assessment procedures, and contact details.

  4. Objection period: After notification, a standstill period follows during which other Member States and the Commission may raise objections. If no substantiated objection is raised, the body may begin operating as a notified body within its designated scope.

  5. Ongoing surveillance: The national authority continues to oversee the designated body, investigating complaints and ensuring continued Art.27 compliance. Art.29 governs changes, restrictions, and withdrawal.

The Art.27 requirements do not change after designation — the body must continue to meet them. National authorities may conduct periodic reviews and must investigate credible complaints about a notified body's independence, competence, or quality of assessments.
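The five-step procedure above can be sketched as a minimal state machine. This is an illustrative model, not terminology from the Act: the stage names, the `advance` function, and its parameters are assumptions introduced here for clarity.

```python
from enum import Enum, auto


class Stage(Enum):
    """Illustrative stages of the national notification procedure."""
    APPLICATION = auto()          # step 1: apply to the notifying authority
    NATIONAL_ASSESSMENT = auto()  # step 2: authority assesses Art.27 compliance
    NOTIFIED = auto()             # step 3: entry in the NANDO database
    OBJECTION_PERIOD = auto()     # step 4: Member States / Commission may object
    DESIGNATED = auto()           # step 5: operating under ongoing surveillance
    REFUSED = auto()


def advance(stage: Stage, *, art27_compliant: bool = True,
            objection_raised: bool = False) -> Stage:
    """Move one step through the procedure, refusing on failure."""
    if stage is Stage.APPLICATION:
        return Stage.NATIONAL_ASSESSMENT
    if stage is Stage.NATIONAL_ASSESSMENT:
        return Stage.NOTIFIED if art27_compliant else Stage.REFUSED
    if stage is Stage.NOTIFIED:
        return Stage.OBJECTION_PERIOD
    if stage is Stage.OBJECTION_PERIOD:
        return Stage.REFUSED if objection_raised else Stage.DESIGNATED
    return stage  # DESIGNATED and REFUSED are terminal


# Walk a compliant body with no objections through the full procedure
stage = Stage.APPLICATION
while stage not in (Stage.DESIGNATED, Stage.REFUSED):
    stage = advance(stage)
print(stage)  # Stage.DESIGNATED
```

The point of the model is that designation is the only path to operation, and that both a failed national assessment and a substantiated objection are terminal outcomes requiring a fresh application.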

Art.27 Compliance Checklist

Independence and Impartiality

  - Legal personality established under the national law of a Member State
  - No structural, financial, or personnel links to providers, their suppliers, or competitors within the proposed scope
  - Documented impartiality procedures covering management and assessment personnel

Technical Competence

  - Personnel with AI/ML domain expertise matching the proposed designation scope
  - Demonstrated knowledge of the EU AI Act and the applicable harmonised standards
  - Accreditation whose scope covers the conformity assessment activities applied for

Financial and Insurance

  - Financial stability demonstrated through audited financial statements
  - Liability insurance covering the conformity assessment activities performed

Quality Management System

  - QMS documented and implemented, covering assessment procedures, assessor qualification, and record keeping

Confidentiality and Information Security

  - Controls protecting confidential provider technical information (information security management, NDAs, access controls)

Complaints and Subcontracting

  - Documented complaints and appeals procedure with a tracking log
  - Subcontractor approval procedure extending independence and quality checks to all subcontractors

Summary

Article 27 establishes the threshold requirements for conformity assessment bodies seeking designation as notified bodies for high-risk AI systems. Its requirements span corporate structure, independence, impartiality, technical competence, financial stability, insurance, quality management, confidentiality, complaints handling, and subcontracting controls. Meeting these requirements is an ongoing obligation: a body that meets them at the time of designation must continue to meet them throughout the designation period.

The most demanding Art.27 requirements in the AI context are the technical competence requirements (which require genuine AI domain expertise beyond general product certification skills), the independence requirements (which present structural challenges for bodies that are subsidiaries or affiliates of companies in the AI supply chain), and the QMS requirements (which must be specific enough to produce consistent conformity assessment conclusions across different assessors and over time).

Art.27 connects forward to Art.28 (the national notification procedure), Art.29 (changes and withdrawal of designations), Art.30 (challenge of competence), and Art.31 (operational obligations of designated bodies). Conformity assessment bodies beginning the notified body designation process should use Art.27 as the gap analysis framework before engaging with national accreditation and notification authorities, and should expect the designation process — from initial gap analysis to completed notification — to take 12-24 months depending on existing accreditation scope and national authority workload.

See Also