2026-05-05 · 11 min read

EDPB DPIA Template 2026: A SaaS Developer's Field Guide to GDPR Art.35 and AI Act Art.26(9)

The European Data Protection Board adopted its first standardised Data Protection Impact Assessment template (DPIA Template v1.0) on 10 March 2026 and published it on 14 April 2026. A public consultation runs until 9 June 2026.

For SaaS developers and product teams, this is important for two reasons. First, the template is likely to become the de-facto standard format that Data Protection Authorities expect — especially in cross-border cases handled by the EDPB's One-Stop-Shop mechanism. Second, the template explicitly includes a section addressing AI Act Article 26(9), meaning that any SaaS product using AI in a way that constitutes "high-risk AI system deployment" now needs a DPIA that covers both GDPR Article 35 and the AI Act simultaneously.

This guide walks through the template field by field, explains what each section actually requires, and flags where infrastructure jurisdiction changes your answers.


What Is a DPIA and When Do You Need One?

A Data Protection Impact Assessment is a structured risk analysis required under GDPR Article 35 before processing that is "likely to result in a high risk to the rights and freedoms of natural persons." It is not a one-time paperwork exercise — it must be maintained and updated when circumstances change.

You need a DPIA if your processing involves at least two of the nine high-risk criteria from the Article 29 Working Party's DPIA Guidelines (WP248 rev.01, endorsed by the EDPB):

  1. Evaluation or scoring (including creditworthiness, health, performance, profiling)
  2. Automated decision-making with legal or similarly significant effects (Art.22)
  3. Systematic monitoring of publicly accessible areas
  4. Sensitive data or data of a highly personal nature (Art.9, Art.10 categories)
  5. Data processed at scale
  6. Matching or combining datasets from multiple sources
  7. Data concerning vulnerable subjects (employees, children, patients)
  8. Innovative technology or novel application of existing technology
  9. Data transfers outside the EU/EEA with adequacy or appropriate safeguards questions

For most SaaS products with personalisation, analytics, or AI features, criteria 1, 5, 6, and 8 are almost always met. If your product uses AI features that qualify as high-risk under the EU AI Act Annex III, you add a mandatory AI Act Article 26(9) component — and the new EDPB template handles both in a single document.


The EDPB Template Structure (v1.0)

The template has six main sections. Here is what each requires in practice.

Section 1: Description of the Processing

What the template asks: Name of the processing activity, controller and processor identification, categories of data subjects, categories of personal data, retention periods, recipients, international transfers.

What developers often miss: This section requires you to identify all data processors, not just your primary cloud provider. If your SaaS stack includes third-party services (Stripe, Sendgrid, Datadog, Intercom, OpenAI API), each is a processor that must be listed with the legal basis for sub-processing.
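A minimal sketch of such a processor register, kept as code so the international-transfers field can be generated rather than hand-maintained. The names, fields, and register entries below are illustrative assumptions, not an official EDPB schema.

```python
# Hypothetical processor register for DPIA Section 1.
from dataclasses import dataclass

@dataclass
class Processor:
    name: str
    role: str            # "processor" or "sub-processor"
    legal_basis: str     # e.g. "Art.28 DPA + SCC Module 2"
    eea_only: bool       # True if all processing stays within the EEA

REGISTER = [
    Processor("Stripe", "sub-processor", "Art.28 DPA + SCC Module 2", False),
    Processor("OpenAI API", "sub-processor", "Art.28 DPA + SCC Module 2", False),
]

def transfers_outside_eea(register):
    """Processors that must appear in the international-transfers field."""
    return [p.name for p in register if not p.eea_only]

print(transfers_outside_eea(REGISTER))
```

Keeping this next to your infrastructure config makes it harder for a new sub-processor to slip in without a DPIA update.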

Infrastructure jurisdiction impact: If you process EU personal data on infrastructure operated by a US-headquartered company (AWS, Azure, GCP, Cloudflare Workers, Vercel, Render, Railway with a US parent), you must document this in the international transfers field. Under GDPR Article 44, transfers to the US require either an adequacy decision, Standard Contractual Clauses (SCCs), or other appropriate safeguards. The CLOUD Act creates an additional complication: US cloud providers are subject to government access orders that do not require notification to you or your data subjects — a risk that the template's Section 5 (risks to rights and freedoms) requires you to assess.

Section 2: Necessity and Proportionality Assessment

What the template asks: Legal basis for processing (Art.6 for general data, Art.9 for sensitive categories), purpose limitation assessment, data minimisation measures, accuracy measures, retention justification, rights fulfilment mechanisms.

What developers often miss: "Legitimate interests" (Art.6(1)(f)) as legal basis requires a three-part balancing test documented here: the interest pursued, the necessity of the processing, and whether data subjects' interests override yours. For SaaS analytics and user behaviour tracking, this balancing test must be in the DPIA — not in a separate LIA document that you "attach."
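One way to keep the three-part test inside the DPIA rather than in a forgotten attachment is to record it as a checkable structure. This is a hypothetical sketch; the field names and example text are illustrative, not a prescribed format.

```python
# Hypothetical Art.6(1)(f) balancing-test record: a missing part fails a
# pre-commit check instead of a DPA review.
LIA = {
    "interest": "Product analytics to improve onboarding",
    "necessity": "Aggregated event data; no less-intrusive alternative identified",
    "balancing": "Low intrusiveness; opt-out offered; no special-category data",
}

def lia_complete(lia: dict) -> bool:
    """All three parts of the balancing test must be documented."""
    required = ("interest", "necessity", "balancing")
    return all(lia.get(k, "").strip() for k in required)

assert lia_complete(LIA)
```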

AI features: If your product uses AI to personalise content, rank results, or flag user behaviour, the purpose limitation assessment must address whether training data derived from user interactions constitutes a compatible secondary purpose. The EDPB has repeatedly held that training proprietary models on user data without explicit consent is difficult to justify under Art.6(1)(f) alone.

Section 3: Risk Identification

What the template asks: Identification of risks to data subjects, likelihood and severity assessment for each risk, existing measures that mitigate each risk, residual risk after mitigation.

Risk taxonomy in the template: The v1.0 template uses a structured risk taxonomy with three categories, the third of which is new compared to previous EDPB DPIA guidance and reflects the AI Act's risk model: inference risk. Developers building AI features need to assess, as a separate risk category, what can be inferred about data subjects beyond what they explicitly provided.

Section 4: AI Act Article 26(9) Section

What the template asks (new in v1.0): Whether the processing involves a high-risk AI system as defined in EU AI Act Annex III, deployer identification, fundamental rights impact assessment (FRIA) completion status, human oversight measures, record-keeping obligations under Art.12, and accuracy, robustness and cybersecurity measures under Art.15.

Who this applies to: AI Act Article 26(9) requires deployers of high-risk AI systems to use the information supplied by the provider when carrying out their GDPR Article 35 DPIA, and Article 27 requires certain deployers to carry out a Fundamental Rights Impact Assessment (FRIA) before putting the system into use. For SaaS products using third-party AI APIs (OpenAI, Anthropic, Google Gemini, Mistral) to power features that fall under Annex III high-risk categories (biometric categorisation, employment decisions, credit scoring, educational assessment, law enforcement, migration-related processing), you are the deployer and this section applies to you.

Annex III high-risk categories most relevant to SaaS:

  1. Biometric categorisation
  2. Employment and worker-management decisions
  3. Creditworthiness evaluation and credit scoring
  4. Educational and vocational assessment
  5. Law enforcement use cases
  6. Migration, asylum and border-control processing

If none of your AI features fall into these categories, the template allows you to check "not applicable" and document the reasoning.
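A quick sketch of that classification step: map each product feature to an Annex III category (or to none) and derive whether Section 4 applies. The category labels paraphrase Annex III and the feature names are invented for illustration; verify any real mapping against the Act's text.

```python
# Hypothetical Annex III screening for DPIA Section 4.
ANNEX_III = {
    "biometric_categorisation", "employment_decisions", "credit_scoring",
    "educational_assessment", "law_enforcement", "migration",
}

FEATURES = {
    "resume_ranking": "employment_decisions",  # high-risk: Section 4 applies
    "content_recommendations": None,           # outside Annex III
}

def section4_applies(features: dict) -> bool:
    """True if any feature maps to an Annex III high-risk category."""
    return any(cat in ANNEX_III for cat in features.values() if cat)

print("Section 4 applies" if section4_applies(FEATURES) else "not applicable")
```

If the function returns False, the template's "not applicable" box can be checked, but the mapping itself is the documented reasoning to keep.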

The GDPR + AI Act intersection: EDPB DPIA Template v1.0 explicitly connects the DPIA to the FRIA. If your GDPR DPIA identifies high privacy risks related to automated decision-making (Art.22 GDPR), and those decisions are made by a high-risk AI system, you cannot treat the FRIA as a separate document — the template requires them to be jointly documented or cross-referenced.

Section 5: Measures to Address Risks

What the template asks: Technical measures, organisational measures, contractual measures with processors, post-implementation verification timeline.

What "measures" actually requires: The template follows GDPR Recital 90 — measures must be "sufficient to demonstrate that the processing complies with this Regulation." Generic statements ("we encrypt data at rest") are insufficient; the template expects named controls tied to verification dates, with each identified risk matched to a specific technical or organisational measure.

Where infrastructure jurisdiction changes Section 5: If your data is processed on EU-native infrastructure (no US parent, no CLOUD Act exposure), the transfer impact assessment burden is significantly lower. You can document that no SCCs are required because all processing occurs within the EEA under a processor established under EU/EEA law. On US-parent platforms, the transfer impact assessment must address CLOUD Act exposure, FISA Section 702, and the degree to which the platform's Governmental Access Policy adequately protects data subjects.
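The jurisdiction fork described above can be sketched as a small decision function. This is an illustrative simplification under two assumed inputs (EEA establishment and US parentage), not legal advice.

```python
# Hypothetical Section 5 transfer-documentation decision.
def section5_transfer_steps(eea_established: bool, us_parent: bool) -> list:
    """Return the transfer documentation Section 5 would need."""
    if eea_established and not us_parent:
        # EU-native: no third-country transfer for hosting.
        return ["Confirm all processing occurs within the EEA"]
    return [
        "Sign SCCs (Module 2: controller-to-processor)",
        "Transfer impact assessment covering CLOUD Act and FISA Section 702",
        "Review provider's governmental-access policy and supplementary measures",
    ]

print(section5_transfer_steps(eea_established=True, us_parent=False))
```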

Section 6: Consultation and Sign-Off

What the template asks: DPO consultation date (mandatory if Art.37 DPO required), controller sign-off, planned review date, outcome (proceed / proceed with additional measures / do not proceed).

The consultation requirements: GDPR Article 35(2) requires you to seek the advice of your DPO, where one is designated, when carrying out the DPIA — hence the consultation-date field. Separately, Article 36 requires you to consult your DPA before processing if the DPIA indicates that processing would result in a high residual risk that no measures can adequately mitigate. The template includes formal fields for both. In practice, most DPIAs can be completed with "proceed with additional measures" — a prior DPA consultation is required only in the most severe cases.


Completing the DPIA Template: Practical Checklist for SaaS Teams

Before you open the template, gather:

  1. A complete inventory of processors and sub-processors, with signed DPAs and SCC modules
  2. Data categories, retention periods, and the criteria used to set them
  3. The legal basis for each processing purpose, including any legitimate-interests balancing tests
  4. An inventory of AI features mapped against Annex III high-risk categories

Common errors that trigger DPA requests for revision:

  1. Retention periods listed as "as long as necessary" — specify actual durations or the criteria used to determine them
  2. Processors listed without SCC documentation — name the SCC module (Module 2: controller-to-processor) and the date it was signed
  3. Legitimate interests without a balancing test — document the three-part test explicitly
  4. AI risks described as "the provider is responsible" — deployer obligations under Art.26(9) do not transfer to the provider; you must document your own human oversight measures
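Error #1 is mechanical enough to lint. A hypothetical sketch: reject retention entries that use vague boilerplate and accept those stating a concrete duration or criterion. The regex patterns are illustrative, not exhaustive.

```python
# Hypothetical retention-period lint for DPIA Section 1.
import re

VAGUE = re.compile(r"as long as necessary|until no longer needed", re.I)
CONCRETE = re.compile(r"\d+\s*(day|month|year)s?|until account deletion", re.I)

def retention_ok(entry: str) -> bool:
    """True if the entry states a concrete duration or criterion."""
    return bool(CONCRETE.search(entry)) and not VAGUE.search(entry)

assert not retention_ok("as long as necessary")
assert retention_ok("24 months after last login")
```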

How Infrastructure Jurisdiction Affects Your DPIA

The DPIA template's Section 3 and Section 5 both require transfer risk assessment when personal data leaves the EEA. The practical difference between EU-native infrastructure and US-parent-hosted infrastructure is significant.

On EU-native PaaS (e.g., sota.io, Scalingo, Clever Cloud):

  1. No third-country transfer for hosting, so no SCCs or transfer impact assessment are required
  2. Section 5 documents that all processing occurs within the EEA under a processor established under EU/EEA law

On US-parent PaaS (e.g., Railway, Render, Vercel, Fly.io):

  1. SCCs (Module 2: controller-to-processor) must be signed and documented
  2. The transfer impact assessment must address CLOUD Act exposure and FISA Section 702
  3. You must assess whether the platform's governmental-access policy adequately protects data subjects

For SaaS products in regulated industries (healthcare, finance, legal services, HR tech), DPAs increasingly scrutinise the transfer impact assessment section. A complete TIA with CLOUD Act analysis on US infrastructure typically requires legal review. On EU-native infrastructure, this section collapses to a brief confirmation.


The Consultation Window: Why Now Matters

The EDPB's public consultation on DPIA Template v1.0 runs until 9 June 2026. Submitting feedback gives organisations the opportunity to:

  1. Request clarification on the AI Act Art.26(9) FRIA integration
  2. Raise concerns about the template's applicability to specific processing contexts (research, HR, financial services)
  3. Request clearer guidance on when DPA consultation under Art.36 is required

After the consultation closes, the EDPB is expected to publish a final version that may include changes. DPIAs completed using v1.0 will likely remain valid unless the final version introduces material changes to required fields.

Practical recommendation: Begin DPIA completion using v1.0 now. Document that you used v1.0 and date your completion. If the final version changes a field materially, update that section. Do not delay DPIA completion waiting for the final version — regulatory exposure begins with the processing activity, not with the document.


Summary

| Section | Key requirement | Common error |
|---|---|---|
| 1. Description | All processors listed with SCC basis | Missing sub-processors |
| 2. Necessity | LI balancing test documented | Generic "legitimate interests" |
| 3. Risks | AI inference risks as separate category | Treating all risks as technical |
| 4. AI Act Art.26(9) | FRIA cross-referenced if Annex III | Assuming provider handles it |
| 5. Measures | Specific controls with verification dates | Generic encryption statements |
| 6. Sign-off | DPO consultation date if required | Missing review date |

The EDPB DPIA Template v1.0 is the closest thing to a standardised GDPR DPIA format that the EU now has. For SaaS developers building products that process personal data at scale or incorporate AI features, completing it accurately is both a compliance requirement and a competitive signal — customers in regulated industries increasingly require a complete DPIA as part of vendor due diligence.

EU-Native Hosting

Ready to move to EU-sovereign infrastructure?

sota.io is a German-hosted PaaS — no CLOUD Act exposure, no US jurisdiction, full GDPR compliance by design. Deploy your first app in minutes.