GDPR Enforcement 2024–2026: What the Billion-Euro Fines Tell Every SaaS Developer
Post #867 in the sota.io EU Cyber Compliance Series
The era of symbolic GDPR fines is over. Between 2023 and 2026, European Data Protection Authorities issued fines totalling well over €3 billion, with individual decisions reaching nine figures. More important than the amounts is the pattern: each major enforcement action traces back to a specific technical architecture decision — where data is stored, which lawful basis is invoked, how consent is collected, whether processors are properly contracted, and whether breach notifications go out on time.
For SaaS developers, this enforcement record is a detailed map of exactly what to fix. This guide works through the five highest-impact cases, extracts the technical failure in each, and translates it into concrete implementation requirements.
Why Enforcement Accelerated After 2022
GDPR came into force in May 2018. For the first three years, fines were modest and enforcement was fragmented. That changed for several structural reasons:
The Irish DPC pipeline cleared. Most large US tech companies route EU operations through Ireland, placing them under the Irish DPC. Between 2018 and 2022, the EDPB repeatedly criticized the Irish DPC for slow processing. Under sustained pressure, the DPC cleared a backlog of major cases simultaneously in 2023–2024, producing a cascade of nine-figure decisions.
The Schrems II fallout hit. The Court of Justice's July 2020 Schrems II ruling invalidated the EU-US Privacy Shield and imposed a transfer impact assessment (TIA) requirement on Standard Contractual Clauses. Companies that moved data to the US without completing valid TIAs found themselves in direct violation — and DPAs had four years of accumulating exposure to work with.
EDPB coordinated enforcement began. Under the Article 60 cooperation mechanism, concerned DPAs can object to a lead authority's draft decision, and under Article 65 the EDPB can resolve the dispute with a binding decision that overrides a lenient national outcome. This mechanism started being used aggressively after 2022, meaning a lead DPA that wanted to issue a small fine could be overruled by the full Board.
AI training data created new exposure. As LLM providers disclosed that they were training on European user data, DPAs opened investigations under Art.6 (lawful basis), Art.13/14 (transparency), Art.22 (automated decision-making), and Art.25 (data protection by design). For companies building AI-enabled SaaS, this opened an entirely new enforcement front.
Case 1: Meta — €1.2 Billion (Irish DPC, May 2023)
The fine: €1.2 billion — the largest GDPR fine ever issued. Irish DPC, affirmed by EDPB.
The violation: Meta transferred European Facebook user data to US servers under Standard Contractual Clauses without completing an adequate Transfer Impact Assessment. The CJEU's Schrems II ruling required companies using SCCs to verify that US surveillance law (notably FISA Section 702 and Executive Order 12333, the instruments analyzed in Schrems II) did not undermine the protections the SCCs promise. Meta did not do this credibly. The EDPB ruled the SCCs were being used to launder a transfer that should not happen at all, given that US intelligence law provides no equivalent protection to the GDPR Article 46 safeguards.
The technical failure: Data was processed on US infrastructure (Meta's US servers) for EU users without a valid transfer mechanism post-Schrems II. The SCCs existed on paper but were not backed by a TIA showing that US surveillance law would not override them.
What this means for SaaS developers:
Every US cloud provider is exposed to the same structural problem Meta had. When you use AWS US East, Azure US regions, or GCP US regions to process EU personal data, you inherit this transfer risk. The providers publish SCCs — but the Schrems II requirement is that YOU, as the data controller, must conduct a TIA confirming those SCCs are adequate given US law. The fact that your provider offers SCCs does not discharge your obligation to assess whether those SCCs are actually enforceable in the US surveillance context.
Fix: Process EU personal data in EU regions of cloud providers, and document your TIA for any residual transfers. Or remove the transfer risk entirely by using EU-incorporated providers not subject to US cloud law jurisdiction. EU-native infrastructure eliminates the TIA obligation because no Chapter V transfer is occurring.
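A minimal sketch of enforcing this at the application level: a startup-time guard that refuses to write EU personal data outside an allowlisted EU region. The region identifiers and the `StorageTarget` type are illustrative assumptions, not any provider's actual API.

```python
from dataclasses import dataclass

# Regions treated as EU for residency purposes (hypothetical allowlist;
# adjust to the regions your provider actually documents as EU-located).
EU_REGIONS = {"eu-central-1", "eu-west-1", "europe-west3"}

@dataclass
class StorageTarget:
    provider: str
    region: str

def assert_eu_residency(target: StorageTarget) -> None:
    """Fail fast before any EU personal data is written outside an EU region.

    Keeping data in EU regions means no Chapter V transfer occurs, so no
    Transfer Impact Assessment is needed for that flow.
    """
    if target.region not in EU_REGIONS:
        raise ValueError(
            f"Refusing to store EU personal data in non-EU region "
            f"{target.region!r} ({target.provider}) without a documented "
            f"transfer mechanism and TIA"
        )

# Usage: call at startup / deploy time, not per request.
assert_eu_residency(StorageTarget("aws", "eu-central-1"))  # passes
```

Running the check at deploy time, rather than at request time, means a misconfigured region fails the rollout instead of silently exporting data.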
Case 2: TikTok — €345 Million (Irish DPC, September 2023)
The fine: €345 million. Irish DPC.
The violations:
- Article 8 (children's consent): Minors were given public-by-default account settings with no age verification. This violated the requirement that processing children's data requires a higher standard of consent and that privacy-protective defaults must apply.
- Article 25 (data protection by design and by default): The platform's default settings maximized data exposure rather than minimizing it. "Privacy by default" means the most privacy-protective option must be the default — not the least.
- Article 13 (transparency): The information provided to users about how their data was used was insufficiently clear.
The technical failure: Default settings were configured for engagement maximization, not data protection minimization. The application did not verify user age before applying adult-default settings. The privacy notice was structured to obscure rather than disclose.
What this means for SaaS developers:
Article 25 has direct technical implications. When you build your application, every configuration that affects personal data must default to the most restrictive option:
| Setting | Non-Compliant Default | Compliant Default |
|---|---|---|
| User profile visibility | Public | Private |
| Marketing email opt-in | Pre-checked box | Unchecked, requires active selection |
| Analytics tracking | Enabled | Disabled until consent |
| Data retention | Indefinite | Minimum period documented |
| API access to user data | Open to third-party apps | Restricted until explicitly granted |
| Error logging | Log full request including PII | Log sanitized request |
Fix: Audit every default setting in your application. Where a setting determines what happens to personal data, the default must be the option that processes the least data. This is not a "best practice" — it is a binding obligation under Article 25(2).
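One way to make this audit mechanical is to encode the defaults in a single place and assert the privacy-protective option on every field. A sketch under assumed, illustrative field names:

```python
from dataclasses import dataclass

@dataclass
class AccountDefaults:
    """Privacy-by-default configuration (Art. 25(2) sketch).

    Every field that touches personal data defaults to the most
    restrictive option; users must actively opt in to anything broader.
    """
    profile_visibility: str = "private"      # not "public"
    marketing_opt_in: bool = False           # unchecked, requires active selection
    analytics_enabled: bool = False          # disabled until consent
    retention_days: int = 90                 # documented minimum, not indefinite
    third_party_api_access: bool = False     # restricted until explicitly granted

def audit_defaults(defaults: AccountDefaults) -> list:
    """Return the settings whose defaults are not privacy-protective."""
    violations = []
    if defaults.profile_visibility != "private":
        violations.append("profile_visibility")
    if defaults.marketing_opt_in:
        violations.append("marketing_opt_in")
    if defaults.analytics_enabled:
        violations.append("analytics_enabled")
    if defaults.third_party_api_access:
        violations.append("third_party_api_access")
    return violations
```

Wiring `audit_defaults` into CI turns an Art.25 regression (someone flipping a default for a growth experiment) into a failing build rather than an enforcement finding.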
Case 3: LinkedIn — €310 Million (Irish DPC, October 2024)
The fine: €310 million. Irish DPC.
The violation: LinkedIn processed the behavioral data of European users for targeted advertising under legitimate interests (Art.6(1)(f)) and consent (Art.6(1)(a)) — but both claims were found invalid. The DPC, upheld by the EDPB, ruled that:
- Legitimate interests was not a valid basis because the scale of processing, the lack of transparency, and the absence of a meaningful opt-out mechanism meant users' fundamental rights clearly overrode LinkedIn's commercial interest.
- The consent mechanism did not meet the Art.7 standard — it was not "freely given" because consent was bundled with service access, pre-ticked, and withdrawal was made difficult.
The technical failure: LinkedIn's advertising consent system was designed to obtain nominal compliance with the form of consent while evading the substance. The technical implementation of the consent UI (pre-ticked boxes, hard-to-find opt-out, consent withdrawal buried in settings) was itself the violation.
What this means for SaaS developers:
Legitimate interests is the most frequently misapplied GDPR lawful basis by SaaS companies. It requires a three-part balancing test:
- Purpose test: Is there a legitimate interest? (Commercial interest can qualify)
- Necessity test: Is the processing necessary for that purpose? (Is there a less intrusive way?)
- Balancing test: Do the individual's interests, rights, and freedoms override the legitimate interest?
The LinkedIn case establishes that large-scale behavioral profiling for advertising fails the balancing test even if the other two steps pass. The individual's reasonable expectation of privacy when using a professional network overrides the commercial interest in monetizing behavioral data at scale.
For SaaS developers, the practical consequence is:
- Use legitimate interests for: server security logging, fraud detection, internal analytics that users would reasonably expect, direct marketing to existing customers about similar products (with opt-out).
- Do not use legitimate interests for: building behavioral profiles, third-party data sharing, targeted advertising, training ML models on user behavior, data enrichment from external sources.
Fix: Audit every lawful basis claim in your Privacy Notice. For each processing activity, document the three-part test. If you cannot pass all three steps for legitimate interests, switch to a valid basis — most often, explicit consent with a proper opt-in mechanism.
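The three-part test can be documented as structured data rather than prose, so every activity's assessment is auditable in one place. A hedged sketch, with hypothetical field names:

```python
from dataclasses import dataclass

@dataclass
class LegitimateInterestsAssessment:
    """Record of the three-part test for one processing activity."""
    activity: str
    purpose_test: bool      # is there a legitimate interest?
    necessity_test: bool    # is processing necessary; no less intrusive way?
    balancing_test: bool    # do the individual's rights NOT override it?
    notes: str = ""

    def passes(self) -> bool:
        return self.purpose_test and self.necessity_test and self.balancing_test

# Security logging typically passes all three steps; behavioral ad
# profiling fails the balancing step (the LinkedIn decision's finding).
security_logging = LegitimateInterestsAssessment(
    "server security logging",
    purpose_test=True, necessity_test=True, balancing_test=True)
ad_profiling = LegitimateInterestsAssessment(
    "behavioral ad profiling",
    purpose_test=True, necessity_test=True, balancing_test=False,
    notes="fails balancing per LinkedIn decision; switch to opt-in consent")
```

Keeping these records in the codebase alongside the processing they describe is exactly the documentation a DPA will ask for.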
Case 4: Uber — €290 Million (Dutch DPA, August 2024)
The fine: €290 million. Dutch Autoriteit Persoonsgegevens.
The violation: Uber transferred European driver data to US servers without a valid transfer mechanism. The Privacy Shield had been invalidated in 2020. Uber used a stop-gap process of transferring data under Article 49(1)(b) (transfer necessary for performance of a contract) — but the Dutch DPA ruled this was a structural, systematic transfer that cannot rely on the Article 49 derogations, which are intended for occasional and necessary transfers only.
The technical failure: Uber continued routing EU driver data to US infrastructure after Privacy Shield invalidation without implementing SCCs with TIAs, instead relying on an Article 49 derogation that is explicitly not designed for systematic operational data flows.
What this means for SaaS developers:
Article 49 derogations are emergency mechanisms, not operational compliance tools. The exhaustive list of situations where they apply includes:
- Explicit consent for a specific transfer (one-off, not systematic)
- Necessity for contract performance (must be genuinely necessary, not convenient)
- Necessity for legal claims
- Vital interests
- Compelling legitimate interests (narrow exception)
If you find yourself planning to use Article 49 to justify sending EU data to a US SaaS tool you use regularly — your HR system, your support platform, your CRM — that is not a valid use of Article 49. The regulators have made this explicit. Systematic data flows to US systems require either SCCs with TIAs or restructuring to EU infrastructure.
Fix: Audit your data flow map for all US-incorporated SaaS tools that process EU personal data. For each, either implement SCCs with a documented TIA, or migrate to EU-incorporated alternatives. Article 49 is not an ongoing compliance mechanism.
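A data-flow audit of this kind can be scripted. The sketch below, with illustrative vendor records, flags any non-EU flow not covered by SCCs plus a documented TIA (or the DPF); Article 49 is deliberately not an accepted mechanism here, per the Uber decision:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class VendorTransfer:
    vendor: str
    non_eu_transfer: bool            # does EU personal data leave the EU?
    mechanism: Optional[str] = None  # "SCC", "DPF", or None
    tia_documented: bool = False

def transfer_gaps(vendors: list) -> list:
    """Return vendors with EU-to-third-country flows lacking a valid mechanism."""
    gaps = []
    for v in vendors:
        if not v.non_eu_transfer:
            continue  # data stays in the EU: no Chapter V transfer occurs
        if v.mechanism == "SCC" and v.tia_documented:
            continue  # SCCs are only valid when backed by a documented TIA
        if v.mechanism == "DPF":
            continue  # covered, though some DPAs still expect a TIA
        gaps.append(v.vendor)
    return gaps
```

Feed it your real sub-processor register; every vendor it returns needs either an SCC-plus-TIA package or migration to an EU-incorporated alternative.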
Case 5: Clearview AI — Multi-Jurisdiction (2021–2025)
The fines: Italy: €20M (2022). France: €20M (2022). Greece: €20M (2022). UK: £7.5M (2022). Netherlands: €30.5M (2024). Combined: roughly €100M.
The violation: Clearview AI scraped billions of facial images from public websites to build a biometric identification database without consent, without a valid lawful basis, and without meeting the Art.9 requirements for processing biometric data (which is a special category).
The technical failure: Processing biometric data (data uniquely identifying a natural person by physical or behavioral characteristics) requires:
- An Art.9(2) exception (explicit consent, vital interests, legal obligation, etc.) — no commercial purpose qualifies
- A valid Art.6 lawful basis independently
- An Art.35 DPIA (always required for biometric data at scale)
- Compliance with Art.22 if used for automated individual decisions
Clearview had none of these. EDPB guidelines finalized in 2023 state that facial recognition for law enforcement purposes requires an explicit legal basis in Member State law; for commercial identification of the kind Clearview offers, no Art.9(2) exception is available, which makes the processing unlawful in the EU.
What this means for SaaS developers:
The Clearview cases establish the outer limit of what is permissible with biometric and special category data. But the lesson generalizes to any SaaS product that:
- Processes health data, genetic data, biometric data, religious or political beliefs
- Uses scraping or third-party data enrichment
- Trains ML models on user-generated content
- Processes data for purposes users did not reasonably expect
In each case, the Art.9(2) exception must be in place before any processing begins. "We have legitimate interests" does not qualify. "We disclosed it in the privacy policy" does not qualify. The exception must actually apply.
Fix: For any special category data (Art.9(1) list), document the specific Art.9(2) exception before processing begins. If no exception applies, the processing cannot happen regardless of other compliance measures.
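This gate can be enforced in code: refuse to start processing unless a documented Art.9(2) exception is on record. A sketch in which the exception labels are shorthand tags for audit records, not legal categories in themselves:

```python
from typing import Optional

# Shorthand tags for Art. 9(2) grounds tracked in the processing register.
ART9_EXCEPTIONS = {
    "explicit_consent",        # Art. 9(2)(a)
    "employment_law",          # Art. 9(2)(b)
    "vital_interests",         # Art. 9(2)(c)
    "manifestly_made_public",  # Art. 9(2)(e)
    "legal_claims",            # Art. 9(2)(f)
    # remaining 9(2) grounds omitted for brevity
}

def require_art9_exception(category: str, exception: Optional[str]) -> None:
    """Hard gate: special category data may not be processed without a
    documented Art. 9(2) exception. "Legitimate interests" is not one."""
    if exception not in ART9_EXCEPTIONS:
        raise PermissionError(
            f"No valid Art. 9(2) exception documented for {category!r}; "
            f"processing must not begin")
```

Calling this before any pipeline that touches biometric or health data makes the "exception first, processing second" ordering structurally unavoidable.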
Six Enforcement Patterns Every Developer Must Know
Across these cases and the broader 2024–2026 enforcement record, six patterns explain the vast majority of GDPR violations:
Pattern 1: US Infrastructure for EU Personal Data
The Meta and Uber cases, and the wider Schrems II enforcement line, all reduce to the same problem: EU personal data on US infrastructure creates a structural transfer problem that cannot be resolved purely through contractual mechanisms when US surveillance law gives US agencies direct access to that data.
Developer action: Process EU personal data in EU infrastructure. Use EU-incorporated providers that are not subsidiaries of US companies when possible. If you use a US parent company's EU data center, document your TIA and understand that the risk is not zero.
Pattern 2: Defaults Set for Business Interest, Not Privacy
The TikTok case is the clearest example, but the pattern is pervasive. Applications default to:
- Marketing email opt-in pre-enabled
- Analytics tracking enabled
- Profile visibility set to public
- Data retention indefinite
Each of these is an Art.25 violation in the EU context.
Developer action: Reverse-audit defaults. Build a configuration inventory showing every setting that touches personal data and verify that the default is the minimum-necessary option.
Pattern 3: Lawful Basis Mismatch
The LinkedIn case shows the most common error: using legitimate interests as a catch-all for behavioral advertising and profiling when the balancing test fails.
Developer action: For each processing activity, document which Art.6 lawful basis applies and why. Map the lawful basis to the actual processing in your privacy notice. Treat legitimate interests documentation as something a DPA will examine — because they will.
Pattern 4: Inadequate Consent Mechanisms
Across cases, consent was collected in ways that were not "freely given, specific, informed, and unambiguous":
- Pre-ticked boxes
- Consent bundled with service access ("accept to continue")
- Withdrawal harder than giving consent
- No granularity (one tick for everything)
Developer action: Implement consent with: active opt-in (not pre-ticked), per-purpose granularity, equally easy withdrawal, no service degradation for refusal, timestamped consent records.
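These requirements translate directly into a consent data model: nothing granted by default, a per-purpose timestamp, and withdrawal as a single call that is exactly as easy as granting. A minimal sketch:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """Per-purpose, timestamped consent (Art. 7 sketch).

    Consent starts absent (no pre-ticked boxes), is granted per purpose
    rather than as one tick for everything, and withdrawal is symmetric.
    """
    user_id: str
    purposes: dict = field(default_factory=dict)  # purpose -> ISO timestamp or None

    def grant(self, purpose: str) -> None:
        self.purposes[purpose] = datetime.now(timezone.utc).isoformat()

    def withdraw(self, purpose: str) -> None:
        self.purposes[purpose] = None

    def has_consent(self, purpose: str) -> bool:
        return bool(self.purposes.get(purpose))

rec = ConsentRecord("user-123")
assert not rec.has_consent("analytics")   # nothing pre-ticked
rec.grant("analytics")                    # active, per-purpose opt-in
rec.withdraw("analytics")                 # withdrawal as easy as granting
```

The stored timestamp is the consent record a DPA will ask you to produce; in a real system you would also persist the notice version the user saw.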
Pattern 5: Inadequate Sub-Processor DPAs
The Art.28 case history (including the Irish DPC's enforcement against multiple platforms) shows that missing or inadequate DPAs with sub-processors create direct Art.28 violations.
Developer action: Maintain an active sub-processor register. Verify each sub-processor has a valid DPA covering the eight mandatory Art.28(3) clauses. Update the register when you add new tools.
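The eight mandatory clauses can be tracked as a checklist per sub-processor; anything missing is a gap to remediate before the tool touches personal data. A sketch using shorthand labels for the Art.28(3)(a) through (h) clauses:

```python
# Shorthand tags for the eight mandatory Art. 28(3) clauses.
ART28_3_CLAUSES = {
    "documented_instructions",    # (a) process only on controller's instructions
    "confidentiality",            # (b) authorized persons bound to confidentiality
    "security_measures",          # (c) Art. 32 security measures
    "subprocessor_conditions",    # (d) conditions for engaging further processors
    "data_subject_assistance",    # (e) assist with data subject rights requests
    "breach_and_dpia_assistance", # (f) assist with Arts. 32-36 obligations
    "deletion_or_return",         # (g) delete or return data at end of services
    "audit_rights",               # (h) make information available, allow audits
}

def dpa_gaps(covered_clauses: set) -> set:
    """Return the mandatory Art. 28(3) clauses missing from a sub-processor's DPA."""
    return ART28_3_CLAUSES - covered_clauses
```

Running this per register entry whenever a new tool is onboarded keeps the register honest instead of aspirational.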
Pattern 6: Breach Notification Delays
Article 33 requires notification to the DPA within 72 hours of becoming aware of a personal data breach. Multiple enforcement actions have resulted in fines where the breach itself was less serious but the notification delay was the primary violation.
Developer action: Implement a breach detection and notification workflow before you need it. The 72-hour clock starts when your organization becomes aware of the breach with a reasonable degree of certainty, not when your legal team finishes its assessment. The regulation explicitly allows an initial notification followed by supplementary information (Art.33(4)).
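The 72-hour window itself is trivial to compute; the point is to derive the deadline automatically from the recorded awareness timestamp rather than leaving it to an ad-hoc process. A sketch:

```python
from datetime import datetime, timedelta, timezone

NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(awareness: datetime) -> datetime:
    """Art. 33 deadline: 72 hours from the moment the organization became
    aware of the breach, not from when legal finishes its assessment."""
    return awareness + NOTIFICATION_WINDOW

aware = datetime(2025, 3, 10, 9, 0, tzinfo=timezone.utc)
deadline = notification_deadline(aware)
# deadline is 2025-03-13 09:00 UTC; an initial notification can go out
# before then, with supplementary information following under Art. 33(4).
```

In practice the incident-response tooling should stamp `awareness` the moment an on-call engineer confirms a breach and page whoever owns the DPA notification.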
2025–2026 Enforcement Priorities
Based on EDPB work programmes and individual DPA announcements, the active enforcement priorities through 2026 are:
AI Training Data: DPAs across the EU are investigating whether personal data is being used to train ML models without a valid lawful basis. The EDPB issued an opinion on this question in December 2024, and several investigations are ongoing. SaaS companies that use user-generated content to fine-tune models or improve features face Art.6 scrutiny.
Children's Data Verification: Following the TikTok fine, DPAs are looking at whether platforms with child users are implementing adequate age verification and applying Art.8 protections. The UK ICO's Children's Code enforcement actions are being mirrored across the EU.
Cookie Consent and Tracking: The IAB TCF consent framework has been found non-compliant by multiple DPAs. Real-time bidding and behavioral advertising using the TCF framework remain under active enforcement. Cookie consent banners that use dark patterns are being targeted.
CLOUD Act and Data Transfers: Post-Schrems II, the EU-US Data Privacy Framework (DPF, adequacy decision adopted July 2023) provides a mechanism for transfers to the US, but several member state DPAs have indicated they view the DPF as fragile and are requiring TIAs even for DPF-covered transfers. The structural CLOUD Act problem has not been resolved by the DPF.
Biometric Data and Facial Recognition: Following Clearview, any commercial use of facial recognition in the EU requires a legal basis that currently does not exist for private commercial purposes. This includes CCTV systems with facial recognition features.
The Developer Checklist: What to Fix Before Your Next Audit
Based on the enforcement record, the following are the highest-priority technical fixes for SaaS developers:
| Priority | Item | Relevant Enforcement |
|---|---|---|
| P0 | Verify all EU personal data stays in EU infrastructure | Meta, Uber |
| P0 | Implement Privacy by Default for all configurable settings | TikTok |
| P0 | Audit and fix consent collection: active opt-in, granular, revocable | LinkedIn |
| P0 | Complete DPAs with all sub-processors | GDPR Art.28 enforcement |
| P1 | Document lawful basis with three-part LI test where LI is used | LinkedIn |
| P1 | Implement 72h breach notification workflow | Art.33 enforcement |
| P1 | Conduct DPIA for high-risk processing (biometric, special category, large scale) | Clearview |
| P1 | Review AI training data practices for valid lawful basis | 2025 EDPB priorities |
| P2 | Complete Transfer Impact Assessments for any US data flows | Meta, Uber |
| P2 | Verify age verification for services accessible to minors | TikTok |
| P2 | Audit third-party cookie consent mechanism against IAB TCF findings | 2025 priorities |
What EU-Native Infrastructure Removes From the Equation
The single most recurring element in GDPR enforcement actions is the US infrastructure problem. Meta, Uber, and every Schrems II case trace to the same root cause: personal data flowing to a jurisdiction where US law overrides contractual protections.
EU-native infrastructure eliminates this category of risk:
- No Chapter V transfer occurs when data stays in the EU
- No TIA obligation (no transfer, no assessment needed)
- No CLOUD Act exposure (EU-incorporated providers outside US jurisdiction)
- No DPF fragility risk (DPF only needed for US transfers)
This does not eliminate all GDPR obligations — consent, transparency, lawful basis, DPAs, and Art.25 apply regardless of where infrastructure is located. But it removes the highest-fine category entirely.
sota.io is an EU-sovereign platform-as-a-service: EU-incorporated, operated from EU data centers, with no US parent company access to customer data. Applications deployed on sota.io inherit EU data residency without additional architecture work.
Key Takeaway
GDPR enforcement is not random. The five cases above — totalling over €2.1 billion — all resulted from specific technical decisions that were made at design time: where to store data, what defaults to set, how to implement consent, whether to complete DPAs with processors, and whether to build notification workflows. Every one of these was fixable before the fine was issued.
The enforcement record from 2023–2026 gives SaaS developers a detailed, prioritized work list. The companies that escaped major fines in this period were not lucky — they made different technical decisions. The cases above show exactly which decisions those were.
This post is part of the sota.io EU Cyber Compliance Series — 867 guides covering GDPR, NIS2, the EU AI Act, the Cyber Resilience Act, and related EU digital regulation. Each guide is written for software developers and explains the technical implementation requirements behind the legal text.
EU-Native Hosting
Ready to move to EU-sovereign infrastructure?
sota.io is a German-hosted PaaS — no CLOUD Act exposure, no US jurisdiction, full GDPR compliance by design. Deploy your first app in minutes.