2026-04-20 · 14 min read

What 500 EU Compliance Posts Reveal About What Developers Actually Need to Know

This is post number 500. We started with GDPR Article 1 and have now covered every substantive article of every major EU digital regulation in force or coming into force by 2027. That's a lot of regulatory text. This post is about what we learned from writing it all down.

The short version: EU regulation is not what developers think it is. It is not a compliance tax imposed on businesses by Brussels bureaucrats. It is, underneath the legal language, a detailed specification of what "reasonable security practice" looks like for organisations operating digital infrastructure. Most of it, once you strip out the legalese, describes things competent engineering teams already know they should be doing.

What EU regulation adds — and this is the part developers underestimate — is accountability infrastructure. Documentation requirements. Management sign-offs. Incident timelines with legal force. Supply chain due diligence obligations. These are not technical controls. They are governance controls. And governance is where most engineering teams are underprepared.


The Numbers

500 posts. ~18 EU regulations. ~6,500 minutes of estimated reading time. Here is what they cover:

Regulation | Posts | Core Developer Obligation
EU AI Act (2024/1689) | 113 | Risk classification, transparency, conformity assessment for AI systems
GDPR (2016/679) | ~85 | Data processing lawfulness, security, breach notification, DPIAs
NIS2 (2022/2555) | ~52 | Security measures, incident reporting, management liability
CRA (2024/2847) | ~42 | Security by design for products with digital elements
DORA (2022/2554) | ~35 | ICT risk management, incident classification, TLPT for financial sector
Digital Omnibus (2025 proposal) | ~12 | Proposed simplification of GDPR+NIS2+CRA overlap
eIDAS 2.0 (2024/1183) | ~15 | Electronic identity wallets, trust services
DSA (2022/2065) | ~18 | Content moderation, transparency, algorithmic accountability
DMA (2022/1925) | ~12 | Gatekeeper obligations, interoperability, data portability
NIS1 (2016/1148) | ~8 | Superseded by NIS2, Oct 2024
ENISA Act (2019/881) | ~8 | Cybersecurity certification framework
Other (PSD3, FIDA, CER, etc.) | ~100 | Sector-specific digital obligations

Average reading time per post: ~12 minutes. Total for the complete regulatory library: equivalent to reading 500 medium-length technical specifications. Nobody reads all of them. That is the point of the blog — you search for the article you need, read it in 12 minutes, and get the developer-relevant obligations without the legislative context.


The Three Questions That Dominate

After 500 posts, the same three questions recur across every regulation:

Question 1: "What am I actually required to do?"

This is the most common query in every search log, every reader email, every HN comment thread. The problem: EU regulatory text specifies what must be achieved, not how to achieve it. Art.21 of NIS2 says entities must implement "appropriate technical and organisational measures". It does not say "implement MFA and SIEM and a patching policy". The directive sets the risk-based outcome; the developer infers the technical control.

The blog's most-read posts are the ones that close this gap: the "what does Art.21(2)(a) actually mean in practice" posts, the "12 controls that map to NIS2's 10 security categories" posts, the "when does GDPR Art.25 require encryption vs pseudonymisation" posts. Developers do not want to read the law. They want the executable requirements.

The insight: EU regulation is abstract specification. What developers need is implementation guidance. That gap — between regulatory text and engineering action — is where most compliance failures start.
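One way teams close this gap internally is a control map: each regulatory clause keyed to the concrete controls the team has chosen to satisfy it. A minimal sketch in Python — the clause labels follow NIS2 Art.21(2)'s measure categories, but the control names are illustrative choices, not an authoritative interpretation of the directive:

```python
# Illustrative mapping from NIS2 Art.21(2) measure categories to the
# concrete controls one team has chosen to implement. The control names
# are examples; the directive itself only specifies the outcomes.
NIS2_CONTROL_MAP = {
    "21(2)(a) risk analysis & security policies": [
        "annual risk assessment", "documented ISMS policy"],
    "21(2)(b) incident handling": [
        "on-call rota", "incident runbook", "post-incident review"],
    "21(2)(d) supply chain security": [
        "vendor security questionnaire", "dependency scanning"],
    "21(2)(j) multi-factor authentication": [
        "MFA on all admin access", "phishing-resistant MFA for prod"],
}

def controls_for(clause_prefix: str) -> list[str]:
    """Return every mapped control for clauses starting with the prefix."""
    return [
        control
        for clause, controls in NIS2_CONTROL_MAP.items()
        if clause.startswith(clause_prefix)
        for control in controls
    ]
```

The point of keeping this as data rather than prose: when an auditor asks "how do you satisfy Art.21(2)(b)?", the answer is a lookup, not a meeting.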

Question 2: "How do I document this?"

Documentation is where EU compliance becomes genuinely different from US compliance culture. GDPR requires a Record of Processing Activities (Art.30). NIS2 requires documented security policies (Art.21). DORA requires an ICT risk register. AI Act requires technical documentation, conformity assessment records, and EU declarations. CRA requires a software bill of materials and vulnerability disclosure policy.

The pattern: every EU regulation has a documentation layer. The controls themselves may be similar to what NIST SP 800-53 or SOC 2 requires. The difference is that EU regulators will ask for the paper trail during an investigation — and "we do this, we just didn't write it down" is not an acceptable answer.

The insight: For EU compliance, a control that is not documented does not exist from a national competent authority's (NCA's) perspective. Build the documentation as you build the control.

Question 3: "When does this apply to me?"

Scope questions are perennially tricky. GDPR applies to processing of personal data of people in the EU — whether by an EU-established organisation or by one outside the EU that offers them goods or services or monitors their behaviour (Art.3). NIS2 applies to entities of a certain size operating in 18 specified sectors. CRA applies to any "product with digital elements" placed on the EU market — which includes SaaS APIs that are "imported" as components of other products. AI Act applies to any AI system used in the EU, regardless of developer location.

The threshold questions — am I "essential" or "important" under NIS2? Is my AI system "high-risk" under the AI Act? Does the CRA apply to my open-source library? — generate more reader questions than any other category.

The insight: Scope is not a one-time determination. EU regulations have overlapping application scopes, size thresholds that change as your company grows, and sector definitions that evolve with enforcement guidance. Compliance teams need to re-evaluate scope annually.
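An annual scope re-evaluation can be reduced to a function of current company facts rather than a memo from two years ago. A deliberately simplified sketch — the NIS2 size test below follows the general medium-enterprise threshold (50+ staff or more than EUR 10M turnover), but Art.2 contains size-independent exceptions, so verify against your member state's transposition:

```python
def nis2_size_in_scope(employees: int, turnover_m_eur: float) -> bool:
    """Simplified size test: NIS2 generally captures medium-sized and
    larger entities in covered sectors. Some entities are in scope
    regardless of size (see Art.2); this is only a first-pass screen."""
    return employees >= 50 or turnover_m_eur > 10.0

def reassess_scope(company: dict) -> dict[str, bool]:
    """Re-run at least annually: scope changes as the company grows."""
    return {
        "nis2": company["sector_covered"] and nis2_size_in_scope(
            company["employees"], company["turnover_m_eur"]),
        "gdpr": company["processes_personal_data_of_people_in_eu"],
    }
```

The useful property is that the inputs (headcount, turnover, sector, data flows) are facts the business already tracks, so the re-check can be scheduled like any other recurring job.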


The Five Surprises in 500 Posts of EU Regulation

Writing 500 posts forces synthesis. Here are the five things that surprised us most about what EU regulation actually requires when you read all of it together.

Surprise 1: Management Liability Is Coming for CTOs

This started as a minor NIS2 provision (Art.32(6)) and has become the dominant story in EU cybersecurity enforcement in 2025–2026. NCAs can now hold management body members — including CTOs, CISOs, and board members — personally liable for security failures. NCAs in the Netherlands, Germany, and France have already issued personal liability decisions in early enforcement actions under their national implementations.

For developers who are also founders, CTOs, or engineering leaders: you are personally exposed in a way that did not exist under NIS1. Not just the company. You.

DORA extends similar logic to ICT risk management responsibilities in the financial sector. The AI Act's Art.16 obligations on providers include personal accountability components. The trajectory is clear: EU regulation is moving from entity-level compliance to individual-level accountability.

Surprise 2: GDPR Turned Eight but Developers Still Underestimate Art.25

Article 25 — Data Protection by Design and by Default — is the most technically demanding provision in GDPR and the least taken seriously by engineering teams. It requires that privacy-protective defaults be built into the design of systems before data processing begins, not bolted on afterward.

After eight years of enforcement, EDPB guidance, and multiple significant Art.25 fines, many development teams still treat privacy as a post-launch configuration task. The 2025 enforcement data shows Art.25 violations increasing as regulators gain the technical capacity to audit system architecture, not just policy documents.

The technical implications are non-trivial: end-to-end encryption, pseudonymisation at collection, default-off analytics, granular consent architecture, data minimisation in API design. These are not legal tasks. They are engineering tasks.

Surprise 3: CLOUD Act Creates Silent Conflict with Four EU Regulations Simultaneously

The US CLOUD Act (2018) allows US law enforcement to compel US-headquartered cloud providers to produce data stored anywhere in the world — including in EU data centres. This creates a silent conflict with GDPR (Art.48 prohibits disclosure without EU legal basis), NIS2 (Art.38 professional secrecy), CRA (vulnerability disclosure channel requirements), and DORA (ICT risk register confidentiality).

EU regulators are aware of this conflict. They cannot resolve it through EU law. What they have done is build international cooperation frameworks (GDPR Art.48 safeguards, NIS2 Art.44) that attempt to manage the conflict rather than eliminate it.

The only technical resolution is infrastructure that is not subject to US jurisdiction. This is not a competitive point. It is the technical reality that the GDPR, NIS2, and CRA frameworks implicitly acknowledge when they define what "adequate" data transfer protection looks like.

Surprise 4: CRA Applies to Open-Source Maintainers Who Don't Realise It

The Cyber Resilience Act's scope includes "products with digital elements" placed on the EU market. The "commercial activity" exception for open-source was narrower than the community hoped. If you maintain an open-source library that generates revenue (donations, sponsorships, consulting), integrate it into a commercial product, or distribute it as part of a service to EU customers, CRA may apply.

The SBOM requirement, the obligation to report actively exploited vulnerabilities to ENISA within 24 hours, the security support period requirements — these are non-trivial compliance burdens for open-source maintainers operating individually. The EU has heard this concern, and the CRA review clause will likely address it. But the current text applies as the CRA's obligations take effect: reporting duties from September 2026, the full regulation from December 2027.
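The CRA text does not mandate a specific SBOM format, but CycloneDX and SPDX are what most tooling emits. A minimal CycloneDX-style skeleton, assuming a made-up PyPI dependency (`example-parsing-lib` is hypothetical) — field names follow the public CycloneDX 1.5 JSON schema:

```python
import json

# Minimal CycloneDX-style SBOM skeleton. In practice you would generate
# this with tooling (e.g. a package-manager plugin), not by hand.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "version": 1,
    "components": [
        {
            "type": "library",
            "name": "example-parsing-lib",  # hypothetical dependency
            "version": "2.4.1",
            "purl": "pkg:pypi/example-parsing-lib@2.4.1",
        }
    ],
}

print(json.dumps(sbom, indent=2))
```

Even for a single-maintainer project, committing a generated SBOM per release is a small step toward the CRA's documentation expectations.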

Surprise 5: AI Act Art.9-11 Applies to More SaaS Teams Than They Think

The AI Act's Chapter III on high-risk AI systems includes an Annex III list of eight high-risk use case categories. Employment and workers management decisions, critical infrastructure management, educational assessment, access to essential services — these categories are broader in practice than they appear in the Annex.

If your SaaS platform assists with hiring decisions, credit decisions, educational assessments, or manages access to benefits, you are likely providing a "high-risk AI system" under the AI Act. The obligations — Art.9 risk management systems, Art.10 data governance, Art.11 technical documentation — require significant pre-deployment compliance work.

The threshold surprise: "we just provide a recommendation, the human makes the decision" is not a safe harbour. If the AI output materially influences the decision and the decision falls in Annex III, you are in scope.
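That threshold logic can be captured as a first-pass triage. A sketch under loud assumptions: the category strings below merely paraphrase a few Annex III areas, and this is a screening heuristic for deciding whether to get proper legal classification, not the classification itself:

```python
# Illustrative first-pass triage for AI Act Annex III exposure.
# The area names paraphrase Annex III categories; consult the Annex
# and enforcement guidance for the authoritative definitions.
ANNEX_III_AREAS = {
    "employment_and_worker_management",
    "access_to_essential_services",
    "education_and_vocational_training",
    "critical_infrastructure",
}

def likely_high_risk(use_case_area: str,
                     output_influences_decision: bool) -> bool:
    """'A human makes the final call' is not a safe harbour: what matters
    is whether the AI output materially influences a decision that
    falls in an Annex III area."""
    return use_case_area in ANNEX_III_AREAS and output_influences_decision
```

A hiring-recommendation feature scores `True` here even if a recruiter clicks the final button — which is exactly the surprise the Annex III threshold holds for SaaS teams.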


What's Coming in 2026–2027

The next 100 posts will cover the regulations entering their implementation phases:

2026 — Enforcement acceleration:

2026-2027 — New frameworks:

2027 — Review and expansion:


The Takeaway for Infrastructure

Running your stack on EU-native infrastructure is not itself compliance. But it removes one of the most structurally problematic compliance conflicts in the current regulatory landscape: the CLOUD Act/GDPR/NIS2/CRA jurisdictional tension.

When you process data on infrastructure owned by EU-headquartered entities without US parent companies, three things happen:

  1. Art.48 GDPR international transfer restrictions have no US-jurisdiction trigger
  2. NIS2 Art.44 international cooperation chains do not involve CLOUD Act-subject entities
  3. CRA vulnerability disclosure channels operate without US legal intercept risk

The developer who understands the regulatory layer is the developer who survives the next audit. The engineering lead who has designed their infrastructure for compliance — not just security — is the one whose company doesn't have to rebuild its architecture when NCA auditors show up.

Five hundred posts. Same conclusion every time: know what you're required to do, document it, and build for the trajectory, not the floor.


Posts 501–600: What's Next

The next milestone covers the remaining implementation challenges:

The goal remains what it was at post 1: executable requirements, not regulatory theory.

