EU Digital Services Act 2024: What Every Hosting Provider and Developer Needs to Know
The EU Digital Services Act — Regulation (EU) 2022/2065 — became fully applicable on 17 February 2024. Most developers associate the DSA with large social media platforms and content moderation. That association is misleading. The DSA applies to every service that stores and provides access to user-uploaded content — including PaaS platforms, SaaS tools with user workspaces, code hosting services, and application backends that serve user-generated data to the public.
If your service lets users upload anything that is then accessible to other users or the public, you are a "hosting service" under the DSA definition. This guide covers what that means in practice for developers and infrastructure providers.
What Is the DSA?
Regulation (EU) 2022/2065 on a Single Market For Digital Services was signed on 19 October 2022, entered into force on 16 November 2022, and became fully applicable to all providers on 17 February 2024.
The DSA replaces the relevant liability framework from the 2000 eCommerce Directive (Directive 2000/31/EC) and introduces tiered obligations based on the scale and nature of services.
The DSA distinguishes three base categories of "intermediary service" (mere conduit, caching, and hosting), with online platforms and very large online platforms layered on top as sub-tiers of hosting:
| Category | Definition | Example |
|---|---|---|
| Mere conduit | Transmits information without storage | ISP, VPN provider |
| Caching | Temporary automatic storage for transmission efficiency | CDN, proxy cache |
| Hosting service | Stores information at the request of a user | PaaS, SaaS, cloud storage, forums |
| Online platform | Hosting service that also disseminates info to the public | App stores, marketplaces, social media |
| Very Large Online Platform (VLOP) | Online platform with ≥45M monthly active users in the EU | Facebook, TikTok, YouTube, Amazon Marketplace |
PaaS and SaaS providers typically fall in the hosting service category. If your platform also surfaces user content to other users (e.g., a marketplace, a public repository host, a community tool), you are an online platform subject to additional obligations.
The Tiered Obligation Structure
The DSA's obligations scale with provider size:
All intermediary services (including hosting services, any size):
- Art. 11: Single point of contact for member state authorities, the Commission, and the Board
- Art. 12: Single point of contact for recipients of the service (users)
- Art. 13: EU legal representative, for providers without an EU establishment
- Art. 14: Terms and conditions content requirements
- Art. 15: Transparency reporting (annual; micro and small enterprises are exempt under Art. 15(2))
Hosting services specifically (Art. 16–17):
- Art. 16: Notice-and-action mechanism for illegal content
- Art. 17: Statement of reasons for content moderation decisions
Online platforms (in addition to above):
- Art. 20: Internal complaint-handling system
- Art. 21: Out-of-court dispute settlement
- Art. 22: Trusted Flaggers
- Art. 25: Online interface design (no deceptive "dark patterns")
VLOPs and VLOSEs (very large platforms/search engines ≥45M users):
- Art. 34: Systemic risk assessment (annual)
- Art. 35: Mitigation measures
- Art. 37: Independent auditing
- Art. 40: Data access for researchers
The key practical insight for most developers: If you are building a small-to-medium hosting service or SaaS, Articles 11–17 are your operative obligations. The VLOP obligations that dominate DSA coverage in the press are a different tier.
Micro and Small Providers: A Real Exemption
Article 19 exempts micro and small enterprises, as defined in Commission Recommendation 2003/361/EC, from the additional online platform obligations of Articles 20 to 28 (internal complaint handling, out-of-court dispute settlement, Trusted Flaggers, and more), with a narrow carve-out for reporting active user numbers:
- Micro enterprise: fewer than 10 employees AND annual turnover or balance sheet ≤ €2M
- Small enterprise: fewer than 50 employees AND annual turnover or balance sheet ≤ €10M
This exemption matters. Most early-stage SaaS companies and developer tools fall into the micro or small category. Your obligations are substantially lighter — but they are not zero.
What micro/small hosting services still need (no exemption):
- Art. 11: Single point of contact for member state authorities
- Art. 14: Terms and conditions meeting the DSA's content requirements
- Art. 16: Notice-and-action mechanism for illegal content reports
- Art. 17: Statement of reasons when you take down or restrict content
Transparency reporting (Art. 15) is not on this list: Art. 15(2) exempts micro and small enterprises that are not VLOPs.
Article 16: Notice-and-Action — The Core Mechanism
Article 16 defines how your users and third parties can report illegal content to you, and what you must do when they do.
What you must provide:
- An "easy-to-access and user-friendly notification mechanism" that allows any individual or entity to notify you of specific pieces of allegedly illegal content
- The mechanism must enable submitters to explain why they believe the content is illegal, provide the exact electronic location of the content (such as a specific URL), and identify themselves by name and email address (required under Art. 16(2), except for notices concerning child sexual abuse offences)
What you must do upon receiving a notice:
- Process the notice "in a timely, diligent, non-arbitrary and objective manner" (Art. 16(6))
- Take a decision on the notified content
- Where relevant, inform the notifier of your decision and the possibility to contest it
- Where you remove or restrict content: inform the affected user (Art. 17)
What "illegal content" means: The DSA does not define new categories of illegal content. "Illegal content" means content that is illegal under EU law or the law of a member state — CSAM, terrorist content, copyright-infringing material, fraudulent commercial communications, defamation under national law, etc. The DSA does not require you to proactively monitor content (Art. 8 explicitly prohibits general monitoring obligations), but you must act on notices.
Implementation in practice:
A minimal compliant implementation is an email address or web form where notices can be submitted, reviewed by a human, and responded to. A dedicated abuse@yourservice.com with documented SLA and a decision log satisfies the mechanism requirement.
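The intake described above can be sketched as a small data model plus a receipt helper. All names here (`Art16Notice`, `NoticeDecision`, `acknowledge`) are illustrative, not terms from the regulation; the fields mirror what Art. 16(2) asks a notice to contain.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Art16Notice:
    """One incoming notice, capturing the elements listed in Art. 16(2)."""
    content_url: str                  # exact electronic location of the content
    explanation: str                  # why the notifier considers it illegal
    notifier_name: Optional[str]      # may be absent for certain offence categories
    notifier_email: Optional[str]
    good_faith_confirmed: bool        # notifier's statement of good-faith belief
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class NoticeDecision:
    """Decision log entry; `reason` feeds the later Art. 17 statement."""
    notice: Art16Notice
    action: str                       # e.g. "removed", "restricted", "no_action"
    reason: str                       # ToS clause or legal basis for the decision
    automated: bool                   # whether automated means were used
    decided_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def acknowledge(notice: Art16Notice) -> str:
    """Confirmation of receipt (Art. 16(4)), only when contact details were given."""
    if notice.notifier_email:
        return f"Notice for {notice.content_url} received; a decision will follow."
    return ""
```

The point of the decision log is double duty: it backs each individual Art. 17 statement and can be aggregated later for any transparency reporting you do.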
Article 17: Statement of Reasons
When you remove, restrict, suspend, or terminate access to user content or accounts, Article 17 requires you to inform the affected user with a "clear and specific statement of reasons." This must include:
- The facts and circumstances relied on in taking the decision
- The specific terms of service clause or legal basis the decision relies on
- The territorial scope of any restriction
- Whether automated means were used to detect the content or take the decision
- Information about redress options (internal complaint mechanism if you have one, out-of-court dispute settlement if applicable)
For most small hosting providers: a brief email explaining why you removed content or suspended an account, citing the specific clause of your terms, satisfies Article 17.
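The template-email approach can be sketched as a single helper that assembles the Art. 17 elements into one plain-text message. The function and parameter names below are invented for illustration; the required elements they map to come from the regulation.

```python
def statement_of_reasons(*, action: str, content_ref: str, ground: str,
                         ground_detail: str, territorial_scope: str,
                         automated_detection: bool, automated_decision: bool,
                         redress_options: list[str]) -> str:
    """Render a plain-text Art. 17 statement of reasons.

    Covers: the restriction imposed, the contractual or legal ground,
    territorial scope, use of automated means, and available redress.
    """
    yn = lambda b: "yes" if b else "no"
    lines = [
        f"Action taken: {action} ({content_ref})",
        f"Ground: {ground} - {ground_detail}",
        f"Territorial scope: {territorial_scope}",
        f"Automated means used for detection: {yn(automated_detection)}",
        f"Automated means used for the decision: {yn(automated_decision)}",
        "Redress options: " + "; ".join(redress_options),
    ]
    return "\n".join(lines)
```

Generating the statement from your decision log, rather than writing it ad hoc, also guarantees no required element is forgotten under time pressure.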
Article 15: Transparency Reporting
Providers of intermediary services must publish transparency reports on their content moderation activities, annually for most non-VLOP providers; micro and small enterprises are exempt under Art. 15(2). A report must cover:
- Number of orders received from authorities regarding illegal content
- Number of notices received via the Art. 16 mechanism, and how many were acted upon
- Number of content removals, account suspensions, and their legal basis
- Where automated tools were used for content moderation
- Error rates of automated moderation tools
Micro and small providers are, however, exempt from publishing a report under Art. 15(2). Logging these data points in a structured way still costs little, and pays off if you grow past the size thresholds or choose to publish voluntarily.
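If you keep a structured log of moderation events, producing these figures is a single aggregation pass. A sketch, assuming each event is a dict with hypothetical keys `kind`, `action`, `basis`, and `automated`:

```python
from collections import Counter

def transparency_summary(events) -> dict:
    """Aggregate a year's moderation events into Art. 15-style counts.

    `events` is an iterable of dicts with illustrative keys:
      kind:      "authority_order" or "art16_notice"
      action:    "removed", "restricted", or "no_action"
      basis:     the ToS clause or law relied on
      automated: whether automated means took the decision
    """
    events = list(events)
    notices = [e for e in events if e["kind"] == "art16_notice"]
    acted = [e for e in events if e["action"] != "no_action"]
    return {
        "authority_orders": sum(e["kind"] == "authority_order" for e in events),
        "notices_received": len(notices),
        "notices_acted_on": sum(e["action"] != "no_action" for e in notices),
        "actions_by_basis": dict(Counter(e["basis"] for e in acted)),
        "automated_decisions": sum(bool(e.get("automated")) for e in events),
    }
```

Error rates of automated tools are not derivable from this minimal schema; if you use automated moderation, you would additionally track overturned decisions against total automated decisions.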
Country of Establishment and DSA Coordinators
The DSA uses the country of establishment principle. Your obligations are enforced by the Digital Services Coordinator (DSC) of the EU member state where you are established:
| Country | Digital Services Coordinator |
|---|---|
| Germany | Bundesnetzagentur (Federal Network Agency) |
| France | ARCOM (Autorité de Régulation de la Communication Audiovisuelle et Numérique) |
| Ireland | Coimisiún na Meán |
| Netherlands | Autoriteit Consument en Markt (ACM) |
| Sweden | Post- och telestyrelsen (PTS) |
| EU-wide (VLOPs) | European Commission (DG CNECT) |
The country-of-establishment principle is why many large tech companies incorporated in Ireland face Coimisiún na Meán as their DSC — Ireland was a common incorporation choice under the predecessor eCommerce Directive.
For US-based providers serving EU users: You are subject to DSA obligations if you offer your service to users in the EU. If you have no EU establishment, you must appoint an EU legal representative (Art. 13) who can be held liable for non-compliance. This parallels the GDPR Article 27 representative requirement.
The CLOUD Act Intersection
The DSA does not override foreign law — and this is where US-incorporated providers face a structural tension.
When a US-incorporated service receives a DSA-compliant notice for illegal content under EU law, the provider must also handle any conflicting US obligations. The US CLOUD Act (18 U.S.C. § 2713) requires US providers to produce stored data to US law enforcement regardless of where the data is located.
The practical conflict:
- A US provider receives a DSA Art. 16 notice about content that is illegal under EU or member state law (e.g., hate speech offences under German or French national law)
- Complying with the EU notice by removing content is straightforward
- But the US provider's underlying data (logs, user information, metadata) remains accessible to US authorities via CLOUD Act subpoenas regardless of DSA compliance
- EU users on a US-incorporated PaaS are therefore subject to two parallel legal regimes simultaneously
EU-incorporated providers operate under a single legal framework: EU law, enforced by EU DSCs, with no CLOUD Act exposure. For developers building services that will handle DSA notices — abuse reporting systems, content moderation workflows, user data — the jurisdiction of your infrastructure provider matters.
DSA Obligations Checklist for Small Hosting Providers
For a micro or small PaaS, SaaS, or hosting service:
| Obligation | Article | What to implement |
|---|---|---|
| Single point of contact | Art. 11 | Dedicated email or contact page for authority requests |
| Terms of service | Art. 14 | Clear T&C listing prohibited content types and your enforcement process |
| Notice-and-action mechanism | Art. 16 | Abuse reporting form or email with documented review process |
| Statement of reasons | Art. 17 | Template email for content removal/account suspension decisions |
| Transparency report | Art. 15 | Exempt for micro/small under Art. 15(2); keep a structured moderation log anyway |
What you do NOT need as a micro/small provider:
- Internal complaint-handling system (Art. 20) — platform obligation only, exempted for micro/small
- Out-of-court dispute settlement body (Art. 21) — platform obligation only, exempted for micro/small
- Trusted Flagger programme (Art. 22) — platform obligation only, exempted for micro/small
- Systemic risk assessments (Art. 34) — VLOP/VLOSE only
What "Hosting Service" Means for PaaS
A PaaS provider that hosts application code, databases, and user workspaces is a hosting service under DSA. However, the practical content moderation obligations arise primarily when user-uploaded content is accessible to the public or other users.
Consider the scope:
Content clearly in scope:
- User-uploaded files accessible via public URLs
- Application output that serves user-generated data to the public
- Logs or artifacts from customer-deployed applications that are publicly accessible
Content in a grey zone:
- Private application data that is only accessible to the deploying customer (not third parties or the public)
- Internal application state not surfaced to any public interface
The Art. 16 notice-and-action mechanism is triggered by "specific pieces of content" — meaning a specific URL or identifier. If your PaaS only hosts private workloads with no public-facing user content, your practical exposure to Art. 16 notices is low.
DSA Enforcement and Penalties
Enforcement is by national DSCs for most obligations, and by the European Commission for VLOPs/VLOSEs.
Penalties:
- Non-compliance by non-VLOP providers: up to 6% of annual worldwide turnover
- VLOPs: up to 6% of annual worldwide turnover for substantive violations
- VLOPs (repeated systemic violations): temporary or permanent ban on operating in the EU
For small providers, the 6% figure sounds large, but enforcement typically starts with orders to comply. DSCs are expected to prioritise proportionate enforcement, with VLOPs as the primary focus.
For Developers Building on PaaS
If you are a developer deploying an application that handles user-generated content, your application may independently constitute a "hosting service" under the DSA — regardless of whether your underlying PaaS is DSA-compliant.
The DSA applies to the layer that stores and provides access to the content. If your application stores user posts and makes them accessible, your application is the hosting service, not just the infrastructure it runs on.
Practical implications for SaaS builders:
- Design your abuse reporting flow from the start — Art. 16 requires a mechanism, not retroactive triage
- Document your content moderation decisions — Art. 17 requires you to tell users why content was removed
- Build for transparency reports — log content moderation events in a structured way from day one
- Know your DSC — it is the authority in your country of establishment
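The "log content moderation events in a structured way" point can be as simple as appending one JSON line per moderation action. The schema below is a hypothetical minimum, not a DSA-mandated format; the value is that every Art. 16 notice and Art. 17 decision lands in one machine-readable record from day one.

```python
import json
from datetime import datetime, timezone
from pathlib import Path

def log_moderation_event(log_path, *, kind: str, content_ref: str,
                         action: str, basis: str, automated: bool) -> dict:
    """Append one structured moderation event as a JSON line.

    A later transparency summary (or a DSC inquiry) then becomes a
    single pass over this file rather than an archaeology exercise.
    """
    event = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "kind": kind,              # e.g. "art16_notice", "authority_order"
        "content_ref": content_ref,
        "action": action,          # e.g. "removed", "restricted", "no_action"
        "basis": basis,            # ToS clause or legal basis
        "automated": automated,
    }
    with Path(log_path).open("a", encoding="utf-8") as f:
        f.write(json.dumps(event) + "\n")
    return event
```

Append-only JSON lines are deliberately boring: no database migration, trivially greppable, and easy to aggregate per reporting year.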
Hosting your application on an EU-native PaaS (where EU law governs without CLOUD Act overlay) simplifies DSA compliance in one specific way: when you respond to a DSA notice by removing content, you are not simultaneously exposed to a US legal order requiring you to preserve or produce the same data under conflicting US authority.
Summary
The DSA creates a workable compliance framework for hosting providers of all sizes. The key points for developers:
- 17 February 2024: DSA fully applicable to all providers
- Your tier determines your obligations: Micro/small hosting services have lighter obligations than online platforms or VLOPs
- Art. 16 applies to all hosting services: You need a notice-and-action mechanism regardless of size
- Transparency reporting is annual for most providers: micro and small enterprises are exempt under Art. 15(2), but a structured moderation log is still worth keeping
- Country of establishment determines your DSC: EU-incorporated providers have a single enforcer; US-incorporated providers must appoint an EU legal representative
- CLOUD Act tension persists: DSA compliance does not eliminate CLOUD Act exposure for US-incorporated providers
For most early-stage SaaS or PaaS developers, DSA compliance is achievable with a well-documented abuse email, a clear content policy, and a structured log of moderation decisions. The complexity scales with your user base and the nature of content your platform handles.
See Also
- EU Sovereignty Audit 2026: The PaaS Layer Compliance Blindspot — how to audit your stack for CLOUD Act exposure that persists even after DSA compliance
- EU Data Act 2025: Cloud Switching Rights Developer Guide — sister regulation governing cloud data portability and switching obligations
- EU Cyber Resilience Act 2027: Open-Source PaaS Developer Checklist — CRA obligations that run in parallel with DSA for software product security
- GDPR Article 25: Privacy by Design and by Default — Developer's Implementation Guide — upstream privacy obligations that inform DSA-compliant hosting choices