GDPR Art.8: Children's Consent in Digital Services — Age Thresholds, Parental Authorization & Verification (2026)
Post #442 in the sota.io EU Cyber Compliance Series
Art.8 is the GDPR provision that every developer building a consumer-facing product must understand before writing a single line of sign-up flow code. Where Art.7 governs how consent must be obtained from anyone, Art.8 adds a prior gate: if the person giving consent is a child, the standard consent rules are insufficient — you either need the child to meet the applicable age threshold, or you need verifiable parental authorisation. Fail this gate and there is no lawful basis under Art.6(1)(a) at all, and all processing based on that consent is unlawful under Art.5(1)(a).
This guide is part of the GDPR Chapter II series: Art.1-4 Scope → Art.5 Principles → Art.6 Lawful Bases → Art.7 Consent Conditions → Art.8 Children's Consent (this guide) → Art.9 Special Categories.
Art.8 at a Glance
| Art.8 | Rule | Core Obligation |
|---|---|---|
| Art.8(1) | Age threshold | Consent valid only if child ≥ applicable age (16 default, MS may lower to 13) |
| Art.8(1) | Below threshold | Controller must obtain and verify parental/guardian authorisation |
| Art.8(2) | Verification obligation | "Reasonable efforts" to verify age and parental status — proportionate to risk |
| Art.8(3) | Existing contract law | Art.8 does not affect general contract law in Member States (e.g., age of majority for contractual capacity) |
Art.8 applies only to information society services where consent is used as the lawful basis. It does not apply to:
- Processing under Art.6(1)(b) (contract — a child can be a contractual party under national law)
- Processing under Art.6(1)(c) (legal obligation)
- Services that children cannot in practice use (e.g., B2B tools provisioned only through an employer) — but note that "not marketed to children" is no escape: if children could realistically sign up and consent is the lawful basis, Art.8 applies and you must age-gate
What Is an "Information Society Service"?
Art.8 applies specifically to the offer of information society services directly to a child. GDPR does not define ISS but cross-references Directive (EU) 2015/1535, which defines ISS as:
Any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services.
In practice, every SaaS product, app, or online platform is an ISS. This covers:
- Social media platforms, messaging apps, gaming platforms
- SaaS products where the end user registers with an email address
- E-commerce (though here contract law under Art.8(3) also applies)
- Streaming services, content platforms, educational tools
Two tests to determine if Art.8 applies to your service:
- Direct offer test: Is the service offered directly to users who could be children? If children can register without institutional intermediation, yes.
- Likely to attract test (EDPB): Even if not marketed to children, if the service is likely to be accessed by children (e.g., a gaming platform marketed to adults but played by teenagers), Art.8 applies.
The "offered directly" language is important: Art.8 does not apply to B2B services where the controller processes employee/business-contact data, even if an individual employee happens to be under 16. It applies to services where the individual user (child) is the direct contracting party or recipient.
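The two tests above can be captured as a pre-flight applicability check. This is a sketch under this post's own shorthand — the field names are illustrative, not statutory categories:

```python
from dataclasses import dataclass

@dataclass
class ServiceProfile:
    consent_is_lawful_basis: bool     # is Art.6(1)(a) used for any processing?
    consumer_facing: bool             # individuals register directly, no institutional intermediation
    children_can_register: bool       # direct offer test
    likely_to_attract_children: bool  # EDPB "likely to be accessed" test

def art8_applies(svc: ServiceProfile) -> bool:
    """Art.8 is triggered only where consent is the lawful basis for an
    ISS offered directly to individuals who could be children."""
    if not svc.consent_is_lawful_basis:
        return False  # other Art.6 bases (contract, legal obligation) are outside Art.8
    if not svc.consumer_facing:
        return False  # pure B2B: employees are not direct recipients of the offer
    # Either test suffices: a direct offer children can accept,
    # or a service likely to attract children despite adult marketing
    return svc.children_can_register or svc.likely_to_attract_children
```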
Art.8(1) — The Age Threshold Rule
Art.8(1) states:
Where Art.6(1)(a) applies, in relation to the offer of information society services directly to a child, the processing of the personal data of a child shall be lawful where the child is at least 16 years old. Where the child is below the age of 16 years, such processing shall be lawful only if and to the extent that consent is given or authorised by the holder of parental responsibility over the child.
Member State Minimum Age Table
Member States may lower the threshold to a minimum of 13 years. As of 2026:
| Country | Applicable Age | Legal Basis |
|---|---|---|
| Germany (DE) | 16 | No national derogation — GDPR default applies |
| France (FR) | 15 | Loi Informatique et Libertés Art.45 |
| Netherlands (NL) | 16 | Uitvoeringswet AVG Art.5 |
| Austria (AT) | 14 | Datenschutz-Anpassungsgesetz 2018 §4 |
| Ireland (IE) | 16 | Data Protection Act 2018 §31 |
| Spain (ES) | 14 | LOPDGDD Art.7 |
| Italy (IT) | 14 | Codice Privacy (D.Lgs. 196/2003) Art.2-quinquies |
| Sweden (SE) | 13 | Dataskyddslagen (2018:218) §2 |
| Belgium (BE) | 13 | Loi relative à la protection des données Art.8 |
| Poland (PL) | 16 | Ustawa o ochronie danych osobowych Art.7 |
| UK (post-Brexit) | 13 | UK GDPR Art.8 + Age Appropriate Design Code |
Practical rule for multi-jurisdiction products: If your product is available in all EU Member States, you must implement the highest applicable threshold (16) unless you geo-gate by country and apply the country-specific age. Most consumer products operating pan-EU use 16 as the default to avoid per-jurisdiction complexity.
What "Parental Responsibility" Means
Art.8 requires consent from "the holder of parental responsibility" — not necessarily a biological parent. This includes:
- Legal guardians appointed by a court
- Adoptive parents
- Foster parents with legal custody
- Any person with legal responsibility for the child under national family law
Art.8(2) — The Verification Obligation
Art.8(2) creates an obligation on the controller: you must make "reasonable efforts" to verify that (a) the user is above the applicable age threshold, or (b) parental consent has been given. What counts as "reasonable efforts" is proportionate to the risk and the technology available.
The EDPB's guidance work on children's data and the UK ICO's Age Appropriate Design Code (applied analogously in EU practice) give the clearest guidance on verification adequacy.
Verification Tiers by Risk Level
Tier 1 — Low-risk services (free, no financial transactions, no special categories):
- Self-declaration of age (DOB entry at signup) + clear warning about age requirement
- Email-based parental consent flow (send consent request to parent's email)
- This tier is acceptable for services with low risk of harm and no special category processing
Tier 2 — Medium-risk services (social features, content upload, location, profiling):
- Age estimation via AI (facial analysis) — controversial; EDPB has reservations about accuracy and special category risks under Art.9 (biometric processing)
- Credit card ownership check (heuristic — not conclusive but strong signal)
- Third-party age verification service (TrustID, AgeID, Yoti) — token-based age assertion without sharing DOB
- Bank account ownership verification via Open Banking (where available)
Tier 3 — High-risk services (health data, financial services, location tracking, minors as primary audience):
- Government ID verification (eIDAS-compliant identity assertion)
- Parental consent via signed document or video verification
- School/institution intermediation for EdTech
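One way to encode the tiering is a simple dominance rule over risk factors. This is a deliberately simplified sketch using this post's tier shorthand (the factor names are illustrative, and a real assessment belongs in a DPIA):

```python
def verification_tier(
    special_categories: bool,       # Art.9 data (health, biometrics, ...)
    financial_or_tracking: bool,    # financial services, location tracking
    minors_primary_audience: bool,  # children are the main user base
    social_or_profiling: bool,      # social features, content upload, profiling
) -> int:
    """Map service risk factors to the verification tiers above.
    Any Tier-3 factor dominates; Tier-2 factors come next; else Tier 1."""
    if special_categories or financial_or_tracking or minors_primary_audience:
        return 3
    if social_or_profiling:
        return 2
    return 1
```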
The Double-Opt-In Parental Consent Flow
The most common compliant implementation for consumer apps targeting Tier 1-2:
- User enters date of birth during signup
- If DOB indicates age < threshold: block immediate access, trigger parental consent flow
- Send email to parental email address provided by child: "Your child [name] wants to create an account. Please confirm you give consent."
- Parent clicks confirmation link (time-limited, HTTPS, one-time token)
- Confirmation triggers account activation + consent log record
- Log: child_id, child_dob, parent_email, consent_timestamp, consent_mechanism="email_double_opt_in", parental_consent_text_version
Known limitation: A child could provide a parental email address that they themselves control (e.g., a freshly created "parent" account). The EDPB acknowledges this and holds that the standard is reasonable efforts, not perfect certainty. Adding a secondary check — for instance, rejecting a parental address identical to the child's, or one at a disposable-mail domain — goes beyond the baseline obligation.
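A minimal sketch of such a secondary check — a plausibility heuristic, not identity verification. The disposable-domain list is illustrative only; production systems would use a maintained blocklist:

```python
# Illustrative, not exhaustive — use a maintained blocklist in production
DISPOSABLE_DOMAINS = {"mailinator.com", "guerrillamail.com"}

def parent_email_plausible(child_email: str, parent_email: str) -> bool:
    """Cheap checks that exceed the bare email double-opt-in baseline:
    the parental address must differ from the child's own and must not
    use a known disposable-mail domain. Passing this check is a weak
    positive signal, not proof of parental status."""
    child = child_email.strip().lower()
    parent = parent_email.strip().lower()
    if parent == child:
        return False
    domain = parent.rsplit("@", 1)[-1]
    return domain not in DISPOSABLE_DOMAINS
```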
Python ChildConsentGate Implementation
```python
from dataclasses import dataclass
from datetime import date, datetime, timedelta
from enum import Enum
import hashlib
import secrets


class ConsentStatus(Enum):
    ADULT = "adult"  # above applicable threshold — can consent autonomously (may still be a minor)
    CHILD_PARENTAL_REQUIRED = "child_parental_required"
    CHILD_PARENTAL_PENDING = "child_parental_pending"
    CHILD_PARENTAL_CONFIRMED = "child_parental_confirmed"
    BLOCKED = "blocked"


# Member state minimum ages — conservative defaults where derogation exists
MEMBER_STATE_AGE_THRESHOLDS: dict[str, int] = {
    "DE": 16, "FR": 15, "NL": 16, "AT": 14, "IE": 16,
    "ES": 14, "IT": 14, "SE": 13, "BE": 13, "PL": 16,
    "DK": 13, "FI": 13, "LU": 16, "PT": 13, "CZ": 15,
    "UK": 13,       # post-Brexit, UK GDPR Art.8
    "DEFAULT": 16,  # EU GDPR Art.8(1) default
}

PARENTAL_TOKEN_TTL = timedelta(hours=72)


@dataclass
class ChildConsentRecord:
    user_id: str
    date_of_birth: date
    country_code: str
    status: ConsentStatus
    parental_email: str | None
    parental_consent_token: str | None
    parental_consent_token_expires: datetime | None
    parental_consent_confirmed_at: datetime | None
    consent_log_version: str = "v1"

    def age_at_signup(self) -> int:
        # Calendar-accurate age: days // 365 drifts around birthdays and leap years
        today = date.today()
        age = today.year - self.date_of_birth.year
        if (today.month, today.day) < (self.date_of_birth.month, self.date_of_birth.day):
            age -= 1
        return age

    def applicable_threshold(self) -> int:
        return MEMBER_STATE_AGE_THRESHOLDS.get(
            self.country_code, MEMBER_STATE_AGE_THRESHOLDS["DEFAULT"]
        )

    def requires_parental_consent(self) -> bool:
        return self.age_at_signup() < self.applicable_threshold()


def evaluate_child_consent_gate(
    user_id: str,
    date_of_birth: date,
    country_code: str,
    parental_email: str | None = None,
) -> tuple[ChildConsentRecord, str | None]:
    """Returns the consent record plus the raw token (None unless a
    parental consent flow was started). The caller sends the raw token
    by email; only its hash is stored."""
    record = ChildConsentRecord(
        user_id=user_id,
        date_of_birth=date_of_birth,
        country_code=country_code,
        status=ConsentStatus.BLOCKED,
        parental_email=None,
        parental_consent_token=None,
        parental_consent_token_expires=None,
        parental_consent_confirmed_at=None,
    )
    if not record.requires_parental_consent():
        record.status = ConsentStatus.ADULT
        return record, None
    if parental_email is None:
        record.status = ConsentStatus.CHILD_PARENTAL_REQUIRED
        return record, None
    # Generate one-time parental consent token (72h TTL)
    token = secrets.token_urlsafe(32)
    record.parental_email = parental_email
    record.parental_consent_token = hashlib.sha256(token.encode()).hexdigest()
    record.parental_consent_token_expires = datetime.utcnow() + PARENTAL_TOKEN_TTL
    record.status = ConsentStatus.CHILD_PARENTAL_PENDING
    return record, token


def confirm_parental_consent(
    record: ChildConsentRecord,
    submitted_token: str,
) -> ChildConsentRecord:
    if record.status != ConsentStatus.CHILD_PARENTAL_PENDING:
        raise ValueError("Record is not in pending parental consent state")
    token_hash = hashlib.sha256(submitted_token.encode()).hexdigest()
    # Constant-time comparison to avoid token-guessing via timing
    if not secrets.compare_digest(token_hash, record.parental_consent_token):
        raise ValueError("Invalid parental consent token")
    if datetime.utcnow() > record.parental_consent_token_expires:
        raise ValueError("Parental consent token has expired")
    record.status = ConsentStatus.CHILD_PARENTAL_CONFIRMED
    record.parental_consent_confirmed_at = datetime.utcnow()
    return record
```
What to log in your database:
```sql
CREATE TABLE child_consent_records (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    user_id UUID NOT NULL REFERENCES users(id),
    dob_year SMALLINT NOT NULL,   -- store year only, not full DOB
    dob_month SMALLINT NOT NULL,
    country_code CHAR(2) NOT NULL,
    applicable_age_threshold SMALLINT NOT NULL,
    status TEXT NOT NULL,         -- adult | parental_required | parental_pending | parental_confirmed
    parental_email_hash TEXT,     -- SHA-256 of parental email (not plaintext)
    consent_token_hash TEXT,
    token_expires_at TIMESTAMPTZ,
    parental_confirmed_at TIMESTAMPTZ,
    consent_text_version TEXT NOT NULL DEFAULT 'v1',
    created_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
```
Why store DOB as year+month and not full date: Full DOB is considered special-category-adjacent in some interpretations (especially combined with name/location it becomes highly identifying). Year+month is sufficient to calculate age at signup and reduces sensitivity.
EDPB and Supervisory Authority Guidance
EDPB Guidelines 05/2020 on Consent address Art.8 in their dedicated section on children and clarify:
- "Reasonable efforts" is contextual — a high-risk service for children must do more than a low-risk one
- Self-declaration alone is not "reasonable efforts" for services likely to attract children
- Age estimation via facial analysis counts as biometric processing under Art.9(1) — need an Art.9(2) exception in addition to Art.8 compliance
- Controllers must provide child-friendly privacy notices (Art.12 + Art.13 obligations apply in age-appropriate language)
UK ICO Children's Code (Age Appropriate Design Code): Though UK-specific post-Brexit, this code has been highly influential on EU practice. Its 15 standards (including "best interests of the child", "data minimisation by default", "no nudge techniques") are increasingly expected by EU supervisory authorities even without formal EU binding status.
Irish DPC enforcement against TikTok (September 2023, €345M fine):
- TikTok's "Family Pairing" feature allowed adults to link with any child account without the child's knowledge
- Default settings on child accounts were public
- Processing of child personal data without adequate parental consent verification
- Fine: €345M — findings included Art.5(1)(f) (integrity/confidentiality), Art.25 (data protection by design and by default) and, following the EDPB's binding decision, Art.5(1)(a) (fairness)
Irish DPC enforcement against Meta (Instagram, September 2022, €405M fine):
- Instagram accounts of child users (13-17) defaulted to public
- Child users could operate business accounts whose email addresses and/or phone numbers were published by default
- Findings included Art.5(1)(a) (fairness), Art.12(1), Art.24, Art.25 and Art.35
COPPA Comparison: US COPPA vs EU GDPR Art.8
| Dimension | US COPPA (FTC Rule) | EU GDPR Art.8 |
|---|---|---|
| Age threshold | Under 13 (uniform, no state variance) | 13-16 (MS-dependent; 16 default) |
| Trigger | Service "directed to children" or has actual knowledge child is under 13 | Offer of ISS where consent is lawful basis |
| Parental consent mechanism | Email plus (email + other verification) or more robust methods | "Reasonable efforts" — proportionate |
| Verifiable parental consent | Specific FTC-approved methods (credit card check, video consent, government ID) | No prescribed list — proportionality test |
| Privacy notice requirement | Child-specific COPPA notice | General GDPR Art.13/14 + child-friendly language |
| Penalty | Up to $51,744 per violation per day (2024 figure) | Up to 4% global annual turnover |
| Regulator | FTC (US) | Member State SAs + EDPB |
| B2B exemption | Not applicable (COPPA is consumer-focused) | Art.8 applies only to direct ISS offer — B2B excluded |
Key difference: COPPA is a flat bar — no under-13 can consent for themselves, so verifiable parental consent is always required. GDPR Art.8 is a consent-quality rule: children CAN use services, but consent must meet the Art.8 standard, and a child at or above the applicable national threshold can consent autonomously. In that sense Art.8 is potentially more flexible.
Dual-jurisdiction products (EU+US): If your product is available in both markets, the stricter rule applies per user:
- US user under 13 → COPPA applies (cannot use service without parental consent under FTC rules)
- EU user aged 14 in Spain → can consent autonomously (ES threshold = 14)
- EU user aged 14 in Germany → parental consent required (DE threshold = 16)
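The per-user rule above can be sketched as a regime resolver (US states treated uniformly under COPPA; the EU threshold map is abbreviated and the regime labels are this post's shorthand):

```python
# Abbreviated — see the Member State table for the full list
EU_THRESHOLDS = {"DE": 16, "FR": 15, "ES": 14, "SE": 13}

def consent_regime(country: str, age: int) -> str:
    """Which gate applies to this user: COPPA parental consent
    (US under-13), GDPR Art.8 parental consent (EU below national
    threshold), or autonomous consent."""
    if country == "US":
        return "coppa_parental" if age < 13 else "autonomous"
    threshold = EU_THRESHOLDS.get(country, 16)  # GDPR Art.8(1) default
    return "art8_parental" if age < threshold else "autonomous"
```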
EdTech Implications: Art.8 in School Contexts
Schools deploying SaaS tools for student use create a triangular relationship: the school (data controller or processor), the teacher (data processor or sub-processor), and the student (data subject). Art.8 compliance in EdTech requires:
When the school is the controller (most common):
- The school provides consent on behalf of students through institutional authorization under Art.6(1)(c) (legal obligation to provide education) or Art.6(1)(e) (public task)
- Art.8's parental consent requirement is typically displaced by the institutional legal basis — Art.8 only applies where Art.6(1)(a) (consent) is the lawful basis
- However: if the EdTech tool also processes data for the vendor's own purposes (analytics, product improvement), Art.8's parental consent is required for that additional processing
When the SaaS vendor is the controller:
- If students register directly with their personal email (not via school SSO), Art.8 applies in full
- Age verification via school email domain is a reasonable mitigation: domains like @school.edu.de indicate institutional context
- Best practice: EdTech products requiring direct student registration should use school SSO (Google Workspace for Education, Microsoft 365 Education) to route consent through the institution
GDPR + DSK guidance for EdTech (Germany): The German Datenschutzkonferenz (DSK) has issued guidance that cloud tools used in schools must comply with GDPR including Art.8 where applicable. Schools in DE must verify that vendor DPAs are in place and that the vendor's own use of student data does not trigger Art.8 independently.
Art.8 + Art.7 Combined Obligations
When a child above the applicable threshold gives consent, all Art.7 conditions still apply:
- Art.7(1) Proof burden: The consent log must include the DOB or age assertion that showed the child was above threshold, plus the mechanism used to verify it.
- Art.7(3) Withdrawal: Children must be able to withdraw consent as easily as they gave it — the withdrawal mechanism must be accessible without adult assistance.
- Art.7(4) Bundling prohibition: A child cannot be required to consent to marketing or profiling as a condition of accessing the core service.
Child-friendly transparency under Art.12: Art.12(1) requires information provided to data subjects to be in a "concise, transparent, intelligible and easily accessible form, using clear and plain language." For children, this means age-appropriate language — plain text, no legal jargon, visual aids where possible. The EDPB recommends separate child-specific privacy notices rather than a single adult-targeted document.
Practical Checklist for Developers
| Checkpoint | What to verify |
|---|---|
| ISS determination | Does Art.8 apply to your service? (Consumer + consent-based = yes) |
| Applicable threshold | Which countries do you operate in? Use highest threshold unless geo-gating |
| Age collection | DOB collected at signup? Stored with year/month only? |
| Below-threshold flow | Parental consent flow implemented? Email double-opt-in minimum? |
| Consent log | Log includes: DOB evidence, country code, threshold applied, parental email hash if applicable, confirmation timestamp |
| Withdrawal | Children can withdraw consent without adult assistance? |
| Privacy notice | Child-friendly version exists? Age-appropriate language? |
| Art.7 compliance | Proof burden, no bundling, easy withdrawal — all apply in addition to Art.8 |
| Parental consent record retention | Retain for processing duration + 3 years minimum |
| B2B check | Confirm Art.8 doesn't apply to your B2B service unless children could be direct users |
Common Mistakes
Mistake 1: Assuming 13 is the EU minimum and applying it pan-EU. The GDPR minimum is 13 but the default is 16. Only Member States that have explicitly legislated may use a lower age. If you serve DE, NL, PL, or IE users, the threshold is 16.
Mistake 2: Self-declaration as sufficient verification. Entering a date of birth in a form is not "reasonable efforts" for services likely to attract children. For consumer social apps, gaming, or any service with a primarily young user base, additional verification is required.
Mistake 3: Treating Art.8 as applicable only to services "for children." Art.8 applies wherever children could register. A project-management SaaS marketed to teams is unlikely to attract under-16s; a social networking or gaming platform almost certainly will.
Mistake 4: Parental consent as a one-time event. If you change your privacy notice material terms (new processing purposes, new data categories), existing parental consents may not cover the new processing. Re-consent logic must account for parental consent records when processing changes.
Mistake 5: Forgetting Art.8(3) — national contract law still applies. Art.8 governs consent under GDPR, but a 14-year-old may lack contractual capacity under national law. In Germany, contracts with minors under 18 generally require parental consent under §107-113 BGB — this is a separate obligation from GDPR Art.8.
See Also
- GDPR Art.7: Conditions for Consent — the four conditions that apply to all consent including child consent
- GDPR Art.5: Six Principles — fairness and transparency obligations amplified for child data
- GDPR Art.9: Special Categories — age estimation via biometrics triggers Art.9 in addition to Art.8
- GDPR Art.35: DPIA — processing child data at scale typically triggers mandatory DPIA