GDPR Art.5: The Six Principles of Processing — Complete Developer Guide (2026)
Post #437 in the sota.io EU Cyber Compliance Series
Art.5 is the most important single article in the GDPR for developers. Every consent form, every database schema, every retention policy, every security measure ultimately flows from these principles. The six principles in Art.5(1) plus the accountability obligation in Art.5(2) form the overarching legal framework within which all other GDPR obligations operate. Supervisory authorities use these principles as the primary lens when investigating complaints and issuing fines — most large GDPR fines reference Art.5 violations directly.
The Seven Principles at a Glance
| Art. | Principle | Core Requirement |
|---|---|---|
| Art.5(1)(a) | Lawfulness, fairness, transparency | Legal basis + no deception + inform data subjects |
| Art.5(1)(b) | Purpose limitation | Collect for specified purposes; don't repurpose incompatibly |
| Art.5(1)(c) | Data minimisation | Only what is adequate, relevant, and necessary |
| Art.5(1)(d) | Accuracy | Keep data accurate; rectify or erase inaccuracies |
| Art.5(1)(e) | Storage limitation | No longer than necessary; define retention periods |
| Art.5(1)(f) | Integrity and confidentiality | Appropriate security; protect against unauthorised access/loss |
| Art.5(2) | Accountability | Demonstrate compliance; document and justify |
Art.5(1)(a) — Lawfulness, Fairness, and Transparency
Lawfulness
Processing must have a legal basis from Art.6(1) (or Art.9(2) for special categories). The six lawful bases are:
| Basis | Art. | When to Use |
|---|---|---|
| Consent | Art.6(1)(a) | Freely given, specific, informed, unambiguous agreement (Art.4(11), Art.7) |
| Contract | Art.6(1)(b) | Processing necessary for performance of a contract with the data subject |
| Legal obligation | Art.6(1)(c) | Processing required by EU/Member State law |
| Vital interests | Art.6(1)(d) | Life-threatening situations — narrow emergency use |
| Public task | Art.6(1)(e) | Public authorities exercising official authority |
| Legitimate interests | Art.6(1)(f) | Controller or third-party interest, not overridden by data subject rights |
Common developer mistake: Relying on legitimate interests (LI) as a default basis because it seems flexible. LI requires a three-part test (purpose test, necessity test, balancing test) and is often not available for systematic profiling, marketing, or security monitoring without careful analysis.
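The three-part test lends itself to a structured, reviewable record. A minimal sketch in Python, assuming nothing beyond the standard library; the class and field names are illustrative, not a prescribed LIA format:

```python
from dataclasses import dataclass

@dataclass
class LegitimateInterestAssessment:
    """Illustrative record of the Art.6(1)(f) three-part test."""
    purpose: str                     # purpose test: what interest is pursued?
    purpose_is_legitimate: bool
    necessity_justification: str     # necessity test: why is processing needed?
    less_intrusive_alternative: bool # True if a less intrusive means exists
    balancing_notes: str             # balancing test: impact on data subjects
    rights_override_interest: bool   # True if subject rights override the interest

    def li_available(self) -> bool:
        # LI is only available when all three limbs are satisfied
        return (self.purpose_is_legitimate
                and not self.less_intrusive_alternative
                and not self.rights_override_interest)

lia = LegitimateInterestAssessment(
    purpose="Fraud detection on payment transactions",
    purpose_is_legitimate=True,
    necessity_justification="Transaction-level data needed to detect anomalies",
    less_intrusive_alternative=False,
    balancing_notes="Expected by customers; pseudonymised; limited retention",
    rights_override_interest=False,
)
print(lia.li_available())  # → True
```

The point of the record is accountability (Art.5(2)): a failed limb should block the processing, not just be noted.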
Key case: Fashion ID (C-40/17): a website embedding a third-party social plugin (the Facebook Like button) that transmitted visitor data was joint controller for the collection and transmission stage, even though it could not influence Facebook's subsequent processing (for which it was not controller). Relying on legitimate interests required a separate balancing test for Fashion ID's own purposes.
Fairness
Processing must not be deceptive, surprising, or manipulative. Fairness goes beyond having a legal basis — it requires that processing does not adversely affect data subjects in ways they would not reasonably expect.
Fairness violations in practice:
- Collecting data for a stated purpose but actually using it for a different undisclosed purpose (dark patterns)
- Setting default settings that maximise data collection without clear disclosure
- Burying material processing information in terms of service that no reasonable person reads
- Using profiling to systematically exclude vulnerable groups from services without transparency
- Making consent effectively mandatory by denying service to those who refuse
EDPB Guidelines 3/2022 on Dark Patterns specifically address fairness: consent flows that use visual hierarchies, misleading wording, or pre-ticked boxes violate Art.5(1)(a) independently of Art.7 consent requirements.
Transparency
Data subjects must be informed about processing (Art.12-14). Transparency as an Art.5 principle means:
- Privacy notices must be written in plain language
- Processing must not be hidden or disguised
- Changes to processing must be communicated proactively
- Automated decision-making (Art.22) requires specific disclosure
The transparency principle also requires that data subjects can verify the accuracy of information about them (Art.15 access right) — making transparency and accuracy principles mutually reinforcing.
Art.5(1)(b) — Purpose Limitation
The Core Rule
Personal data must be:
- Collected for specified, explicit, and legitimate purposes
- Not further processed in a manner incompatible with those purposes
"Specified" means purposes must be defined before collection begins — you cannot collect data and decide later what to do with it. "Explicit" means communicated clearly in the privacy notice. "Legitimate" connects back to Art.5(1)(a) lawfulness.
The Compatibility Test
Further processing for a different purpose is not automatically prohibited — it depends on whether the new purpose is compatible with the original. Art.6(4) sets out the compatibility factors:
| Factor | Art.6(4) | Questions to Ask |
|---|---|---|
| Link between purposes | Art.6(4)(a) | How closely related is the new purpose to the original? |
| Context of collection | Art.6(4)(b) | What was the relationship/context when data was collected? |
| Nature of data | Art.6(4)(c) | More sensitive data = stricter compatibility analysis |
| Consequences | Art.6(4)(d) | What are the likely effects on data subjects? |
| Safeguards | Art.6(4)(e) | Are encryption, pseudonymisation, or other measures applied? |
Compatible: Processing customer transaction data to detect fraud (related to the original service purpose, expected by customers)
Incompatible: Processing customer transaction data to sell to third-party data brokers (unrelated commercial purpose, unexpected, no safeguards)
Presumed compatible: under Art.5(1)(b), further processing for archiving in the public interest, scientific or historical research, or statistical purposes is not considered incompatible with the initial purposes, subject to the Art.89(1) safeguards.
Developer Implications
- Define data purposes in ROPA (Art.30) before building the feature
- Document the compatibility analysis when repurposing data
- Secondary analytics incompatible with the collection purpose require anonymisation or a fresh legal basis
- ML model training on operational data requires compatibility analysis or separate consent
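The Art.6(4) factors above can likewise be documented as a structured record. A hedged sketch: the boolean reduction in `likely_compatible` is a crude illustration of how the factors interact, not a substitute for a written assessment:

```python
from dataclasses import dataclass

@dataclass
class CompatibilityAssessment:
    """Illustrative Art.6(4) record; fields mirror factors (a)-(e)."""
    link_between_purposes: bool   # Art.6(4)(a): new purpose related to original?
    context_consistent: bool      # Art.6(4)(b): within the collection context?
    special_category_data: bool   # Art.6(4)(c): stricter analysis if True
    adverse_consequences: bool    # Art.6(4)(d): likely harm to data subjects?
    safeguards_applied: bool      # Art.6(4)(e): pseudonymisation, encryption...

    def likely_compatible(self) -> bool:
        # Adverse consequences weigh decisively against compatibility;
        # special category data demands safeguards at a minimum.
        if self.adverse_consequences:
            return False
        if self.special_category_data and not self.safeguards_applied:
            return False
        return self.link_between_purposes and self.context_consistent

fraud = CompatibilityAssessment(True, True, False, False, True)
print(fraud.likely_compatible())  # → True: fraud detection on transaction data
```

A data-broker resale scenario (unrelated purpose, adverse consequences, no safeguards) would fail the same check, matching the incompatible example above.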
Art.5(1)(c) — Data Minimisation
What It Requires
Data must be adequate (sufficient to serve the purpose), relevant (related to the purpose), and limited to what is necessary (minimum needed to achieve the purpose).
The "minimum necessary" test applies at collection and at every subsequent processing step. Data that was necessary at collection may become unnecessary over time — see storage limitation below.
Common Violations
| Pattern | Why It Violates Data Minimisation |
|---|---|
| Collecting date of birth when only an age check (18+) is needed | A full date of birth is more than necessary for age verification |
| Logging full request bodies including personal data in debugging logs | Detailed logs often exceed what is necessary for error diagnosis |
| Collecting phone number as mandatory field for non-telephony services | Not relevant/necessary if the service has no phone-based functionality |
| Requiring full name where a display name suffices | Pseudonymous systems can achieve the same purpose |
| Retaining all user-generated content indefinitely "just in case" | Exceeds necessity once the operational purpose is served |
Developer Design Patterns
Privacy by Design (Art.25) operationalises data minimisation at the architecture level:
- Selective disclosure: Verify attributes (e.g., "is over 18") instead of collecting the underlying data
- Pseudonymisation by default: Use internal IDs instead of names/emails wherever possible
- Aggregation over raw data: If analytics needs are served by aggregate counts, don't store individual records
- Tiered logging: Log categories of events, not full payloads; mask personal data in debug logs
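For the tiered-logging pattern, masking can happen before a line is ever written. A minimal sketch; the regex patterns (`EMAIL_RE`, `IP_RE`) are illustrative and would need tuning against a real data model:

```python
import re

# Illustrative patterns — a production masker would cover the full set of
# personal-data fields that can appear in this system's logs.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
IP_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def mask_personal_data(log_line: str) -> str:
    """Mask emails and IPv4 addresses before a line reaches debug logs."""
    line = EMAIL_RE.sub("[email]", log_line)
    return IP_RE.sub("[ip]", line)

print(mask_personal_data("login failed for alice@example.com from 192.168.1.10"))
# → login failed for [email] from [ip]
```

Wiring this into a logging `Filter` keeps the minimisation guarantee in one place rather than relying on every call site.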
Art.5(1)(d) — Accuracy
The Requirement
Personal data must be accurate and, where necessary, kept up to date. Every reasonable step must be taken to ensure that inaccurate personal data, having regard to the purposes for which they are processed, are erased or rectified without delay.
Accuracy Obligations by Data Type
| Data Type | Accuracy Standard | Practical Implication |
|---|---|---|
| Contact details (email, address) | High — affects service delivery | Verify at collection; allow easy update; bounce-based cleanup |
| Financial data (account, credit) | Very high — affects decisions | Source-of-truth reconciliation; audit trail for corrections |
| Health/medical data (Art.9) | Very high — affects treatment | Source system authority; correction procedure with audit |
| Behavioural/preference data | Moderate — inferred data acknowledged as inferred | Label inferred vs declared; allow correction of inputs |
| Historical records | Low — accuracy at time of recording | Annotation of disputed records rather than deletion |
Rectification Right (Art.16)
Art.5(1)(d) underpins the Art.16 right to rectification. Data subjects may require correction of inaccurate personal data and completion of incomplete personal data. Developers must:
- Provide self-service correction for most profile fields
- Implement backend update propagation (correct data in all processing systems, not just the UI)
- Process third-party source corrections within the Art.12(3) one-month timeframe
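Backend update propagation can be sketched as a registry of per-system correctors, so one rectification call reaches every store. The registry and function names below are hypothetical; production systems would more likely use an event or outbox pattern for reliability:

```python
from typing import Callable

# Hypothetical registry: each entry knows how to correct one downstream system.
propagators: list[Callable[[str, str, str], None]] = []

def register_propagator(fn: Callable[[str, str, str], None]):
    propagators.append(fn)
    return fn

def rectify(user_id: str, field: str, new_value: str) -> None:
    """Apply an Art.16 correction to every system holding the field."""
    for propagate in propagators:
        propagate(user_id, field, new_value)

@register_propagator
def update_profile_db(user_id, field, new_value):
    print(f"profile_db: {user_id}.{field} = {new_value}")

@register_propagator
def update_search_index(user_id, field, new_value):
    print(f"search_index: reindexed {user_id}")

rectify("u-123", "email", "new@example.com")
```

The design point is that correcting only the UI-facing store leaves stale copies elsewhere, which is itself an accuracy violation.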
Key consideration: Accuracy does not mean always reflecting the data subject's preferred narrative. A controller who believes a record is accurate but the data subject disputes it may annotate the dispute (Recital 65) rather than deleting or changing the record.
Art.5(1)(e) — Storage Limitation
The Core Rule
Personal data must be kept in a form that permits identification of data subjects for no longer than is necessary for the purposes for which the data are processed.
This principle does not specify retention periods — the controller must define them based on purpose necessity. Longer retention requires justification; indefinite retention is almost never justifiable.
Permitted Extended Retention
Art.5(1)(e) carves out extended retention for data processed solely for:
| Purpose | Safeguards Required |
|---|---|
| Archiving in the public interest | Art.89(1) technical and organisational safeguards |
| Scientific or historical research | Art.89(1) safeguards + anonymisation where possible |
| Statistical purposes | Art.89(1) safeguards; data should not be used to take decisions about individuals |
Anonymisation — where re-identification is not reasonably possible — takes data outside the GDPR entirely. Pseudonymised data (key held separately) remains personal data and subject to storage limitation.
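A common pseudonymisation sketch uses a keyed HMAC, so the mapping is stable but not reversible without the key. The key value below is a placeholder: it must be held in a separate secrets store, away from the pseudonymised dataset, for the separation Art.4(5) requires:

```python
import hmac
import hashlib

# Placeholder — in practice, load from a secrets store held separately
# from the pseudonymised data (Art.4(5) "kept separately" condition).
PSEUDONYMISATION_KEY = b"load-from-a-separate-secrets-store"

def pseudonymise(identifier: str) -> str:
    """Derive a stable pseudonym. A keyed HMAC rather than a bare hash,
    so the mapping cannot be recovered by hashing guessed inputs."""
    digest = hmac.new(PSEUDONYMISATION_KEY, identifier.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

# Stable: the same input always maps to the same pseudonym
print(pseudonymise("alice@example.com") == pseudonymise("alice@example.com"))  # → True
```

Remember the principle stated above: the output is still personal data while the key exists, so storage limitation continues to apply.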
Retention Period Design
Purpose → Legal Basis → Retention Period → Review/Deletion Mechanism
Examples of legally-grounded retention:
- Employment data: 6 years post-employment (UK) / statutory limitation period for employment claims
- Tax records: 7 years (most EU jurisdictions) — legal obligation basis
- Contract performance data: duration of contract + limitation period (typically 3-5 years)
- Marketing consent: until withdrawal; CNIL guidance recommends reviewing prospect data after three years of inactivity
- Audit logs: typically 1-3 years depending on regulatory sector
Developer Implementation
- Define retention periods in ROPA (Art.30) alongside the processing purpose
- Implement automated deletion jobs triggered by retention period expiry
- Log deletion events for accountability (Art.5(2)) purposes
- Test deletion coverage: ensure cascaded deletes hit all tables/systems containing the data
- Backup deletion: ensure data deleted from live systems is also removed from backups within the retention window
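The automated-deletion bullet can be sketched against a relational store. Everything below is illustrative: the table names, the `deletion_log` accountability table, and the `created_at` column are assumptions, and the retention map would be derived from the ROPA:

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# Retention periods in days, per data category — sourced from the ROPA.
RETENTION = {"analytics_events": 365, "audit_log": 1095}

def run_retention_job(conn: sqlite3.Connection) -> dict[str, int]:
    """Delete rows past their retention period and record the deletion
    event itself as Art.5(2) accountability evidence."""
    deleted = {}
    now = datetime.now(timezone.utc)
    for table, days in RETENTION.items():
        cutoff = (now - timedelta(days=days)).isoformat()
        # Table names come from trusted config above, never user input.
        cur = conn.execute(f"DELETE FROM {table} WHERE created_at < ?", (cutoff,))
        deleted[table] = cur.rowcount
        conn.execute(
            "INSERT INTO deletion_log(table_name, rows, ran_at) VALUES (?,?,?)",
            (table, cur.rowcount, now.isoformat()),
        )
    conn.commit()
    return deleted
```

Scheduling this daily and alerting on failures turns the retention policy into an operational control rather than a document.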
Art.5(1)(f) — Integrity and Confidentiality
What It Requires
Personal data must be processed in a manner that ensures appropriate security, including protection against:
- Unauthorised or unlawful processing
- Accidental loss, destruction, or damage
These protections must be delivered through appropriate technical and organisational measures (TOMs).
Connection to Art.32
Art.5(1)(f) is the principle; Art.32 is the implementation requirement. Art.32 requires:
- Pseudonymisation and encryption of personal data
- Ability to ensure ongoing confidentiality, integrity, availability, and resilience
- Ability to restore access to data in a timely manner after an incident
- Regular testing and evaluation of TOMs
The "appropriate" standard is risk-based: the measures must be proportionate to the risk to data subjects. High-risk processing (Art.9 data, large-scale profiling) demands stronger measures than low-risk processing.
TOMs by Risk Tier
| Risk Level | Data Type | Minimum TOMs |
|---|---|---|
| Low | General contact data, names | Encryption at rest+transit, access control, patching |
| Medium | Financial, location, behavioural | + Pseudonymisation, MFA, audit logging, penetration testing |
| High | Health, biometric, genetic (Art.9) | + End-to-end encryption, DPA (Art.28), DPIA (Art.35) |
| Critical | Large-scale Art.9 or vulnerable subjects | + Privacy-by-design architecture review, SA consultation (Art.36) |
EU Hosting and Art.5(1)(f)
Infrastructure jurisdiction directly affects integrity and confidentiality:
| Risk | EU Infrastructure | Non-EU Infrastructure |
|---|---|---|
| CLOUD Act compelled disclosure | Not applicable to EU-only providers | US-based providers subject to DOJ orders |
| FISA Section 702 orders | Not applicable | US providers subject to NSA/FBI directives |
| Art.44 transfer prohibition | N/A (no transfer) | Transfers must meet Art.44-49 safeguards |
| Incident notification Art.33 | SA in establishment country | Must identify lead SA + cross-border mechanism |
For SaaS companies processing EU personal data, choosing EU-native infrastructure (such as sota.io) eliminates the structural Art.5(1)(f) risk created by non-EU provider obligations to non-EU governments — a risk that cannot be fully mitigated by SCCs or BCRs when national security orders are involved (Schrems II, C-311/18, paragraph 197).
Art.5(2) — Accountability
The Accountability Principle
The controller is responsible for compliance with Art.5(1) and must be able to demonstrate compliance.
This is not just a documentation requirement — it is a substantive obligation. The "able to demonstrate" requirement means:
- Compliance documentation must exist before a complaint or investigation
- Documentation must be accurate and up to date
- Verbal or informal compliance does not satisfy the principle
How Controllers Demonstrate Compliance
| Mechanism | Link to Art.5(2) |
|---|---|
| Records of Processing Activities (Art.30) | Documents purposes, legal bases, retention, TOMs |
| Data Protection Impact Assessments (Art.35) | Documents risk assessment for high-risk processing |
| Privacy by Design (Art.25) | Documents technical measures implementing Art.5(1) principles |
| Staff training records | Documents organisational measures |
| Data Processor Agreements (Art.28) | Documents accountability chain for processors |
| Audit logs | Documents that TOMs are actually operating |
| Incident response records | Documents Art.5(1)(f) measures and breach response |
| Consent records | Documents lawfulness basis under Art.6(1)(a) |
Accountability in Practice
Accountability operates at three layers:
Layer 1 — Internal Compliance
- Assign clear ownership for data processing activities
- Conduct regular reviews of processing purposes against actual use
- Document DPIA outcomes and decisions made
Layer 2 — Third-Party Chain
- Ensure data processing agreements with all processors (Art.28)
- Conduct due diligence on processor sub-processors (Art.28(2))
- Document transfers and the transfer mechanism (Art.44-49)
Layer 3 — Supervisory Authority
- Maintain ROPA (Art.30) — first document requested in any investigation
- Provide prompt, accurate responses to SA information requests (Art.58)
- Cooperate with the Art.60 one-stop-shop cooperation procedure in cross-border cases
Key: Accountability is the principle that makes all the other principles enforceable. Without accountability, a controller could claim compliance without substantiating it. Art.5(2) inverts the burden: the controller must prove compliance, not the SA prove non-compliance.
Python Example: GDPRPrinciplesChecker
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional

class Principle(Enum):
    LAWFULNESS_FAIRNESS_TRANSPARENCY = "Art.5(1)(a)"
    PURPOSE_LIMITATION = "Art.5(1)(b)"
    DATA_MINIMISATION = "Art.5(1)(c)"
    ACCURACY = "Art.5(1)(d)"
    STORAGE_LIMITATION = "Art.5(1)(e)"
    INTEGRITY_CONFIDENTIALITY = "Art.5(1)(f)"
    ACCOUNTABILITY = "Art.5(2)"

@dataclass
class ProcessingActivity:
    name: str
    legal_basis: Optional[str]  # "consent" | "contract" | "legal_obligation" | "vital_interests" | "public_task" | "legitimate_interests"
    purposes: list[str]
    data_categories: list[str]
    retention_days: Optional[int]
    encryption_at_rest: bool
    encryption_in_transit: bool
    access_controlled: bool
    documented_in_ropa: bool
    privacy_notice_updated: bool
    deletion_mechanism: bool
    accuracy_update_mechanism: bool
    dpa_with_processors: bool

@dataclass
class PrincipleResult:
    principle: Principle
    passed: bool
    findings: list[str] = field(default_factory=list)
    recommendations: list[str] = field(default_factory=list)

class GDPRPrinciplesChecker:
    def check(self, activity: ProcessingActivity) -> list[PrincipleResult]:
        return [
            self._check_lawfulness(activity),
            self._check_purpose_limitation(activity),
            self._check_data_minimisation(activity),
            self._check_accuracy(activity),
            self._check_storage_limitation(activity),
            self._check_integrity(activity),
            self._check_accountability(activity),
        ]

    def _check_lawfulness(self, a: ProcessingActivity) -> PrincipleResult:
        findings, recommendations = [], []
        valid_bases = {"consent", "contract", "legal_obligation", "vital_interests", "public_task", "legitimate_interests"}
        if not a.legal_basis:
            findings.append("No legal basis specified")
            recommendations.append("Identify Art.6(1) basis before processing begins")
            return PrincipleResult(Principle.LAWFULNESS_FAIRNESS_TRANSPARENCY, False, findings, recommendations)
        if a.legal_basis not in valid_bases:
            findings.append(f"Invalid legal basis: {a.legal_basis}")
            recommendations.append("Use one of the six Art.6(1) bases")
        if not a.privacy_notice_updated:
            findings.append("Privacy notice not updated for this processing")
            recommendations.append("Update Art.13/14 notice to include this processing")
        passed = a.legal_basis in valid_bases and a.privacy_notice_updated
        return PrincipleResult(Principle.LAWFULNESS_FAIRNESS_TRANSPARENCY, passed, findings, recommendations)

    def _check_purpose_limitation(self, a: ProcessingActivity) -> PrincipleResult:
        findings, recommendations = [], []
        if not a.purposes:
            findings.append("No purposes specified")
            recommendations.append("Define specific, explicit purposes before collection")
            return PrincipleResult(Principle.PURPOSE_LIMITATION, False, findings, recommendations)
        if len(a.purposes) == 1 and a.purposes[0].lower() in {"service", "operations", "business"}:
            findings.append("Purposes are too vague — not 'specified and explicit'")
            recommendations.append("Break down into concrete, granular purposes (e.g. 'fraud detection on payment transactions')")
        passed = len(a.purposes) > 0 and all(len(p) > 10 for p in a.purposes)
        return PrincipleResult(Principle.PURPOSE_LIMITATION, passed, findings, recommendations)

    def _check_data_minimisation(self, a: ProcessingActivity) -> PrincipleResult:
        findings, recommendations = [], []
        HIGH_SENSITIVITY_CATEGORIES = {"health", "biometric", "genetic", "religion", "political", "sexual_orientation", "criminal"}
        sensitive = [c for c in a.data_categories if any(h in c.lower() for h in HIGH_SENSITIVITY_CATEGORIES)]
        if sensitive:
            findings.append(f"Special category data present: {sensitive} — apply strict minimisation")
            recommendations.append("Document why each special category field is strictly necessary")
        passed = len(findings) == 0
        return PrincipleResult(Principle.DATA_MINIMISATION, passed, findings, recommendations)

    def _check_accuracy(self, a: ProcessingActivity) -> PrincipleResult:
        findings, recommendations = [], []
        if not a.accuracy_update_mechanism:
            findings.append("No mechanism for data subjects to update/correct their data")
            recommendations.append("Implement Art.16 rectification: self-service profile editing + backend propagation")
        passed = a.accuracy_update_mechanism
        return PrincipleResult(Principle.ACCURACY, passed, findings, recommendations)

    def _check_storage_limitation(self, a: ProcessingActivity) -> PrincipleResult:
        findings, recommendations = [], []
        if a.retention_days is None:
            findings.append("No retention period defined")
            recommendations.append("Define retention period in ROPA based on purpose necessity")
            return PrincipleResult(Principle.STORAGE_LIMITATION, False, findings, recommendations)
        if not a.deletion_mechanism:
            findings.append("No automated deletion mechanism implemented")
            recommendations.append("Implement deletion job triggered at retention_days expiry")
        # Flag potentially excessive retention
        if a.retention_days > 2555:  # > 7 years
            findings.append(f"Retention of {a.retention_days} days (>{a.retention_days // 365} years) — document justification")
            recommendations.append("Ensure legal obligation or archiving exception applies; document in ROPA")
        passed = a.retention_days is not None and a.deletion_mechanism
        return PrincipleResult(Principle.STORAGE_LIMITATION, passed, findings, recommendations)

    def _check_integrity(self, a: ProcessingActivity) -> PrincipleResult:
        findings, recommendations = [], []
        if not a.encryption_at_rest:
            findings.append("Data not encrypted at rest")
            recommendations.append("Implement AES-256 or equivalent for database and file storage")
        if not a.encryption_in_transit:
            findings.append("Data not encrypted in transit")
            recommendations.append("Enforce TLS 1.2+ for all data transmission")
        if not a.access_controlled:
            findings.append("No access control documented")
            recommendations.append("Implement RBAC; document in TOMs register")
        passed = a.encryption_at_rest and a.encryption_in_transit and a.access_controlled
        return PrincipleResult(Principle.INTEGRITY_CONFIDENTIALITY, passed, findings, recommendations)

    def _check_accountability(self, a: ProcessingActivity) -> PrincipleResult:
        findings, recommendations = [], []
        if not a.documented_in_ropa:
            findings.append("Processing not documented in ROPA (Art.30)")
            recommendations.append("Add entry to Records of Processing Activities immediately")
        if not a.dpa_with_processors:
            findings.append("No data processing agreements with processors confirmed")
            recommendations.append("Sign Art.28 DPAs with all processors before data is shared")
        passed = a.documented_in_ropa and a.dpa_with_processors
        return PrincipleResult(Principle.ACCOUNTABILITY, passed, findings, recommendations)

    def report(self, activity: ProcessingActivity) -> str:
        results = self.check(activity)
        passed = sum(1 for r in results if r.passed)
        lines = [f"GDPR Art.5 Principles Check: {activity.name}", f"Score: {passed}/{len(results)} principles satisfied", ""]
        for r in results:
            status = "✅" if r.passed else "❌"
            lines.append(f"{status} {r.principle.value} — {r.principle.name.replace('_', ' ').title()}")
            for f in r.findings:
                lines.append(f"  ⚠ {f}")
            for rec in r.recommendations:
                lines.append(f"  → {rec}")
        return "\n".join(lines)

# Usage example
checker = GDPRPrinciplesChecker()
user_analytics = ProcessingActivity(
    name="User Behaviour Analytics",
    legal_basis="legitimate_interests",
    purposes=["Improve product UX by analysing navigation patterns", "Detect and prevent fraudulent account activity"],
    data_categories=["user_id", "page_views", "click_events", "session_duration", "ip_address"],
    retention_days=365,
    encryption_at_rest=True,
    encryption_in_transit=True,
    access_controlled=True,
    documented_in_ropa=True,
    privacy_notice_updated=True,
    deletion_mechanism=True,
    accuracy_update_mechanism=False,  # Analytics data — no correction mechanism
    dpa_with_processors=True,
)
print(checker.report(user_analytics))
# Output:
# GDPR Art.5 Principles Check: User Behaviour Analytics
# Score: 6/7 principles satisfied
# ✅ Art.5(1)(a) — Lawfulness Fairness Transparency
# ✅ Art.5(1)(b) — Purpose Limitation
# ✅ Art.5(1)(c) — Data Minimisation
# ❌ Art.5(1)(d) — Accuracy
# ⚠ No mechanism for data subjects to update/correct their data
# → Implement Art.16 rectification: self-service profile editing + backend propagation
# ✅ Art.5(1)(e) — Storage Limitation
# ✅ Art.5(1)(f) — Integrity Confidentiality
# ✅ Art.5(2) — Accountability
Note on analytics data and accuracy: Behavioural analytics data is often inferred rather than declared. Controllers may label inferred data as such and provide correction of inputs (e.g. correcting an incorrectly attributed user action) rather than the inferred output — satisfying Art.5(1)(d) proportionately.
How All Seven Principles Interact
The principles are not independent checkboxes — they form an integrated framework:
Art.5(1)(a) Lawfulness → requires legal basis (Art.6) for Art.5(1)(b)–(f) to apply lawfully
Art.5(1)(b) Purpose Limitation → defines the scope within which Art.5(1)(c)–(e) operate
Art.5(1)(c) Minimisation → limits what data Art.5(1)(d)–(f) must protect and manage
Art.5(1)(d) Accuracy → interacts with Art.16 (rectification) and Art.15 (access)
Art.5(1)(e) Storage Limitation → defines when Art.5(1)(f) protection obligations end
Art.5(1)(f) Integrity → technical implementation of all other principles (security of data processed lawfully, for limited purposes, etc.)
Art.5(2) Accountability → overarching obligation to document and demonstrate all of the above
A useful shorthand: Art.5(1)(a)-(e) define what you may process; Art.5(1)(f) defines how you must protect it; Art.5(2) requires you to prove you did both.
Art.5 in Enforcement: What Gets Fined
Reviewing EDPB and SA fine decisions through 2025, Art.5 violations appear in the majority of major fines, typically in combination:
| Fine | Controller | Art.5 Violation | Amount |
|---|---|---|---|
| Meta (behavioural ads) | Meta Platforms Ireland | Art.5(1)(a) — invalid legal basis for behavioural advertising | €390M (a separate €1.2B fine addressed Art.46 transfers) |
| TikTok Children | TikTok Ltd | Art.5(1)(a)(c) — unfair processing of minors, excess data | €345M |
| Amazon GDPR | Amazon Europe | Art.5(1)(a) — unlawful targeted advertising | €746M |
| Marriott | Marriott International | Art.5(1)(f) — inadequate security (data breach) | £18.4M |
| Google Analytics | Multiple SAs | Art.5(1)(f) — data transferred to US without adequate safeguards | €100K–€500K each |
| H&M Employees | H&M Hennes | Art.5(1)(a)(c) — excessive monitoring of employees | €35.3M |
Key enforcement pattern: Art.5(1)(a) (unlawful processing) and Art.5(1)(f) (security failure) are the most frequently cited in large fines. Purpose limitation and storage limitation violations tend to accompany lawfulness violations rather than standing alone.
Art.5 Compliance Checklist for Developers
- Every processing activity has a documented Art.6(1) legal basis in ROPA
- Privacy notice (Art.13/14) covers all current processing purposes
- No consent obtained by bundling with terms — Art.7(4) considered
- Processing purposes are specific and explicit, not general ("business operations")
- New use cases go through compatibility analysis before implementation
- ML model training assessed for purpose compatibility with original data collection
- Data fields collected are mapped to specific purposes — no "we might need it" fields
- Special category data fields (Art.9) documented with explicit necessity justification
- User-facing correction mechanism implemented (Art.16 / Art.5(1)(d))
- Retention period defined for each data category in ROPA
- Automated deletion job tested and verified to cascade to all systems
- Backup deletion included in retention policy
- Encryption at rest and in transit implemented for all personal data
- Access controls documented with role-based access model
- Art.28 DPAs signed with all processors (cloud provider, analytics, email, payment)
- ROPA (Art.30) entry created for every processing activity
- DPIA (Art.35) completed for high-risk processing
- Audit log captures deletion events for accountability evidence
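The "test deletion coverage" item in the checklist can be automated. A sketch using SQLite foreign-key cascades; the schema and table names are illustrative, and a real test would iterate over every table known to hold personal data:

```python
import sqlite3

def test_deletion_cascades_to_all_tables():
    """Deleting a user must remove their rows from every table that
    references them — orphaned rows are a storage-limitation failure."""
    conn = sqlite3.connect(":memory:")
    conn.execute("PRAGMA foreign_keys = ON")  # must be enabled per connection
    conn.execute("CREATE TABLE users(id INTEGER PRIMARY KEY)")
    conn.execute("""CREATE TABLE events(id INTEGER PRIMARY KEY,
                    user_id INTEGER REFERENCES users(id) ON DELETE CASCADE)""")
    conn.execute("INSERT INTO users(id) VALUES (1)")
    conn.execute("INSERT INTO events(user_id) VALUES (1)")
    conn.execute("DELETE FROM users WHERE id = 1")
    remaining = conn.execute(
        "SELECT COUNT(*) FROM events WHERE user_id = 1").fetchone()[0]
    assert remaining == 0, "orphaned personal data left after user deletion"

test_deletion_cascades_to_all_tables()
```

Running this in CI catches the common failure mode where a new table holding personal data is added without a cascade or an explicit deletion step.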
GDPR Chapter I Navigation
- Art.1–4 (Scope, Definitions, Territorial Reach): GDPR Art.1–4 Developer Guide
- Art.5 (Six Principles): This post
- Art.6 (Lawful Bases): GDPR Art.6 Developer Guide
- Art.7 (Consent Conditions): Four-part consent validity test + withdrawal
- Art.9 (Special Categories): Enhanced obligations for sensitive data
Related GDPR Posts: