2026-05-04 · 11 min read

Using the OpenAI API in 2026: GDPR, CLOUD Act and EU AI Act Obligations for EU Developers

OpenAI's API is the backbone of thousands of European applications. It powers customer-facing chatbots, internal productivity tools, document analysis pipelines, and code generation assistants. The developer experience is excellent. The compliance picture is complicated.

If your application sends user data — names, emails, message content, behavioral logs — to the OpenAI API, you are executing a cross-border personal data transfer to a US company. That transfer is subject to GDPR Chapter V. It is also subject to the CLOUD Act. And starting August 2, 2026, if your application surfaces AI-generated content to users or makes automated decisions, the EU AI Act adds a third layer of obligations.

This guide covers what each framework requires, what OpenAI's Data Processing Agreement actually protects, and what the compliance gaps look like in practice.


When an EU developer calls client.chat.completions.create() with user-submitted content, three regulatory frameworks are simultaneously activated:

  1. GDPR — governs the personal data transfer to OpenAI's US infrastructure
  2. CLOUD Act — determines whether the US government can compel OpenAI to disclose your users' data
  3. EU AI Act — governs your obligations as a deployer of an AI system built on a General-Purpose AI model, starting August 2026

Understanding how these frameworks interact is essential. They do not cancel each other out — they stack.
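Concretely, the triggering event is nothing more exotic than building the request. A minimal sketch with hypothetical support-ticket content (the actual network call is commented out because it requires an API key):

```python
# The moment user-submitted content enters the messages list, the request
# becomes a cross-border personal data transfer under GDPR Chapter V.
# The ticket text below is an illustrative assumption, not real data.
user_message = "Hi, I'm Anna Meier (anna.meier@example.de). My order #4417 never arrived."

messages = [
    {"role": "system", "content": "You are a customer support assistant."},
    {"role": "user", "content": user_message},  # name + email: personal data under Art. 4(1)
]

# The actual call (requires an API key; shown for completeness):
# from openai import OpenAI
# client = OpenAI()
# response = client.chat.completions.create(model="gpt-4o", messages=messages)
```

Nothing about the endpoint's statelessness changes the analysis: the personal data leaves your jurisdiction the moment the request is sent.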


Part 1: GDPR and the OpenAI API

Is the OpenAI API a Personal Data Transfer?

Almost certainly yes, for most real-world applications.

Article 4(1) GDPR defines personal data broadly as any information relating to an identified or identifiable natural person. If your prompt includes:

  - a user's name, email address, or other direct identifier
  - the content of messages, tickets, or documents written by an identifiable person
  - behavioral logs or usage data tied to an account

...then the API call processes personal data. The fact that you are calling an API endpoint rather than storing data in a database does not change this classification.

What Category of Transfer Is This?

OpenAI, Inc. is incorporated in Delaware and headquartered in San Francisco. Its servers are located in the United States (and Microsoft Azure regions globally). Sending personal data from an EU application to OpenAI's API constitutes a transfer to a third country under GDPR Chapter V.

For this transfer to be lawful, you need one of the transfer mechanisms in Articles 44-49:

  - An adequacy decision (Art. 45). For US recipients, this means certification under the EU-US Data Privacy Framework (DPF).
  - Appropriate safeguards (Art. 46), most commonly Standard Contractual Clauses (SCCs).
  - Binding corporate rules (Art. 47), relevant mainly for intra-group transfers.
  - Derogations for specific situations (Art. 49), which are narrow and unsuitable as a routine basis.

OpenAI's current approach combines DPF participation with SCC coverage. This provides a legal basis for the transfer — but a legal basis for the transfer mechanism is not the same as protection against US government access.

What OpenAI's Data Processing Agreement Covers

OpenAI provides a DPA at privacy.openai.com/policies. Key commitments track the Article 28(3) GDPR requirements:

  - processing personal data only on your documented instructions
  - confidentiality obligations for personnel with access to the data
  - flow-down of equivalent terms to sub-processors
  - assistance with data subject requests and security obligations
  - deletion or return of personal data at the end of the engagement

By default, OpenAI states that API data is not used for model training (unlike ChatGPT consumer data, which has different terms). API inputs and outputs are retained for a limited period (currently 30 days by default) for abuse detection and then deleted.

What the DPA does not change: OpenAI's obligations under US law, including the CLOUD Act and FISA Section 702.

The CLOUD Act Problem

18 U.S.C. § 2713 — the CLOUD Act — requires US-based electronic communications service providers to comply with lawful government data demands regardless of where data is physically stored. OpenAI is a US company. Its API processes and temporarily stores your users' data. A US federal court can compel OpenAI to produce that data.

The DPF and SCCs are contractual arrangements between OpenAI and you as a data controller. They do not modify OpenAI's obligations to the US government. When a CLOUD Act order conflicts with OpenAI's contractual commitments to you, US law takes precedence.

The FISA Section 702 exposure is broader still. Under 50 U.S.C. § 1881a, US intelligence agencies can compel US electronic communications service providers — including OpenAI — to assist in foreign intelligence collection against non-US persons located outside the United States. No individual court order is required for each target. Notification rights under the DPA do not apply.

Practical risk assessment for your DPO:

  - Likelihood: a US government demand targeting your specific users' data is, for most applications, a low-probability event.
  - Impact: depends on data sensitivity; Art. 9 special categories raise the stakes considerably.
  - Mitigation: data minimization in prompts reduces what any order can reach.
  - Residual risk: structural and non-zero; it must be documented and explicitly accepted.


Part 2: EU AI Act — Your Obligations as an OpenAI API Deployer

The EU AI Act applies to a wider set of actors than many developers realize. If you build an application that calls the OpenAI API and presents outputs to users or uses them in automated processes, you are likely a deployer under the Act — and several obligations apply starting August 2, 2026.

Who Is a "Deployer" Under the EU AI Act?

Article 3(4) defines a deployer as "a natural or legal person, public authority, agency or other body using an AI system under its authority", except where the system is used in the course of a personal, non-professional activity.

Building a product on top of the OpenAI API makes you the deployer of an AI system built on OpenAI's GPAI (General-Purpose AI) model. OpenAI is the provider of the model. You are the deployer of the system.

What Applies to You Starting August 2, 2026

Article 50 — Transparency Obligations for AI-Generated Content

If your application generates AI content that is presented to users, Article 50 requires:

  1. Disclosure when users interact with AI systems — your chatbot or AI assistant must be identified as AI to users, unless it is obvious from context
  2. AI-generated content marking — content generated by GPAI models must be technically marked as machine-generated where technically feasible
  3. Synthetic media disclosure — if you generate realistic synthetic images, audio, or video, you must disclose this to users

This applies regardless of whether OpenAI is in scope as a systemic-risk GPAI provider. Your obligation as a deployer is independent of OpenAI's provider obligations.

Article 26 — Deployer Obligations for High-Risk AI Systems

If your application falls under Annex III of the EU AI Act (high-risk AI systems), the Article 26 deployer obligations are significantly more extensive:

  - use the system in accordance with the provider's instructions for use
  - assign human oversight to people with the necessary competence, training, and authority
  - ensure input data under your control is relevant and sufficiently representative
  - monitor operation and suspend use if the system presents a risk
  - retain automatically generated logs for at least six months
  - inform workers and their representatives before deploying a high-risk system in the workplace

Annex III high-risk categories include AI systems used in: employment (CV screening, performance monitoring), education (admissions, assessment), credit scoring, biometric categorization, critical infrastructure management, and law enforcement.

Are you "high-risk"? Many OpenAI API applications are not. A customer support chatbot, a code generation tool, or an internal search assistant typically does not fall under Annex III. But an AI-powered HR screening tool, a financial advice application, or a document scoring system that influences significant decisions about individuals likely does.
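As a rough first-pass triage (explicitly not legal advice), the category check can be sketched as a keyword screen that routes matching use cases to counsel. The `ANNEX_III_AREAS` mapping and its keywords are illustrative assumptions, not the Act's legal definitions:

```python
# Illustrative triage helper: flag use cases that touch the Annex III
# categories named above so they get routed to legal review.
# Keyword lists are a rough sketch, not exhaustive or authoritative.
ANNEX_III_AREAS = {
    "employment": ["cv screening", "hiring", "performance monitoring"],
    "education": ["admissions", "exam assessment"],
    "credit": ["credit scoring", "loan decision"],
    "biometrics": ["biometric categorization"],
}

def needs_high_risk_review(use_case: str) -> bool:
    """Return True if the description matches any Annex III keyword."""
    text = use_case.lower()
    return any(kw in text for kws in ANNEX_III_AREAS.values() for kw in kws)

needs_high_risk_review("CV screening for job applicants")   # likely in scope
needs_high_risk_review("internal code search assistant")    # likely out of scope
```

A screen like this catches the obvious cases; borderline classifications still need human legal judgment.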

What the GPAI Code of Practice Means for Deployers

The EU AI Office published the final GPAI Code of Practice in July 2025, and OpenAI is a signatory. Under the Code:

  - Transparency: providers document their models and make key information available to downstream actors building on them
  - Copyright: providers commit to policies for complying with EU copyright law in training
  - Safety and security: additional commitments apply only to providers of GPAI models with systemic risk

The Code of Practice does not create "compliance by API call." Signing an API contract with OpenAI does not transfer your EU AI Act obligations to OpenAI. You are responsible for your deployment.


Part 3: The 6-Point Compliance Checklist

For EU developers using the OpenAI API in 2026, the following actions are required:

☐ 1. Sign OpenAI's DPA (if not already done)

Go to platform.openai.com → Settings → Privacy → Data Processing Addendum. This formalizes OpenAI as a data processor under Art. 28 GDPR. Without a signed DPA, your use of the API to process personal data has no contractual data processing basis.

☐ 2. Update Your Article 30 Records of Processing

Your ROPA must include:

  - OpenAI listed as a processor/recipient for the relevant processing activity
  - the categories of data subjects and personal data sent via the API
  - the third-country transfer to the US and the mechanism relied on (DPF and SCCs)
  - the retention period at the processor (currently 30 days by default for API data)
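One pragmatic way to keep the record current is to version it next to the code that does the processing. A hypothetical Article 30 entry, expressed as a plain Python dict with illustrative values:

```python
# Hypothetical ROPA entry for OpenAI API processing. Keys loosely mirror the
# Art. 30(1) fields; all values are illustrative assumptions, not a template
# endorsed by any regulator.
ropa_entry = {
    "processing_activity": "Customer support chatbot (OpenAI API)",
    "processor": "OpenAI (US)",
    "data_subjects": ["customers"],
    "data_categories": ["name", "email", "message content"],
    "purpose": "Automated first-line support responses",
    "third_country_transfer": {
        "destination": "United States",
        "mechanism": "EU-US Data Privacy Framework + SCCs",
    },
    "retention": "30 days at processor (abuse monitoring), then deletion",
}
```

Keeping the entry in version control makes it auditable and forces an update whenever the prompt pipeline changes what data it sends.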

☐ 3. Conduct a DPIA for High-Risk Processing

Article 35 GDPR requires a DPIA for processing that is "likely to result in a high risk to the rights and freedoms of natural persons." Sending sensitive personal data (Art. 9 categories) to a US AI provider via API warrants a DPIA. Large-scale processing of any user data through OpenAI also likely triggers the threshold.

The DPIA must document: the transfer mechanism, the CLOUD Act residual risk, mitigations (data minimization, prompt engineering to avoid personal data exposure), and the DPO's conclusion.

☐ 4. Implement Article 50 Transparency Disclosures

Starting August 2, 2026:

  - identify your chatbot or AI assistant as AI to users, unless it is obvious from context
  - mark AI-generated content as machine-generated where technically feasible
  - disclose realistic synthetic images, audio, or video as artificially generated
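In a backend, this is mostly plumbing. A minimal sketch of a disclosure wrapper, assuming your service returns chat responses as JSON to a frontend; the field names are illustrative assumptions, since Article 50 prescribes the disclosure, not a wire format:

```python
import json
from datetime import datetime, timezone

def wrap_ai_response(model_output: str, model_name: str) -> str:
    """Attach machine-readable AI-generation marking and a user-facing
    disclosure to a model response. Field names are illustrative."""
    return json.dumps({
        "content": model_output,
        "ai_generated": True,  # machine-readable marking for downstream consumers
        "disclosure": "This response was generated by an AI system.",
        "model": model_name,
        "generated_at": datetime.now(timezone.utc).isoformat(),
    })

payload = wrap_ai_response("Your refund has been approved.", "gpt-4o")
```

The frontend then renders the `disclosure` string wherever the content appears, so the user-facing identification requirement is satisfied alongside the technical marking.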

☐ 5. Audit for Annex III High-Risk Classification

Review your application against EU AI Act Annex III. If any use case involves automated decision-making about individuals in employment, education, credit, or other listed categories, you face full Art. 26 deployer obligations. Engage legal counsel if classification is unclear — the penalties for non-compliance reach €15 million or 3% of global annual turnover, whichever is higher.

☐ 6. Implement Data Minimization in Your Prompts

The simplest mitigation for both GDPR transfer risk and CLOUD Act exposure is to minimize personal data in API calls. Design your prompt architecture to:

  - strip or pseudonymize direct identifiers (names, emails, account IDs) before building prompts
  - keep the mapping between pseudonyms and real identities on your own infrastructure
  - send excerpts or summaries instead of full documents where the task allows it
  - avoid logging raw prompts that contain personal data

This does not eliminate the transfer, but it reduces the sensitivity and the compliance exposure.
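A minimal sketch of the pseudonymization step, assuming an email-only redaction pass (a real deployment needs broader PII detection); the regex, token scheme, and helper names are illustrative:

```python
import re
import uuid

# Replace emails with opaque placeholders before the prompt leaves EU
# jurisdiction, keeping the mapping locally so the model's answer can be
# re-personalized on the way back. Illustrative, not exhaustive PII detection.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def pseudonymize(text: str) -> tuple[str, dict[str, str]]:
    """Swap each email for a random token; return redacted text and mapping."""
    mapping: dict[str, str] = {}
    def _sub(match: re.Match) -> str:
        token = f"<PII_{uuid.uuid4().hex[:8]}>"
        mapping[token] = match.group(0)
        return token
    return EMAIL_RE.sub(_sub, text), mapping

def repersonalize(text: str, mapping: dict[str, str]) -> str:
    """Restore original identifiers in the model's response, locally."""
    for token, original in mapping.items():
        text = text.replace(token, original)
    return text

redacted, mapping = pseudonymize("Contact anna.meier@example.de about the refund.")
# `redacted` is what goes into the API call; `mapping` never leaves your servers.
```

The key design point is that the mapping table stays on your infrastructure: what crosses the Atlantic is only the token, which is meaningless without it.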


Where Your Deployment Infrastructure Fits In

GDPR Chapter V obligations and CLOUD Act risk relate to where personal data travels, not just where it is stored. If your application is deployed on a US cloud provider, you face dual exposure: both the OpenAI API transfer and the underlying infrastructure transfer.

For applications where the transfer risk is material — regulated industries, sensitive personal data, NIS2-essential services — deploying the application layer on EU-native infrastructure reduces (but does not eliminate) your exposure:

  - application data, logs, backups, and user accounts remain under EU jurisdiction
  - only the prompt content sent to the OpenAI API crosses into US jurisdiction
  - one well-documented transfer is easier to assess and defend than US dependence across the whole stack

The OpenAI API transfer itself cannot be eliminated without replacing the model. But the blast radius of the compliance exposure can be bounded by ensuring that everything except the API call stays in EU jurisdiction.


The Practical Bottom Line

Using the OpenAI API from an EU application is legally permissible under current GDPR frameworks. The DPF and SCCs provide a transfer mechanism. OpenAI's DPA covers the processor relationship.

What the current framework does not provide is immunity from US government access to your users' data when it passes through OpenAI's infrastructure. The CLOUD Act risk is real and structural. For most applications, it is an acceptable residual risk with appropriate documentation. For applications processing highly sensitive data or subject to sector-specific rules, it warrants explicit risk acceptance or architectural mitigation.

The EU AI Act layer is new as of August 2026 and requires action regardless of where you deploy. Art. 50 transparency obligations apply to every EU-facing AI application. Do not assume that using a third-party API exempts you from disclosure requirements — the Act is explicit that deployers carry independent obligations.

Start with the DPA, update your ROPA, and assess Annex III classification before August. The enforcement ramp-up for the EU AI Act's GPAI provisions begins in Q4 2026.

EU-Native Hosting

Ready to move to EU-sovereign infrastructure?

sota.io is a German-hosted PaaS — no CLOUD Act exposure, no US jurisdiction, full GDPR compliance by design. Deploy your first app in minutes.