2026-04-29 · 12 min read

AWS Step Functions EU Alternative 2026: Workflow Orchestration, CLOUD Act, and GDPR Compliance

Post #705 in the sota.io EU Compliance Series

AWS Step Functions is the standard workflow orchestration service for AWS-based applications. It coordinates multi-step processes — order fulfillment pipelines, user onboarding sequences, document processing chains, data transformation jobs, payment processing flows — by passing state between Lambda functions, ECS tasks, DynamoDB operations, and dozens of other AWS services. Step Functions tracks each transition, stores inputs and outputs, and provides a complete execution history for debugging and audit.

That execution history is where the GDPR exposure lives.

Amazon Web Services, Inc. is a Delaware corporation headquartered in Seattle, Washington. The CLOUD Act (18 U.S.C. § 2713) compels US companies to produce data stored anywhere in the world when served a valid US government order. Step Functions in eu-west-1 (Ireland) or eu-central-1 (Frankfurt) is AWS infrastructure controlled by a US entity. The controlling entity's jurisdiction determines CLOUD Act applicability — not the location of the data center.

For EU applications, the consequence is straightforward: every input passed to a workflow step, every output produced by a workflow step, and every state transition event is stored under US jurisdiction: in Step Functions execution history for 90 days after completion (Standard Workflows), or in CloudWatch Logs, by default indefinitely (Express Workflows with logging enabled). If any of those inputs or outputs contain personal data — and for order workflows, onboarding flows, or document processing, they almost certainly do — that personal data sits in US-controlled infrastructure subject to compelled disclosure without notification to you or your users.

What AWS Step Functions Stores Under US Jurisdiction

Execution history: the complete audit trail of your personal data flows. Step Functions' core feature is its execution history: a timestamped, step-by-step record of every state transition in a workflow execution, including the exact input and output at each step. This is what makes Step Functions debuggable and auditable — and it is also the primary GDPR risk surface.

For a Standard Workflow processing an EU user's order, the execution history would contain:

  - The initial order input: customer name, email address, shipping address, and order line items
  - The payment step's input and output: payment method reference and transaction identifiers
  - The shipment step's output: carrier booking details and the delivery address
  - The notification step's input: the customer's email address and the message content

Every one of these is stored in Step Functions execution history under US jurisdiction. A CLOUD Act order served to AWS for data related to a specific investigation could produce the complete workflow history for every execution matching the requested parameters — including the personal data of EU users whose orders were processed through that workflow.

Standard Workflow execution history is retained for 90 days after an execution completes, and this period is fixed: there is no API setting to shorten or extend it. Express Workflow execution data lives in CloudWatch Logs, where retention follows the log group configuration. Neither type gives you granular per-execution retention control: you cannot set a 30-day retention on executions containing personal data while keeping system-health executions longer. Retention applies uniformly to every execution of a state machine.

State machine definitions. Every Step Functions state machine is defined in Amazon States Language (ASL) — a JSON-based workflow definition that specifies states, transitions, retry policies, error handling, and resource ARNs. State machine definitions are stored by AWS and versioned. While the definition itself typically does not contain personal data, it does reveal the architecture of your data processing pipeline: which Lambda functions handle which steps, which DynamoDB tables are queried, which external APIs are called, and how data flows between them.

For EU applications subject to GDPR Art.30 (Records of Processing Activities), the state machine definition is effectively a machine-readable representation of your processing pipeline. AWS holds this definition under US jurisdiction alongside the data that flows through it.

Execution inputs and outputs at rest. For Standard Workflows, Step Functions stores the full input-output payload for each state in the execution. The maximum payload size per state is 256KB. For workflows processing large documents — identity verification payloads, medical record extracts, financial transaction batches — the execution history can accumulate significant volumes of personal data across executions.

Step Functions does allow working around the 256KB limit by storing large payloads in Amazon S3 and passing only an object reference between states. However, this pattern requires additional implementation work and is not the default behavior. Most Step Functions implementations pass data directly through the execution history.
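The offloading decision itself is simple to sketch. Below is a minimal TypeScript illustration of the pattern: payloads under the 256KB per-state limit flow through unchanged, larger ones are replaced by an object reference, so the execution history only ever sees the pointer. The helper names and the S3Ref shape are illustrative, not an AWS API.

```typescript
// Sketch: offload payloads above the 256KB per-state limit to object storage,
// passing only a reference through the execution history.
const STATE_PAYLOAD_LIMIT = 256 * 1024; // 256KB per-state limit

interface S3Ref {
  bucket: string;
  key: string;
}

type StatePayload = Record<string, unknown> | S3Ref;

function isS3Ref(p: StatePayload): p is S3Ref {
  return typeof (p as S3Ref).bucket === "string" && typeof (p as S3Ref).key === "string";
}

function byteLength(payload: Record<string, unknown>): number {
  return Buffer.byteLength(JSON.stringify(payload), "utf8");
}

// uploader is a stand-in for an actual S3 PutObject call
function offloadIfLarge(
  payload: Record<string, unknown>,
  uploader: (body: string) => S3Ref
): StatePayload {
  if (byteLength(payload) <= STATE_PAYLOAD_LIMIT) {
    return payload; // small payloads flow through execution history as-is
  }
  return uploader(JSON.stringify(payload)); // history only sees the reference
}
```

Downstream states then resolve the reference back to the full payload, which keeps the personal data in a bucket whose retention and deletion you control directly.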

Express Workflows and CloudWatch Logs integration. Express Workflows are designed for high-volume, short-duration workflows — typically order processing, event processing pipelines, or real-time data transformations. Unlike Standard Workflows, Express Workflows do not maintain built-in execution history. However, AWS recommends — and the Step Functions console defaults to enabling — execution logging to Amazon CloudWatch Logs.

When Express Workflow logging is enabled (which it is by default in new console-created state machines), CloudWatch receives:

  - Execution start, completion, and failure events with timestamps
  - Every state transition, with state names and status
  - The full input and output payload of each state, when the log level is ALL and execution data logging is enabled

CloudWatch Logs is an AWS service under US jurisdiction. Enabling Express Workflow logging — a recommended practice for production observability — moves the execution history from Step Functions into CloudWatch, but does not remove it from AWS infrastructure. The CLOUD Act exposure is identical.

CloudWatch Log Groups for Step Functions have configurable retention periods (1 day to 10 years, or indefinitely). The default is never expire. EU applications that set up Step Functions logging without explicitly configuring CloudWatch retention are accumulating personal data from workflow executions indefinitely under US jurisdiction.
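Auditing for this default is mechanical. A sketch of the check, assuming log group records shaped like the DescribeLogGroups response, where retentionInDays is absent when a group is set to never expire (the helper name is illustrative):

```typescript
// Sketch: flag log groups that are accumulating execution data indefinitely.
interface LogGroup {
  logGroupName: string;
  retentionInDays?: number; // undefined = never expire
}

// Returns the names of log groups with no retention policy set
function groupsWithIndefiniteRetention(groups: LogGroup[]): string[] {
  return groups
    .filter((g) => g.retentionInDays === undefined)
    .map((g) => g.logGroupName);
}
```

Any Step Functions log group this flags is a candidate for an explicit retention policy sized to your storage limitation analysis.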

AWS X-Ray tracing. Step Functions integrates with AWS X-Ray for distributed tracing across the full execution path. When X-Ray tracing is enabled, trace segments for each Step Functions execution are sent to X-Ray. These traces include:

  - Timing and duration for each state in the execution
  - The downstream AWS service and HTTP calls made by task states
  - Any custom annotations and metadata added by instrumented Lambda functions

X-Ray trace data is stored in US-controlled infrastructure for 30 days. If Lambda functions or other Step Functions task resources add custom annotations containing user identifiers, request parameters, or other personal data, that data is stored in X-Ray under US jurisdiction.

GDPR Risk Surface: Three Articles, One Architecture

Art.17 (Right to Erasure) — the execution history problem. When an EU user submits a GDPR erasure request, your application must delete their personal data across all systems where it is stored. For applications using Step Functions, "all systems" includes the execution history for every workflow that processed data related to that user.

Step Functions provides no built-in mechanism to delete a specific execution from execution history. You can delete an entire state machine (which removes all associated execution history), but you cannot selectively delete the executions for a specific user while preserving executions for other users. The AWS SDK does not expose a DeleteExecution API for Standard Workflows.

The practical implication: if a user submits an erasure request and your order processing workflow was built on Step Functions, their order history — stored as workflow execution history — cannot be deleted from Step Functions without deleting the state machine itself. You can delete the user's record from your DynamoDB table, but the Step Functions execution that processed their order, containing their name, address, and order details as step input/output, remains in Step Functions for 90 days after the execution completes.

This creates a structural gap between your application-level data deletion and the underlying AWS service's data retention. For controllers, Art.17 compliance requires a complete, verifiable erasure across all processing systems. Step Functions' execution retention makes verifiable erasure technically impossible within the retention window.

Art.5(1)(e) (Storage Limitation) — retention without control. GDPR's storage limitation principle requires that personal data is kept "for no longer than is necessary" for the purposes for which it was processed. An order workflow's execution history is retained for debugging and audit purposes — purposes that typically do not require 90 days of full input/output data retention for every execution.

Step Functions does not provide retention configuration for Standard Workflow execution history at all. The 90-day period applies uniformly to every execution, not to individual workflows or execution types. An application cannot configure: "keep executions from monitoring workflows for 90 days, but delete executions from order-processing workflows after 7 days." The lack of granular retention control makes implementing storage limitation for personal data in Step Functions execution history technically complex, requiring payload-offloading patterns, log cleanup jobs, and verification logic that AWS does not provide natively.
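The per-workflow retention policy that storage limitation calls for is easy to express in code; the problem is that Step Functions offers no place to plug it in. A sketch of the decision logic, with illustrative workflow names and policy values:

```typescript
// Sketch: per-workflow retention policy that an orchestrator would need to
// support for Art.5(1)(e) compliance. Workflow names and day counts are
// illustrative examples, not defaults of any product.
interface CompletedExecution {
  workflowName: string;
  stopDate: Date;
}

const retentionDays: Record<string, number> = {
  "order-processing": 7, // contains personal data: short retention
  "system-monitoring": 90, // no personal data: keep longer for ops
};

function exceedsRetention(exec: CompletedExecution, now: Date): boolean {
  const days = retentionDays[exec.workflowName] ?? 30; // fallback policy
  const ageMs = now.getTime() - exec.stopDate.getTime();
  return ageMs > days * 24 * 60 * 60 * 1000;
}
```

Against Step Functions this logic has nothing to act on, since individual executions cannot be deleted; against the EU-native orchestrators discussed below, it can drive actual per-execution cleanup.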

Art.28 (Data Processor Agreements) — jurisdiction mismatch. AWS offers a Data Processing Agreement (DPA) and EU Standard Contractual Clauses (SCCs) for EU customers. However, these contractual instruments address the mechanism of data transfers, not the compellability of the processor's US parent corporation. SCCs do not bind US law enforcement. A CLOUD Act order served to AWS would supersede the contractual protections offered by the DPA and SCCs, because the CLOUD Act operates at the jurisdiction level, not the contract level.

The Schrems II ruling (C-311/18) established that contractual instruments cannot cure the structural incompatibility created by surveillance laws that allow mass access to data without effective judicial remedy for EU data subjects. Step Functions execution data stored under AWS US jurisdiction is subject to this structural incompatibility regardless of the DPA in place.

EU-Native Workflow Orchestration Alternatives

EU applications have several mature, production-ready workflow orchestration options that operate entirely under EU jurisdiction.

Temporal.io (Self-Hosted on EU Infrastructure)

Temporal is the most direct functional equivalent to Step Functions. It provides durable workflow orchestration with automatic retry, compensation logic, and long-running process support. Temporal originated from Uber's Cadence project and is now developed by Temporal Technologies as open-source software.

Temporal's key advantage for GDPR compliance is that it is infrastructure-agnostic: you deploy the Temporal Server on infrastructure you control. Running Temporal on Hetzner Cloud (Nuremberg, Falkenstein, Helsinki) or Scaleway (Paris, Amsterdam) places all workflow state, execution history, and input/output data under EU jurisdiction.

Temporal architecture:

  - Temporal Server: the orchestration engine that manages workflow state, task queues, and timers
  - Persistence layer: PostgreSQL, MySQL, or Cassandra, storing all workflow history and inputs/outputs
  - Workers: processes you run that execute workflow and activity code, polling the server over gRPC
  - SDKs: TypeScript, Go, Java, Python, and .NET clients for defining and starting workflows

For GDPR erasure compliance, Temporal provides workflow visibility APIs that allow querying and terminating workflows by workflow ID. Combined with a custom cleanup process, you can locate all workflows associated with a specific user identifier and terminate or signal them — something Step Functions' API cannot accomplish.

// Temporal — EU data residency (Hetzner/Scaleway)
// workflows.ts — runs inside the Temporal workflow sandbox
import { proxyActivities } from "@temporalio/workflow";
import type * as activities from "./activities";

// Activities are invoked through proxies, not imported directly
const { validatePayment, reserveInventory, createShipment, notifyCustomer } =
  proxyActivities<typeof activities>({ startToCloseTimeout: "5 minutes" });

interface FulfillmentResult {
  status: string;
  orderId: string;
}

// Order processing workflow — all state stays on EU infrastructure
export async function orderFulfillmentWorkflow(
  orderId: string,
  customerId: string
): Promise<FulfillmentResult> {
  // Input/output never leaves the EU-controlled Temporal server
  await validatePayment(orderId);
  await reserveInventory(orderId);
  await createShipment(orderId);
  await notifyCustomer(customerId, orderId);
  return { status: "fulfilled", orderId };
}

// client.ts — connect to self-hosted Temporal on Hetzner
import { Client, Connection } from "@temporalio/client";

const connection = await Connection.connect({
  address: "temporal.internal.your-eu-app.com:7233",
});
const client = new Client({ connection });

// GDPR Art.17 erasure: terminate every workflow tagged with the user
// (assumes a CustomerId custom search attribute registered on the cluster)
for await (const wf of client.workflow.list({
  query: `CustomerId = "${customerId}"`,
})) {
  await client.workflow
    .getHandle(wf.workflowId)
    .terminate("GDPR erasure request");
}

Hetzner deployment for Temporal:

# docker-compose.yml on Hetzner CPX21 (Nuremberg)
services:
  temporal:
    image: temporalio/auto-setup:1.24
    environment:
      - DB=postgres12
      - DB_PORT=5432
      - POSTGRES_USER=temporal
      - POSTGRES_PWD=${TEMPORAL_DB_PASSWORD}
      - POSTGRES_SEEDS=postgres
    ports:
      - "7233:7233"
    depends_on:
      - postgres

  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_PASSWORD: ${TEMPORAL_DB_PASSWORD}
    volumes:
      - temporal-postgres:/var/lib/postgresql/data
    # EU data residency: Hetzner Nuremberg

volumes:
  temporal-postgres:

Temporal Cloud EU region: Temporal Technologies also offers Temporal Cloud with a dedicated EU region (Frankfurt). This is a managed option if you prefer not to operate Temporal infrastructure yourself. Temporal Technologies is, however, a US-incorporated company, so evaluate your CLOUD Act exposure as you would for any US-parented managed service. For maximum data sovereignty, self-hosting on EU infrastructure is the recommended approach.

Apache Airflow (Self-Hosted or Astronomer EU)

Apache Airflow is the de facto standard for data pipeline orchestration and batch workflow scheduling. If your Step Functions use case is primarily scheduled data processing — ETL pipelines, report generation, data synchronization jobs — Airflow is a mature, widely-adopted alternative with extensive EU deployment options.

Airflow uses Python-based DAGs (Directed Acyclic Graphs) to define workflows. The Airflow metadata database (PostgreSQL or MySQL) stores task execution state, logs, and configuration. Deployed on EU infrastructure, all workflow state remains under EU jurisdiction.

# Airflow DAG — EU data pipeline (Hetzner/Scaleway)
from airflow import DAG
from airflow.operators.python import PythonOperator
from datetime import datetime, timedelta

def process_eu_user_data(**context):
    # All execution context stored in Airflow metadata DB (EU)
    user_batch = context["dag_run"].conf.get("user_batch", [])
    # Process — logs and XCom stay on the EU Airflow instance
    return {"processed": len(user_batch), "status": "ok"}

with DAG(
    "eu_user_data_pipeline",
    schedule="@hourly",
    start_date=datetime(2026, 1, 1),
    catchup=False,
    default_args={"retries": 3, "retry_delay": timedelta(minutes=5)},
) as dag:
    task = PythonOperator(
        task_id="process_eu_data",
        python_callable=process_eu_user_data,
    )

Astronomer offers managed Airflow with an EU region (AWS eu-central-1 Frankfurt). This is a US-parented company hosting on AWS, which carries the same CLOUD Act exposure as native AWS Step Functions — evaluate accordingly. Self-hosted Airflow on Hetzner is the path to genuine EU data residency for Airflow workloads.

Windmill (Open-Source, EU-Native)

Windmill is a modern open-source workflow automation platform that supports TypeScript, Python, Go, and Bash scripts as workflow steps. It provides a web UI for building and monitoring workflows and is well-suited for internal tooling, automation workflows, and lightweight orchestration use cases.

Windmill is developed by a European company (based in Paris) and is available as open-source software deployable on any EU infrastructure. The Windmill Cloud offering runs on AWS eu-west-3 (Paris) and is operated by a French entity — reducing CLOUD Act exposure compared to AWS-native services, though a comprehensive analysis of the entity structure is recommended for high-sensitivity deployments.

// Windmill flow step — TypeScript on EU Windmill instance
// (Windmill executes the exported main function of a script step)
export async function main(
  order: Order
): Promise<ProcessingResult> {
  // Windmill stores step I/O in its PostgreSQL DB (EU infrastructure)
  const payment = await verifyPayment(order.paymentId);
  const inventory = await reserveInventory(order.items);
  return {
    paymentStatus: payment.status,
    reservationId: inventory.reservationId,
  };
}

Self-hosted Windmill on Hetzner:

# Hetzner CX31, Frankfurt — all workflow state EU-resident
git clone https://github.com/windmill-labs/windmill
cd windmill
# Configure postgres connection (Hetzner Managed Database)
docker compose up -d

n8n (Self-Hosted, EU Workflow Automation)

n8n is a visual workflow automation platform particularly well-suited for integration workflows — connecting APIs, processing webhooks, automating notifications, and coordinating data flows between services. If your Step Functions implementations are primarily service integration workflows (call API A → transform → call API B → notify), n8n provides a low-code visual alternative.

n8n is fair-code licensed (the Sustainable Use License) and can be self-hosted on any infrastructure. The n8n GmbH entity is incorporated in Germany, and n8n Cloud uses AWS eu-central-1 (Frankfurt). For self-hosted deployments on Hetzner or Scaleway, all workflow definitions, execution history, and credential data remain under EU jurisdiction.

n8n provides workflow execution log retention configuration per workflow — addressing the storage limitation requirement that Step Functions cannot meet. Individual executions can be set to auto-prune after configurable periods, and n8n's REST API allows deleting specific executions programmatically to support Art.17 erasure workflows.
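As a sketch, an Art.17 erasure run against a self-hosted n8n instance can be reduced to selecting the matching execution IDs and issuing one delete call each. The Execution shape and the userId field are assumptions about your own workflow payloads; the path shown follows n8n's public REST API convention:

```typescript
// Sketch: build the delete calls for an Art.17 erasure run against n8n.
// The Execution shape and the userId field are assumptions about your own
// workflow payloads, not an n8n-defined schema.
interface Execution {
  id: string;
  data: { userId?: string };
}

// Returns the API paths to DELETE for every execution touching the user
function erasureDeletePaths(executions: Execution[], userId: string): string[] {
  return executions
    .filter((e) => e.data.userId === userId)
    .map((e) => `/api/v1/executions/${e.id}`);
}
```

Each returned path would then be issued as an authenticated DELETE request against your instance, giving you the per-execution erasure that Step Functions cannot provide.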

Prefect (Self-Hosted, EU)

Prefect is a Python-native workflow orchestration platform aimed at data engineering and ML pipeline use cases. Prefect supports both local and distributed execution with a modern API and observability features comparable to Step Functions.

Prefect Cloud offers a US-hosted managed option. For EU data residency, Prefect Server (self-hosted) on Hetzner or Scaleway provides full control over execution metadata storage. Prefect's architecture separates the API server (metadata storage) from the execution layer (compute), allowing you to run the Prefect Server on a single EU-resident instance while scaling out workers across multiple compute nodes.

Migration Pattern: Step Functions to Temporal

For EU applications currently using Step Functions for order or user data workflows, Temporal is the most architecturally equivalent replacement. The migration pattern maps Step Functions concepts to Temporal equivalents:

Step Functions Concept → Temporal Equivalent

  State Machine → Workflow Definition
  State (Task, Choice, Wait) → Activity, Signal/Query handler, sleep()
  Standard Workflow → Workflow with persistent state
  Express Workflow → Short-lived workflow or Activity
  Execution History → Workflow History (stored in the Temporal Server DB)
  Activity Heartbeat → Activity Heartbeat
  Retry Policy (ErrorEquals) → Activity Retry Policy (maximumAttempts, nonRetryableErrorTypes)
  Catch (ErrorEquals) → try/catch in workflow code
  Map State → Promise.all() over parallel Activity calls
  Wait for Task Token → condition() or a Signal handler
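One row of this mapping, the retry policy translation, can be made concrete. A hedged sketch: the Temporal-side field names follow the TypeScript SDK's RetryPolicy, and the adjustment reflects the semantic difference that ASL counts retries after the first attempt while Temporal's maximumAttempts counts total attempts. The translation rules are a starting point to review per state machine, not a complete converter:

```typescript
// Sketch: translate one ASL Retry entry into a Temporal-style retry policy.
interface AslRetrier {
  ErrorEquals: string[];
  IntervalSeconds?: number;
  MaxAttempts?: number;
  BackoffRate?: number;
}

interface TemporalRetryPolicy {
  initialInterval: string;
  maximumAttempts: number;
  backoffCoefficient: number;
  nonRetryableErrorTypes?: string[];
}

function translateRetry(
  r: AslRetrier,
  nonRetryable: string[] = []
): TemporalRetryPolicy {
  return {
    initialInterval: `${r.IntervalSeconds ?? 1}s`, // ASL default interval: 1s
    maximumAttempts: (r.MaxAttempts ?? 3) + 1, // retries + the first attempt
    backoffCoefficient: r.BackoffRate ?? 2.0, // both systems default to 2.0
    nonRetryableErrorTypes: nonRetryable.length ? nonRetryable : undefined,
  };
}
```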

Migration steps:

  1. Deploy Temporal Server on Hetzner (CPX31 for medium workloads, dedicated server for high-volume)
  2. Identify all Step Functions state machines and their execution volumes
  3. Rewrite state machines as Temporal workflows (TypeScript SDK or Python SDK)
  4. Implement a GDPR erasure signal handler in each workflow that processes personal data: setHandler(erasureSignal, () => { /* purge personal data from workflow state */ })
  5. Configure Temporal workflow history retention to match your GDPR data minimization requirements
  6. Run parallel — execute both Step Functions and Temporal in parallel, validating equivalence
  7. Cut over and stop creating new Step Functions executions
  8. Allow Step Functions retention window to expire (90 days for existing executions)
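Step 6, the parallel run, reduces to comparing terminal outputs per execution. A minimal sketch, using JSON stringification as a stand-in for deep equality (an assumption that holds only if your outputs serialize deterministically):

```typescript
// Sketch for the parallel-run validation step: compare each Step Functions
// execution's terminal output against its Temporal counterpart and collect
// mismatches for review.
interface ParallelRun {
  executionId: string;
  stepFunctionsOutput: unknown;
  temporalOutput: unknown;
}

function findMismatches(runs: ParallelRun[]): string[] {
  return runs
    .filter(
      (r) =>
        JSON.stringify(r.stepFunctionsOutput) !==
        JSON.stringify(r.temporalOutput)
    )
    .map((r) => r.executionId);
}
```

An empty result over a representative traffic sample is the signal that the cutover in step 7 is safe.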

Checklist: Is Your Step Functions Usage GDPR-Compliant?

Before evaluating alternatives, assess your current exposure:

  - Do any workflow inputs or outputs contain personal data (names, email addresses, postal addresses, identifiers)?
  - Is Express Workflow logging enabled, and what retention is configured on the CloudWatch log group?
  - Is X-Ray tracing enabled, and do your task resources add annotations containing user data?
  - Could you fulfill an Art.17 erasure request for data held in execution history within your committed timeline?
  - Does your Art.30 record of processing list Step Functions execution history as a storage location for personal data?

EU-Native Alternatives: Quick Comparison

Alternative | License | EU Hosting | CLOUD Act Risk | Best For
Temporal (self-hosted) | MIT | Hetzner, Scaleway | None | Complex long-running workflows, order processing
Apache Airflow (self-hosted) | Apache 2.0 | Hetzner, Scaleway | None | Scheduled data pipelines, ETL
Windmill (self-hosted) | AGPLv3 | Hetzner, Scaleway | None | Mixed script + API workflows
n8n (self-hosted) | Sustainable Use License | Hetzner, Scaleway | None | Integration automation, webhooks
Prefect Server (self-hosted) | Apache 2.0 | Hetzner, Scaleway | None | Data engineering, ML pipelines

Conclusion

AWS Step Functions is a capable workflow orchestration service with a significant GDPR liability for EU applications: its execution history stores the complete input/output payload for every workflow step, retains it for 90 days after each execution completes, provides no mechanism to selectively delete executions for a specific user, and operates under the jurisdiction of a US corporation subject to CLOUD Act compulsion.

For EU applications processing personal data through orchestrated workflows — orders, user onboarding, document processing, financial transactions — Step Functions creates a structural gap between GDPR Art.17 compliance commitments and technical implementation. The execution history that makes Step Functions debuggable is also the audit trail of your users' personal data, retained beyond application-level deletion, under US-controlled infrastructure.

Temporal (self-hosted on Hetzner), n8n, Windmill, and Apache Airflow provide workflow orchestration capabilities that match Step Functions' core use cases. Deployed on EU infrastructure, they deliver the workflow visibility and reliability that Step Functions offers without the jurisdiction exposure.

The underlying constraint is consistent across the AWS Messaging and Orchestration suite — SQS, SNS, EventBridge, and now Step Functions: every managed service that stores application state under an American corporation's control is subject to CLOUD Act compulsion, regardless of data center location. For EU applications that need to demonstrate genuine data sovereignty, the path is EU-native infrastructure where the controlling entity is not subject to US jurisdiction.


sota.io is a European PaaS platform built for EU data sovereignty. Deploy your entire application stack — including workflow workers — on EU infrastructure with no US-parent exposure. Start free →
