Deploy Rust to Europe — EU Hosting for Axum, Actix-web, and Warp APIs
Rust has moved firmly into production web development. Frameworks like Axum, Actix-web, and Warp routinely serve hundreds of thousands of requests per second on modest hardware — with memory safety guarantees that eliminate entire classes of security vulnerabilities. For EU developers building high-performance APIs, Rust is increasingly the first choice.
But the same question arises as with any backend: where does the data live? EU companies processing personal data have a legal obligation to ensure that data stays within the EU, or that adequate safeguards are in place. Deploying to US infrastructure creates GDPR exposure that no performance benchmark can offset.
This guide shows you how to deploy a Rust web application to European servers using sota.io — GDPR-compliant by default, with managed PostgreSQL and zero DevOps overhead.
Why Rust for EU APIs?
Rust has found a strong foothold in:
- High-performance REST APIs — sub-millisecond latency, minimal memory footprint
- WebAssembly backends — WASM modules for edge logic and plugin systems
- CLI tools deployed as services — Rust binaries as microservices
- AI inference backends — calling LLM APIs, processing embeddings, serving vector results
- Security-critical services — cryptographic operations, token validation, authentication
Many of these services handle personal data. A Rust authentication service processes session tokens. A Rust API logs IP addresses and request metadata. Under GDPR, these are personal data — and where they are stored matters.
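IP addresses are a concrete example. Independent of where you host, a common mitigation is truncating addresses before they ever reach your logs. A minimal sketch in plain Rust using only the standard library — `anonymize_ip` is an illustrative helper, not a sota.io or framework API:

```rust
use std::net::{IpAddr, Ipv4Addr, Ipv6Addr};

/// Zero the host portion of an address before logging it:
/// the last octet for IPv4, everything past the /48 for IPv6.
fn anonymize_ip(addr: IpAddr) -> IpAddr {
    match addr {
        IpAddr::V4(v4) => {
            let [a, b, c, _] = v4.octets();
            IpAddr::V4(Ipv4Addr::new(a, b, c, 0))
        }
        IpAddr::V6(v6) => {
            let seg = v6.segments();
            // Keep the routing prefix, zero the interface identifier.
            IpAddr::V6(Ipv6Addr::new(seg[0], seg[1], seg[2], 0, 0, 0, 0, 0))
        }
    }
}

fn main() {
    let client: IpAddr = "203.0.113.42".parse().unwrap();
    println!("{}", anonymize_ip(client)); // 203.0.113.0
}
```

Truncated addresses are usually still useful for rate limiting and rough geolocation, while being much harder to tie back to an individual.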
The straightforward solution: deploy to EU infrastructure. sota.io runs on Hetzner Cloud in Germany. Your Rust binary runs there. Your PostgreSQL database lives there. Your data never leaves the EU.
Deploying a Rust Application to sota.io
Option 1: Dockerfile (recommended for Rust)
Rust compilation takes several minutes, so a multi-stage Dockerfile is the right approach. The builder stage compiles your binary; the runtime stage runs it in a minimal image:
```dockerfile
FROM rust:1.78-slim AS builder
WORKDIR /app
COPY Cargo.toml Cargo.lock ./
# Pre-build dependencies for layer caching
RUN mkdir src && echo "fn main() {}" > src/main.rs && cargo build --release && rm -rf src
COPY src ./src
# Needed if you embed migrations with sqlx::migrate! (see the sqlx section below)
COPY migrations ./migrations
RUN touch src/main.rs && cargo build --release

FROM debian:bookworm-slim
RUN apt-get update && apt-get install -y --no-install-recommends ca-certificates libssl3 && rm -rf /var/lib/apt/lists/*
WORKDIR /app
COPY --from=builder /app/target/release/myapp .
EXPOSE 8080
CMD ["./myapp"]
```
The dependency pre-build trick dramatically speeds up rebuilds — Cargo only recompiles your source code when it changes, not all dependencies.
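It also helps to keep your local `target/` directory (often several gigabytes) out of the Docker build context, so builds stay fast and the cache is not invalidated by local artifacts. A typical `.dockerignore` for a Rust project — adjust to your repository layout:

```
target/
.git/
*.log
```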
Deploy with the sota.io CLI:
```bash
curl -fsSL https://sota.io/install.sh | sh
sota deploy
```
Your Rust API gets a live HTTPS URL at your-app.sota.io — typically in under 90 seconds once the dependency layers are cached. The first build takes longer while Cargo compiles your full dependency tree.
Building for Linux (cross-compilation)
If you develop on macOS and your CI is Linux, ensure your Dockerfile targets the correct platform:
```dockerfile
FROM --platform=linux/amd64 rust:1.78-slim AS builder
```
sota.io runs on linux/amd64 — this ensures your binary compiles correctly regardless of your local architecture.
Connecting to PostgreSQL with sqlx
Every sota.io project includes managed PostgreSQL 17. The DATABASE_URL environment variable is auto-injected at runtime.
Using sqlx (the idiomatic async Rust choice):
```toml
# Cargo.toml
[dependencies]
axum = "0.7"
sqlx = { version = "0.7", features = ["runtime-tokio", "postgres", "macros"] }
tokio = { version = "1", features = ["full"] }
serde = { version = "1", features = ["derive"] }
serde_json = "1"
```
```rust
use axum::{extract::State, response::Json, routing::get, Router};
use serde::Serialize;
use sqlx::PgPool;
use std::env;

#[derive(Serialize, sqlx::FromRow)]
struct User {
    id: i32,
    email: String,
}

#[derive(Clone)]
struct AppState {
    db: PgPool,
}

#[tokio::main]
async fn main() {
    let database_url = env::var("DATABASE_URL").expect("DATABASE_URL must be set");
    let pool = PgPool::connect(&database_url)
        .await
        .expect("Failed to connect to database");

    // Run migrations on startup
    sqlx::migrate!("./migrations")
        .run(&pool)
        .await
        .expect("Migration failed");

    let state = AppState { db: pool };

    let app = Router::new()
        .route("/health", get(health))
        .route("/users", get(list_users))
        .with_state(state);

    let port = env::var("PORT").unwrap_or_else(|_| "8080".to_string());
    let listener = tokio::net::TcpListener::bind(format!("0.0.0.0:{}", port))
        .await
        .unwrap();
    println!("Listening on port {}", port);
    axum::serve(listener, app).await.unwrap();
}

async fn health() -> &'static str {
    "ok"
}

async fn list_users(State(state): State<AppState>) -> Json<Vec<User>> {
    let users = sqlx::query_as::<_, User>("SELECT id, email FROM users LIMIT 20")
        .fetch_all(&state.db)
        .await
        .unwrap_or_default();
    Json(users)
}
```
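If you want to tune the connection pool per environment without recompiling, one option is to read the size from an environment variable and pass it to `sqlx::postgres::PgPoolOptions::new().max_connections(...)`. A small sketch, assuming a hypothetical `DB_POOL_SIZE` variable (not something sota.io injects):

```rust
use std::env;

/// Read the desired pool size from the environment, falling back
/// to a conservative default when unset or unparsable.
fn pool_size_from_env(default: u32) -> u32 {
    env::var("DB_POOL_SIZE")
        .ok()
        .and_then(|v| v.parse::<u32>().ok())
        .unwrap_or(default)
}

fn main() {
    // With DB_POOL_SIZE unset this prints the default.
    println!("pool size: {}", pool_size_from_env(5));
}
```

A bounded pool matters on a shared Postgres instance: each connection costs server memory, so a small fixed ceiling per service is usually the safer default.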
Database Migrations with sqlx
Place your SQL migrations in ./migrations/:
```sql
-- migrations/20260401_create_users.sql
CREATE TABLE IF NOT EXISTS users (
    id SERIAL PRIMARY KEY,
    email TEXT NOT NULL UNIQUE,
    created_at TIMESTAMPTZ DEFAULT NOW()
);
```
sqlx::migrate! runs pending migrations on every startup. sota.io containers start fresh on each deploy — migrations are the reliable way to keep your schema in sync.
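Later schema changes become additional files with a higher version prefix; sqlx applies them in order and records each applied migration in its `_sqlx_migrations` table, so a migration never runs twice. A hypothetical follow-up migration:

```sql
-- migrations/20260415_add_locale.sql (illustrative example)
ALTER TABLE users
    ADD COLUMN locale TEXT NOT NULL DEFAULT 'en';
```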
Actix-web Alternative
If you prefer Actix-web for its actor model and mature ecosystem:
```rust
use actix_web::{web, App, HttpResponse, HttpServer};
use sqlx::PgPool;
use std::env;

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    let database_url = env::var("DATABASE_URL").expect("DATABASE_URL must be set");
    let pool = PgPool::connect(&database_url)
        .await
        .expect("DB connection failed");

    let port = env::var("PORT").unwrap_or_else(|_| "8080".to_string());
    let bind_addr = format!("0.0.0.0:{}", port);

    HttpServer::new(move || {
        App::new()
            .app_data(web::Data::new(pool.clone()))
            .route("/health", web::get().to(HttpResponse::Ok))
    })
    .bind(&bind_addr)?
    .run()
    .await
}
```
Both Axum and Actix-web work with sota.io without any platform-specific configuration.
sota.io vs. Other Platforms for Rust
| | sota.io | Railway | Render | Fly.io |
|---|---|---|---|---|
| EU data residency | Germany (default) | US default | US default | EU region (extra config) |
| GDPR-compliant | Yes | Requires DPA | Requires DPA | Requires DPA |
| Rust + Dockerfile | Yes | Yes | Yes | Yes |
| Managed PostgreSQL | Included, free | Add-on, $5/mo | Add-on, $7/mo | Managed via Fly Postgres |
| Build caching | Layer cache (Docker) | Layer cache | Layer cache | Layer cache |
| Pricing | Flat €9/mo | Usage-based | Usage-based | Usage-based |
The critical distinction: on sota.io, EU data residency is the default. On other platforms, it requires selecting a region, configuring a DPA, and verifying that all data stores — including logs and backups — remain in Europe.
GDPR Considerations for Rust APIs
Rust APIs commonly handle data that falls under GDPR:
- Request logs — IP addresses, user agents, timestamps are personal data
- JWT tokens — contain user identifiers
- Database records — user emails, preferences, activity data
GDPR Article 3 has extra-territorial scope: if your users are in the EU, GDPR applies regardless of where your company is based.
Practical compliance checklist for Rust APIs on sota.io:
- Deploy to EU infrastructure — sota.io Germany (this guide)
- Sign the Data Processing Agreement with sota.io
- Use structured logging — avoid logging raw request bodies that may contain personal data
- Configure your `tracing` subscriber to omit PII fields
- Use sota.io's managed PostgreSQL — backups stay in Germany, encryption at rest included
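Redaction can also happen before a value reaches the logger at all. A minimal sketch in plain Rust — `redact_email` is an illustrative helper, not part of `tracing` or any other crate:

```rust
/// Replace the local part of an email with a fixed marker so a log
/// line keeps the domain (useful for debugging) but not the user.
fn redact_email(email: &str) -> String {
    match email.split_once('@') {
        Some((_, domain)) => format!("<redacted>@{}", domain),
        None => "<redacted>".to_string(),
    }
}

fn main() {
    println!("{}", redact_email("alice@example.eu")); // <redacted>@example.eu
}
```

The same pattern applies to any identifier you log: redact at the call site, and the question of what your log pipeline stores never arises.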
Summary
Deploying Rust to Europe with sota.io:
- Write an Axum or Actix-web app that listens on `$PORT`
- Add a multi-stage `Dockerfile`
- Run `sota deploy`
- Get a live HTTPS URL at `{slug}.sota.io` in under 90 seconds
- `DATABASE_URL` is auto-injected — PostgreSQL 17 ready immediately
- EU data residency by default — GDPR-compliant out of the box
Rust's performance advantages are preserved. No platform-specific runtime. No managed language overhead. Your binary runs directly — in Germany.
Start deploying: sota.io →
See also: Deploy Go to Europe · Deploy Python to Europe · Deploy an AI Agent to Europe