2026-05-01 · 12 min read

AWS Elemental MediaConvert EU Alternative 2026: Video Transcoding, Medical Imaging, and the GDPR CLOUD Act Problem

Post #753 in the sota.io EU Compliance Series

AWS Elemental MediaConvert is Amazon's file-based video transcoding service. It converts source video files into broadcast and streaming formats: HLS and DASH for adaptive bitrate streaming, MP4 for download delivery, and dozens of broadcast-specific containers. MediaConvert handles the transcoding jobs that power video-on-demand platforms, e-learning libraries, telehealth archives, and media broadcasting workflows across European enterprises.

Amazon operates MediaConvert in European regions: eu-west-1 (Ireland), eu-central-1 (Frankfurt), eu-west-3 (Paris). The transcoding compute runs in Europe. The output video files are stored wherever you configure — typically an S3 bucket in your chosen region. Many development teams treat this as a GDPR-compliant configuration.

It is not. Amazon Web Services, Inc. is a Delaware corporation headquartered in Seattle, Washington. The CLOUD Act (18 U.S.C. § 2713) compels US companies to produce data stored anywhere in the world when ordered by US authorities. A valid government order served on Amazon in Seattle can reach your MediaConvert job templates in Frankfurt, your transcoding logs in CloudWatch, and your output manifests in S3 — all managed by the US legal entity regardless of which AWS region hosts the compute.

This is the same structural US jurisdiction problem documented across the AWS stack: AWS Transcribe, AWS Textract, AWS Rekognition. Video transcoding adds a particularly sensitive dimension: when the source material is healthcare video — surgical recordings, radiology screenings, telehealth session recordings, patient education videos — you are processing special category data under GDPR Art.9 through an AWS-managed service under US legal jurisdiction.

What AWS Elemental MediaConvert Stores About Your Video Processing

MediaConvert is not a stateless transcoding engine. It maintains substantial operational data around every job, queue, template, and preset — and much of that data directly describes the personal data it processes.

Job Templates as Art.30 Processing Activity Records

MediaConvert job templates are reusable transcoding configurations stored persistently in the AWS MediaConvert service. A job template specifies the input source paths, output groups, codec and packaging settings, and destination locations for a transcoding workflow.

Job templates are stored as named resources in the MediaConvert service under your AWS account — managed by the US legal entity. For healthcare workflows, a template named telehealth-patient-session-720p or radiology-archive-lossless documents the existence and nature of the processing activity. Under GDPR Art.30, organizations must maintain records of processing activities; the irony is that these records now exist in a system where they are accessible under CLOUD Act compulsion.

When MediaConvert templates reference S3 input paths like s3://patient-videos/oncology/2026/patient-{id}/session-{date}/, the template definition itself becomes a data architecture document — one stored by a US company under US law.
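One mitigation, whether you stay on MediaConvert or migrate, is to keep raw patient identifiers out of storage paths entirely and hold the mapping locally. A minimal sketch (the key scheme and secret handling are assumptions, not MediaConvert features):

```python
import hmac
import hashlib

def pseudonymous_key(patient_id: str, secret: bytes) -> str:
    # Derive a stable, opaque token so storage keys never carry the raw ID;
    # keep the patient_id -> token mapping in a database under your control.
    token = hmac.new(secret, patient_id.encode(), hashlib.sha256).hexdigest()[:16]
    return f"videos/{token}/source.mp4"
```

A job or template referencing such a key no longer documents who the recording concerns; only your local mapping table can resolve it.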

Transcoding Job Logs as Art.9 Special Category Evidence

Every MediaConvert transcoding job generates execution logs published to Amazon CloudWatch. These logs record input file paths, output destinations, codec settings, timing data, and any errors encountered.

For healthcare video processing, these logs are structured evidence of Art.9 special category data handling. A CloudWatch log entry reading:

INFO: Processing input s3://clinic-videos/dermatology/patient-DE12345/consultation-20260501/recording.mp4
INFO: Output written to s3://cdn-archive/processed/patient-DE12345/recording_720p.m3u8

This log entry documents the processing of healthcare video content linked to an identifiable patient. It exists in CloudWatch — an AWS service under US legal jurisdiction — and by default it never expires (CloudWatch log groups retain data indefinitely unless you configure a retention policy). AWS guidance recommends keeping transcoding logs for operational diagnostics. Under GDPR Art.9(2), processing special category data requires explicit consent or another enumerated legal basis; that basis does not extend to retaining processing evidence in a US-controlled log system indefinitely.
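MediaConvert's own log output cannot be rewritten before it reaches CloudWatch, but in any self-managed pipeline you can redact identifiers before a line hits a log sink. A sketch (the patient-ID path convention mirrors the example above and is purely illustrative):

```python
import re

# Illustrative pattern matching the patient-<ID> path segments shown above;
# adapt it to your own key scheme.
PATIENT_SEGMENT = re.compile(r"patient-[A-Za-z0-9]+")

def redact_log_line(line: str) -> str:
    # Strip direct identifiers before the line reaches any log sink,
    # so retained diagnostics are no longer Art.9 evidence by themselves.
    return PATIENT_SEGMENT.sub("patient-REDACTED", line)
```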

HLS and DASH Output Manifests as Art.17 Erasure Gap

When MediaConvert produces adaptive bitrate streaming output, it generates manifest files: .m3u8 for HLS, .mpd for DASH. These manifests are the index of the transcoded video — they list every segment file, their durations, their quality variants, and their locations.

For a healthcare video archive processed through MediaConvert:

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-STREAM-INF:BANDWIDTH=4000000,RESOLUTION=1920x1080
patient-session-1080p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1500000,RESOLUTION=1280x720
patient-session-720p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=600000,RESOLUTION=854x480
patient-session-480p/index.m3u8

The manifest is a structured pointer to personal data. Under GDPR Art.17 (right to erasure), when a patient requests deletion of their video records, organizations must delete not only the source video and transcoded segments, but also the manifests that reference them. Organizations that delete segment files while leaving manifests intact create broken pointers — but they also leave behind a structured document that proves the video existed, its quality variants, and its segment organization.

MediaConvert creates these manifests as standard output. They are stored in S3 (under US legal jurisdiction if using AWS S3). Even if you move to EU-sovereign storage, MediaConvert's job logs in CloudWatch still reference the manifest paths — creating a shadow record of what was processed and where it was stored.
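On self-managed storage, erasure can treat segments and manifest pointers as one operation. A minimal sketch for a local HLS archive (the directory layout and function name are assumptions):

```python
import shutil
from pathlib import Path

def erase_rendition(archive_root: str, rendition_dir: str,
                    master: str = "master.m3u8") -> None:
    root = Path(archive_root)
    # 1. Delete the segment directory itself
    shutil.rmtree(root / rendition_dir, ignore_errors=True)
    # 2. Rewrite the master playlist so no pointer to the erased rendition
    #    survives (each #EXT-X-STREAM-INF line is followed by the URI of
    #    its variant playlist)
    master_path = root / master
    if not master_path.exists():
        return
    lines = master_path.read_text().splitlines()
    kept, i = [], 0
    while i < len(lines):
        if (lines[i].startswith("#EXT-X-STREAM-INF")
                and i + 1 < len(lines)
                and lines[i + 1].startswith(rendition_dir)):
            i += 2  # drop the stream-info line and its URI together
            continue
        kept.append(lines[i])
        i += 1
    master_path.write_text("\n".join(kept) + "\n")
```

Erasing a whole recording is then a single directory removal; the point of the sketch is that partial erasure must update the manifest in the same step, never leave it behind.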

Watermark Configurations as Art.25 Privacy-by-Design Failure

MediaConvert supports watermarking through the Nielsen watermarking system and through image and text overlays via the imageInserter setting. Watermark configurations stored in MediaConvert job templates and presets can include overlay image sources, placement and opacity parameters, and the identifier payloads embedded into each output.

When watermarks embed viewer or employee identifiers into video output for tracking or rights management purposes, the watermark configuration in MediaConvert service state links specific identifiers to specific video content. For organizations using dynamic watermarking (embedding different identifiers per viewer session), MediaConvert job definitions become a registry of who received which watermarked copy.

Under GDPR Art.25 (data protection by design and by default), watermarking configurations that embed personal identifiers should be designed to minimize personal data in service-managed state. Storing dynamic watermark parameters — linking viewer identities to content copies — in an AWS-managed job template violates the Art.25 principle that personal data be processed with the minimum personal data necessary.

IAM Role Definitions as Art.32 Security Architecture Evidence

MediaConvert operates using an IAM service role that grants it access to your S3 buckets. This service role configuration defines which buckets MediaConvert can read and write, which KMS keys it can use for decryption and encryption, and which operations it is permitted to perform.

For healthcare organizations, the IAM role definition is a security architecture document: it describes exactly which data stores MediaConvert can access, what encryption keys it can use, and what operations it can perform. This configuration exists in AWS IAM — managed by the US legal entity.

Under GDPR Art.32, organizations must implement appropriate technical measures to ensure security appropriate to the risk. An IAM role granting a US-controlled service access to healthcare video storage is a security boundary that does not remain fully within EU legal jurisdiction. A compelled-disclosure order could, in theory, require AWS to use its administrative access to reveal the IAM configuration, the bucket access patterns, and the volumes of video processed.

Reserved Queue Configurations as Art.28 Processor Evidence

MediaConvert offers two queue types: on-demand queues (shared capacity, pay-per-use) and reserved queues (dedicated capacity, monthly commitment). Reserved queue configurations document the committed transcoding capacity, the commitment term, and, through queue names and usage patterns, the workloads they serve.

Reserved queues are purchased capacity commitments registered as resources in the MediaConvert service. The queue name, capacity, and usage patterns document your video processing volume and the nature of the workload. For healthcare organizations, a reserved queue named for the clinical use case documents the processing relationship between your organization and AWS.

Under GDPR Art.28, when a controller uses a processor (AWS as MediaConvert processor), there must be a Data Processing Agreement. AWS's standard DPA covers the general processor relationship, but the specific evidence of processing — including queue configurations documenting healthcare video volume — is held in AWS service state under US jurisdiction. Art.28(3)(h) requires processors to make available all information necessary to demonstrate compliance; information held under a US legal framework that permits government compelled disclosure without notice creates a structural tension with this requirement.

What GDPR Requires for Video Transcoding of Special Category Data

When source video contains special category data under Art.9 — healthcare recordings, footage of medical procedures, telehealth sessions, disability-related content — the GDPR requirements for the transcoding infrastructure are heightened:

Art.9(1) prohibition and Art.9(2) exceptions: Processing special category data is prohibited unless one of the Art.9(2) exceptions applies. For healthcare video, the most common basis is Art.9(2)(h) (medical purposes, healthcare management) combined with Art.9(3) (processing by healthcare professionals under national law). These bases authorize the medical treatment; they do not authorize processing by a US technology company under US legal jurisdiction.

Art.32 appropriate technical measures: Security measures must be appropriate to the risk. For Art.9 special category data, the risk level is high. Processing healthcare video through infrastructure where a US government order can compel access without notice to the data subject fails the Art.32 risk assessment for many DPAs in Germany, France, and the Netherlands.

Art.44-49 transfer rules: If CLOUD Act compelled disclosure constitutes a data transfer to a third country (the US), it falls outside Art.44-49 transfer mechanisms. The data subject has not consented to disclosure to US authorities; the transfer is compelled, not contractual. This is the legal gap that makes SCC-based transfer frameworks insufficient for CLOUD Act exposure.

Schrems II implications: The CJEU's Schrems II decision (C-311/18) established that SCCs do not automatically authorize transfers where the legal framework of the third country does not ensure adequate protection. US intelligence surveillance law — FISA Section 702, EO 12333 — was the basis for Schrems II. The CLOUD Act is a different statutory authority with the same structural outcome: US legal jurisdiction reaching data in European AWS regions.

EU-Native Video Transcoding Alternatives

FFmpeg on EU Infrastructure

FFmpeg is the workhorse of open-source video transcoding and can reproduce the common MediaConvert output formats. Running FFmpeg directly on EU-sovereign compute eliminates all service-state exposure:

# Install FFmpeg on Hetzner CX21 (EU-based)
apt-get update && apt-get install -y ffmpeg

# Create HLS adaptive bitrate output (equivalent to MediaConvert HLS preset)
mkdir -p output_0 output_1 output_2
ffmpeg -i input.mp4 \
  -map 0:v -map 0:a -map 0:v -map 0:a -map 0:v -map 0:a \
  -s:v:0 1920x1080 -b:v:0 4000k \
  -s:v:1 1280x720 -b:v:1 1500k \
  -s:v:2 854x480 -b:v:2 600k \
  -c:v libx264 -c:a aac \
  -var_stream_map "v:0,a:0 v:1,a:1 v:2,a:2" \
  -master_pl_name master.m3u8 \
  -f hls -hls_time 6 -hls_list_size 0 \
  -hls_segment_filename "output_%v/segment_%03d.ts" \
  output_%v/index.m3u8

No job templates stored in external service state. No CloudWatch logs. No IAM roles granting external service access to your buckets. The transcoding logic lives in your infrastructure, committed to your version control system.

For job orchestration, use a simple task queue (Celery + Redis on EU infrastructure, or a PostgreSQL-backed job table) instead of MediaConvert's queuing system:

# celery_tasks.py — EU-sovereign transcoding queue
import os
import subprocess
from celery import Celery

app = Celery('transcoding', broker='redis://localhost:6379/0')

@app.task
def transcode_to_hls(input_path: str, output_dir: str, job_id: str):
    os.makedirs(output_dir, exist_ok=True)
    cmd = [
        'ffmpeg', '-i', input_path,
        '-map', '0:v', '-map', '0:a',
        '-s:v', '1280x720', '-b:v', '1500k',
        '-c:v', 'libx264', '-c:a', 'aac',
        '-hls_time', '6', '-hls_list_size', '0',
        '-hls_segment_filename', f'{output_dir}/segment_%03d.ts',
        f'{output_dir}/index.m3u8'
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    # Log locally — no CloudWatch, no external service state
    os.makedirs('/var/log/transcoding', exist_ok=True)
    with open(f'/var/log/transcoding/{job_id}.log', 'w') as f:
        f.write(result.stdout + result.stderr)
    return result.returncode

All job logs stay on your EU infrastructure. Retention is controlled by your log rotation policy. Right-to-erasure means deleting local files — no shadow records in AWS service state.

Shaka Packager for DASH Packaging

For DASH streaming output (common in enterprise video platforms), Shaka Packager provides MediaConvert-equivalent packaging on EU infrastructure:

# Install Shaka Packager
wget https://github.com/shaka-project/shaka-packager/releases/latest/download/packager-linux-x64
chmod +x packager-linux-x64

# Package pre-transcoded video into DASH + HLS (MediaConvert equivalent)
./packager-linux-x64 \
  'in=video_1080p.mp4,stream=video,output=video_1080p_dash.mp4' \
  'in=video_720p.mp4,stream=video,output=video_720p_dash.mp4' \
  'in=audio.mp4,stream=audio,output=audio_dash.mp4' \
  --mpd_output manifest.mpd \
  --hls_master_playlist_output master.m3u8

Shaka Packager runs as a stateless binary. No service registration, no cloud account, no external job state. It can be containerized and deployed on any EU compute provider: Hetzner, OVHcloud, IONOS, Scaleway.

HandBrake CLI for Batch Processing

For organizations with simpler transcoding requirements — converting uploaded videos to a standard format without adaptive bitrate — HandBrake CLI offers a straightforward MediaConvert alternative:

# HandBrake batch transcoding on EU server
for input_file in /incoming/*.mp4; do
    filename=$(basename "$input_file" .mp4)
    HandBrakeCLI \
        --input "$input_file" \
        --output "/processed/${filename}_720p.mp4" \
        --preset "Fast 720p30" \
        --encoder x264 \
        --quality 22
done

HandBrake is particularly suitable for healthcare video archives where the requirement is format normalization rather than adaptive streaming: converting DICOM-adjacent video formats, surgical recording formats (MPEG-2, H.264 from OR cameras), and telehealth recording exports.

Jellyfin for Healthcare Video Archives

For organizations building patient video archives with playback capability — the complete MediaConvert + CloudFront + S3 stack — Jellyfin provides a self-hostable alternative:

# docker-compose.yml — Jellyfin on Hetzner (EU-sovereign)
version: '3'
services:
  jellyfin:
    image: jellyfin/jellyfin
    container_name: jellyfin
    volumes:
      - /opt/jellyfin/config:/config
      - /opt/jellyfin/cache:/cache
      - /mnt/patient-video:/media:ro
    ports:
      - "8096:8096"
    environment:
      - JELLYFIN_PublishedServerUrl=https://video.your-clinic.eu
    restart: unless-stopped

Jellyfin handles transcoding on-the-fly during playback and creates its own HLS segments without MediaConvert. All transcoding happens on your EU server. No job templates, no external queues, no CloudWatch logs.

For healthcare compliance, Jellyfin's access logs stay on your server and can be configured with your retention policy. Combining Jellyfin with Keycloak for authentication and MinIO for storage creates a fully EU-sovereign video archive stack.

GStreamer for Real-Time Streaming

For telehealth platforms requiring real-time transcoding (live session recording and simultaneous playback), GStreamer provides a MediaConvert-equivalent pipeline on EU infrastructure:

# gstreamer_pipeline.py — Real-time HLS output (EU-sovereign)
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib

Gst.init(None)

pipeline = Gst.parse_launch("""
    filesrc location=/incoming/session.mp4 !
    decodebin !
    videoconvert !
    x264enc bitrate=1500 !
    mpegtsmux !
    hlssink
        location=/output/segment%05d.ts
        playlist-location=/output/index.m3u8
        target-duration=6
""")

# Stop cleanly when the stream finishes or errors
bus = pipeline.get_bus()
bus.add_signal_watch()

loop = GLib.MainLoop()
bus.connect('message::eos', lambda *args: loop.quit())
bus.connect('message::error', lambda *args: loop.quit())

pipeline.set_state(Gst.State.PLAYING)
loop.run()
pipeline.set_state(Gst.State.NULL)

GStreamer runs entirely within your EU infrastructure. No AWS service calls. No external job registration. Pipeline configuration is code in your repository, not a named resource in an AWS account.

Migration Guide: From AWS MediaConvert to EU-Sovereign Transcoding

Step 1: Audit your MediaConvert job templates

Export all existing job templates:

aws mediaconvert list-job-templates --region eu-central-1 \
  --query 'JobTemplates[*].{Name:Name,Settings:Settings}' \
  > mediaconvert-templates-export.json

Review each template for input paths that embed personal identifiers, output destinations, the codec and packaging settings you must reproduce, and any watermark or overlay configuration.
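The exported JSON can be scanned mechanically for S3 paths that embed per-person identifiers before deciding what must be reproduced. A sketch (the keyword list is an assumption about your naming conventions):

```python
import re

IDENTIFIER_HINT = re.compile(r"patient|session|user|member", re.IGNORECASE)

def find_identifier_paths(templates: list) -> list:
    # Walk the exported template structures and collect S3 URIs whose
    # path components look like they embed per-person identifiers.
    hits = []

    def walk(node):
        if isinstance(node, dict):
            for value in node.values():
                walk(value)
        elif isinstance(node, list):
            for value in node:
                walk(value)
        elif isinstance(node, str) and node.startswith("s3://") and IDENTIFIER_HINT.search(node):
            hits.append(node)

    for template in templates:
        walk(template)
    return hits
```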

Step 2: Convert MediaConvert presets to FFmpeg commands

Map MediaConvert codec settings to FFmpeg equivalents:

MediaConvert Setting                 FFmpeg Equivalent
H_264 + QVBR quality 8               -c:v libx264 -crf 23
FRAME_RATE: 25                       -r 25
GOP_SIZE: 90                         -g 90
SCENE_CHANGE_DETECT: ENABLED         -sc_threshold 0 (disable for predictable segments)
AAC, 192000 bitrate                  -c:a aac -b:a 192k
HLS_GROUP_SETTINGS, 6s segments      -hls_time 6
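The mapping above can be wrapped in a small helper so presets live in version control rather than AWS service state. A sketch with illustrative defaults:

```python
def ffmpeg_preset_args(crf: int = 23, fps: int = 25, gop: int = 90,
                       audio_kbps: int = 192, hls_time: int = 6) -> list:
    # Mirrors the MediaConvert-to-FFmpeg mapping: QVBR -> CRF, fixed GOP,
    # scene-change detection disabled for predictable HLS segment boundaries.
    return [
        "-c:v", "libx264", "-crf", str(crf),
        "-r", str(fps), "-g", str(gop),
        "-sc_threshold", "0",
        "-c:a", "aac", "-b:a", f"{audio_kbps}k",
        "-hls_time", str(hls_time), "-hls_list_size", "0",
    ]
```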

Step 3: Replace MediaConvert queues with a local job queue

-- PostgreSQL job queue (replaces MediaConvert reserved queues)
CREATE TABLE transcoding_jobs (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    input_path TEXT NOT NULL,
    output_path TEXT NOT NULL,
    preset TEXT NOT NULL,
    status TEXT DEFAULT 'pending',
    created_at TIMESTAMPTZ DEFAULT NOW(),
    started_at TIMESTAMPTZ,
    completed_at TIMESTAMPTZ,
    log_path TEXT,
    error_message TEXT
);

CREATE INDEX ON transcoding_jobs(status, created_at);

This table runs on your EU PostgreSQL instance (Hetzner, OVHcloud, or any EU provider). No AWS account needed. Retention is controlled by your data retention policy. Right-to-erasure means a DELETE statement.
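Workers can claim jobs from this table safely with row locking; FOR UPDATE SKIP LOCKED lets several transcoding workers poll concurrently without double-claiming. A sketch:

```sql
-- Atomically claim the oldest pending job; SKIP LOCKED means concurrent
-- workers skip rows another worker has already locked
UPDATE transcoding_jobs
SET status = 'running', started_at = NOW()
WHERE id = (
    SELECT id
    FROM transcoding_jobs
    WHERE status = 'pending'
    ORDER BY created_at
    FOR UPDATE SKIP LOCKED
    LIMIT 1
)
RETURNING id, input_path, output_path, preset;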

Step 4: Replace CloudWatch transcoding logs with local structured logging

# structured_logger.py — replaces CloudWatch MediaConvert logs
import json
import logging
from datetime import datetime, timezone
from typing import Optional

def log_transcoding_job(job_id: str, input_path: str, output_path: str,
                        status: str, duration_seconds: Optional[float] = None,
                        error: Optional[str] = None):
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "job_id": job_id,
        "input_path": input_path,
        "output_path": output_path,
        "status": status,
        "duration_seconds": duration_seconds,
        "error": error
    }
    # Log locally — configure retention in logrotate or PostgreSQL
    logging.getLogger('transcoding').info(json.dumps(entry))

Local structured logging provides the same operational diagnostics as CloudWatch without the cross-border data exposure.
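Retention then becomes an explicit local policy. A minimal logrotate sketch, assuming the /var/log/transcoding path used above and a 30-day operational window (adjust to your documented retention policy):

```
/var/log/transcoding/*.log {
    daily
    rotate 30
    compress
    missingok
    notifempty
}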

Step 5: Delete MediaConvert resources after migration

After migrating to EU-sovereign transcoding, clean up MediaConvert resources to eliminate residual service-state exposure:

# Delete all job templates
aws mediaconvert list-job-templates --region eu-central-1 \
  --query 'JobTemplates[*].Name' --output text | \
  tr '\t' '\n' | while read name; do
    aws mediaconvert delete-job-template --name "$name" --region eu-central-1
    echo "Deleted template: $name"
done

# Delete presets
aws mediaconvert list-presets --region eu-central-1 \
  --query 'Presets[*].Name' --output text | \
  tr '\t' '\n' | grep -v "^System" | while read name; do
    aws mediaconvert delete-preset --name "$name" --region eu-central-1
    echo "Deleted preset: $name"
done

# Delete reserved queues (after commitment period expires)
aws mediaconvert list-queues --region eu-central-1 \
  --query 'Queues[?Type==`RESERVED`].Name' --output text | \
  tr '\t' '\n' | while read name; do
    echo "Reserved queue: $name — cancel at renewal"
done

The Healthcare Video Compliance Gap

The compliance gap for healthcare video on AWS MediaConvert is not theoretical — it is structural. Consider a German hospital operating a telehealth platform. Patient video sessions are recorded and archived for clinical continuity of care. The hospital uses MediaConvert to transcode recordings into browser-playable HLS format.

Under this architecture, the job templates naming the telehealth workflow, the CloudWatch logs recording each session transcode, the HLS manifests indexing patient recordings, and the reserved queue configuration all sit in AWS service state managed by the US legal entity.

The German hospital is bound by the DSGVO (the GDPR, directly applicable in Germany), the Krankenhauszukunftsgesetz (KHZG), and digital health platform requirements under the DiGA framework. The Bavarian Data Protection Authority (BayLDA) and the Berlin Commissioner for Data Protection and Freedom of Information have both issued guidance that US-controlled cloud services for health data require heightened scrutiny.

Under a CLOUD Act government order served on Amazon, all of the above — the templates, the logs, the manifests, the queue configuration — is accessible to US law enforcement without notification to the hospital or the patient. Art.9 special category data processed under these conditions does not meet the security standard that GDPR Art.32 requires for health data processing.

What EU-Native Infrastructure Changes

Moving video transcoding to EU-sovereign infrastructure changes the threat model:

FFmpeg on Hetzner: A German government order would need to go through Hetzner, a German company subject to German law, the DSGVO, and the Telekommunikationsgesetz (TKG). US authorities cannot directly compel Hetzner to disclose data; the CLOUD Act does not apply to German companies.

Self-hosted Jellyfin on OVHcloud: OVHcloud is a French company. French surveillance law (the Loi de programmation militaire and the Code de la sécurité intérieure) applies to French companies, not US law. A US CLOUD Act order cannot reach OVHcloud.

On-premises FFmpeg transcoding: No cloud provider involved. Government access requires a judicial order under applicable national law with notice to the organization. The Art.32 risk profile is fundamentally different.

EU-sovereign transcoding does not make video processing immune to government access — it ensures that government access follows EU legal procedures with EU procedural protections, rather than US procedures applied extraterritorially.

Summary

AWS Elemental MediaConvert creates six categories of GDPR-relevant service state under US legal jurisdiction:

  1. Job templates document input source paths and output destinations — often containing patient identifiers for healthcare workflows
  2. CloudWatch transcoding logs record every processing operation linked to source video files containing personal data
  3. HLS/DASH output manifests create persistent indexes of personal data locations that outlive Art.17 erasure requests
  4. Watermark configurations may link personal identifiers to specific video copies in AWS-managed job definitions
  5. IAM service role definitions document which data stores the US-controlled service can access
  6. Reserved queue configurations document processing volumes and clinical workflow organization in AWS service state

For healthcare video — telehealth recordings, surgical documentation, radiology, patient education — each of these exposure points implicates Art.9 special category data.

The EU-native alternative stack — FFmpeg on Hetzner or OVHcloud, Shaka Packager for DASH, Jellyfin for archive playback, PostgreSQL for job queuing, structured local logging — provides complete functional equivalence to the MediaConvert stack. No AWS account required. All operational data stays within EU legal jurisdiction. Right-to-erasure is a local DELETE statement, not a cross-service cleanup operation spanning AWS MediaConvert, CloudWatch, and S3.


Running video transcoding on EU-sovereign infrastructure? sota.io helps European development teams deploy containerized workloads — including FFmpeg transcoding pipelines — on EU-region infrastructure with no US cloud provider dependency. Try sota.io free.

EU-Native Hosting

Ready to move to EU-sovereign infrastructure?

sota.io is a German-hosted PaaS — no CLOUD Act exposure, no US jurisdiction, full GDPR compliance by design. Deploy your first app in minutes.