2026-04-27·8 min read·sota.io team

Deploy Fortran to Europe — EU Hosting for Scientific Computing Backends in 2026

The code that predicts tomorrow's weather across Europe is written in Fortran. The simulations modelling plasma inside ITER — humanity's attempt to build a working nuclear fusion reactor in the south of France — run in Fortran. The atmospheric models that power Copernicus, the European Union's Earth observation programme, are Fortran programs. When EU climate scientists need to run a century-scale climate simulation, they write it in Fortran and submit it to one of Europe's national supercomputing centres. The language is 69 years old and the European scientific community has never found a reason to replace it.

Fortran was created in 1957 at IBM by John Backus and his team — a genuinely American origin story. But the language's modern form and continuing relevance are substantially a European achievement. The Fortran standards that define the language developers use today — Fortran 90, 95, 2003, 2008, 2018, and 2023 — were shaped in large part by British, German, and Dutch contributors who understood what Europe's scientific computing institutions actually needed.

John Reid, a mathematician who spent his career at the Rutherford Appleton Laboratory (RAL) in Oxfordshire, UK, served as Convener of the ISO/IEC Fortran standards committee for decades. Reid's work on Fortran 90 introduced free-form source, array operations, modules, and derived types — the features that transformed Fortran from a legacy language into a modern one. He continued contributing through Fortran 2003 (object-oriented programming, C interoperability), Fortran 2008 (coarrays for parallel computing), Fortran 2018 (teams, collective subroutines), and Fortran 2023. The language Europeans run on petascale supercomputers today is, in significant part, Reid's design.

Malcolm Cohen of the Numerical Algorithms Group (NAG), based in Oxford, UK, has served as principal editor of the Fortran 2008 and Fortran 2018 standards. NAG was founded in 1970 as a collaboration between British universities to create numerical algorithms for scientific computing — it remains one of the most important institutions in European mathematical software. Cohen's editorial work ensures that the Fortran standard is precise, complete, and implementable. When compiler vendors in Germany or France implement a new Fortran feature, they are working from a document that Cohen and his Oxford colleagues produced.

Europe Runs on Fortran

The evidence that European scientific infrastructure is built on Fortran is not historical — it is operational today.

ECMWF (European Centre for Medium-Range Weather Forecasts) operates the world's leading global weather forecasting model. Their IFS — the Integrated Forecasting System — is several million lines of Fortran accumulated over fifty years of continuous development. ECMWF was founded in 1975 and is supported by 35 member and co-operating European states. Their primary computing facility is at their data centre in Bologna, Italy — EU infrastructure, inside the EU, running Fortran. When Météo-France, the German DWD, or the Italian CMCC need to run ensemble forecasts, the code runs in Bologna. The Copernicus Climate Change Service and Copernicus Atmosphere Monitoring Service, both EU programmes, use ECMWF's Fortran infrastructure as their operational backbone.

DWD — the Deutscher Wetterdienst, Germany's national meteorological service, headquartered in Offenbach am Main — develops and maintains the ICON model (Icosahedral Nonhydrostatic). ICON is one of the world's leading global weather and climate models, and its source code is open-source Fortran. German weather on your phone tomorrow is computed on DWD supercomputers in Offenbach using millions of lines of Fortran. ICON is also used by the Italian CMCC (Centro Euro-Mediterraneo sui Cambiamenti Climatici) and several other EU research institutions. DWD's compute cluster is German infrastructure, German researchers, German open-source Fortran code.

Météo-France, headquartered in Toulouse, is France's national meteorological and climatological service. Their ARPEGE model (Action de Recherche Petite Echelle Grande Echelle) is a global spectral model written in Fortran, co-developed with ECMWF. Météo-France also develops ALADIN and AROME — limited-area Fortran models used by over twenty European national weather services. When the Austrian, Belgian, Czech, or Romanian weather service generates a national forecast, it runs on Fortran code that Toulouse engineers wrote and maintain. Météo-France runs its operational computing at their data centre in Toulouse — French infrastructure, in France, serving the European weather community.

ITER Organization, based in Cadarache, Saint-Paul-lès-Durance, France, is building the world's first industrial-scale nuclear fusion reactor. ITER is an international project — the EU contributes 45% of the cost through Fusion for Energy — but it is physically located in France, operated from France, and the majority of participating institutions are European. Plasma physics simulations at ITER use Fortran extensively. The codes that model magnetic confinement, plasma instabilities, heating systems, and tritium breeding blankets are Fortran programs running on European computing infrastructure. When a German plasma physicist models a disruption scenario, the Fortran executable runs on EU-funded compute in France.

CERN, the European Organisation for Nuclear Research in Geneva, Switzerland, maintains extensive Fortran codebases for particle physics simulations. GEANT4, the particle transport simulation toolkit used by every major particle physics experiment in the world, is written in C++, but it descends directly from GEANT3, a Fortran program, and interoperates with Fortran codes. Legacy simulation codes in CERN's physics programme represent decades of validated Fortran that cannot simply be rewritten. CERN's research output — including the Higgs boson discovery — depends in part on Fortran simulations running on Swiss and EU infrastructure.

Modern Fortran in 2026

Fortran 2018 and 2023 are not legacy standards. They define a modern, capable programming language with features that other languages are only beginning to implement.

Coarrays — introduced in Fortran 2008, extended in 2018 — provide a built-in parallel programming model with the same syntax as array operations. A Fortran program running on 1,000 nodes of an EU supercomputer can coordinate between nodes using coarray syntax without MPI boilerplate. The distributed memory parallelism that climate models need is native to the language.
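A minimal sketch (program and variable names are illustrative, not taken from any production model): each image contributes one value, and a single collective call sums it across all images, with no MPI in the source.

```fortran
program coarray_demo
  implicit none
  integer :: total

  total = this_image()   ! each image (process) contributes its own rank
  call co_sum(total)     ! Fortran 2018 collective: sum across all images
  if (this_image() == 1) then
    print '(A,I0,A,I0)', "images: ", num_images(), ", sum of ranks: ", total
  end if
end program coarray_demo
```

Compiled with gfortran -fcoarray=single this runs as a single image; compiled with -fcoarray=lib against OpenCoarrays, the identical source runs across many nodes.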

Modern array operations make numerical algorithms readable. Fortran's array syntax — A = B + C, MATMUL(A, B), intrinsic functions like SUM, MAXLOC, RESHAPE — is what Python scientists are trying to replicate with NumPy. In Fortran, it is part of the language standard and compiles to BLAS-level performance.
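A toy example (not drawn from any of the codes above) shows the style; each whole-array expression below replaces an explicit loop:

```fortran
program array_demo
  implicit none
  real :: b(3) = [1.0, 2.0, 3.0]
  real :: c(3) = [4.0, 5.0, 6.0]
  real :: a(3)

  a = b + c                                           ! elementwise addition
  print '(A,F6.1)', "sum(a)    = ", sum(a)            ! intrinsic reduction
  print '(A,I0)',   "maxloc(a) = ", maxloc(a, dim=1)  ! index of the maximum
end program array_demo
```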

Interoperability with C and Python (via ISO_C_BINDING and the f2py tool) means that scientific backends can expose their computation through modern interfaces. A Fortran numerical library can be wrapped as a Python module or called from a Go HTTP handler. The computation stays in Fortran; the API layer uses whatever language the team prefers.
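As a sketch (the module and the symbol name gdd_sum are illustrative, not an existing library), a bind(c) function gives an array computation a plain C ABI that ctypes, an f2py wrapper, or a Go HTTP handler can call:

```fortran
module gdd_c_api
  use iso_c_binding, only: c_double, c_int
  implicit none
contains
  ! C signature: double gdd_sum(int n, const double *t_min,
  !                             const double *t_max, double t_base);
  function gdd_sum(n, t_min, t_max, t_base) bind(c, name="gdd_sum") result(gdd)
    integer(c_int), value :: n
    real(c_double), intent(in) :: t_min(n), t_max(n)
    real(c_double), value :: t_base
    real(c_double) :: gdd

    gdd = sum(max((t_min + t_max) / 2.0_c_double - t_base, 0.0_c_double))
  end function gdd_sum
end module gdd_c_api
```

Built with gfortran -shared -fPIC, the resulting library exposes gdd_sum as an ordinary C symbol, which Python's ctypes.CDLL can load directly.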

LFortran — a modern Fortran compiler written in C++ with an LLVM backend — is being developed to enable interactive use, faster compilation, and modern tooling. The European Fortran community is active in its development.

Building a Fortran HTTP Backend

For EU teams that need to expose scientific computations as HTTP APIs — data processing services, model output endpoints, numerical solvers accessible over REST — a Fortran backend with a thin HTTP layer is a practical architecture.

A Thin HTTP Layer via iso_c_binding

The most practical approach for a Fortran HTTP backend is to use a thin C or Fortran HTTP library and expose your computation logic through it:

Project Structure

src/
  main.f90
  compute.f90
Dockerfile
sota.yaml

src/compute.f90 — Scientific computation module

module climate_compute
  use iso_fortran_env, only: real64
  implicit none

  contains

  ! Compute growing degree days for EU agriculture models
  pure function growing_degree_days(t_min, t_max, t_base) result(gdd)
    real(real64), intent(in) :: t_min(:), t_max(:), t_base
    real(real64) :: gdd
    real(real64) :: t_mean(size(t_min))

    t_mean = (t_min + t_max) / 2.0_real64
    gdd = sum(max(t_mean - t_base, 0.0_real64))
  end function

  ! Lapse rate correction for altitude (standard atmosphere)
  pure function altitude_correction(temp_sea, altitude_m) result(temp_corrected)
    real(real64), intent(in) :: temp_sea, altitude_m
    real(real64) :: temp_corrected
    real(real64), parameter :: LAPSE_RATE = 0.0065_real64  ! K/m

    temp_corrected = temp_sea - LAPSE_RATE * altitude_m
  end function

end module climate_compute

src/main.f90 — HTTP server via C interop

program api_server
  use iso_c_binding
  use climate_compute
  implicit none

  ! Minimal HTTP server using libmicrohttpd via C interop
  ! For production, use fortran-lang/http-client or wrap with nginx
  integer :: port = 8080   ! listens on all interfaces (0.0.0.0) in the container

  write(*, '(A,I0)') "Fortran API server starting on port ", port
  call run_server(port)

contains

  subroutine run_server(port)
    integer, intent(in) :: port
    ! Placeholder: a real implementation starts libmicrohttpd's event loop
    ! (MHD_start_daemon) through iso_c_binding and blocks here serving
    ! requests; see the Dockerfile for the libmicrohttpd dependency
    write(*, '(A,I0)') "Server ready on port ", port
    write(*, '(A)') "Compute endpoint at /api/gdd"
  end subroutine

end program api_server

Dockerfile

FROM gcc:13-bookworm AS builder

RUN apt-get update && apt-get install -y \
    gfortran \
    libmicrohttpd-dev \
    cmake \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app
COPY src/ src/

RUN gfortran -O2 -std=f2018 \
    src/compute.f90 src/main.f90 \
    -lmicrohttpd \
    -o api_server

FROM debian:bookworm-slim
RUN apt-get update && apt-get install -y libmicrohttpd12 && rm -rf /var/lib/apt/lists/*
WORKDIR /app
COPY --from=builder /app/api_server .

EXPOSE 8080
CMD ["./api_server"]

sota.yaml

# sota.yaml
build:
  dockerfile: Dockerfile

deploy:
  port: 8080
  region: eu-central
  health_check: /health

env:
  - DATABASE_URL

Deploy with:

sota deploy

sota.io builds the Fortran binary in EU infrastructure, provisions a managed PostgreSQL instance in Frankfurt (same region as your service), and returns a live HTTPS URL — no Kubernetes, no VPCs, no supercomputer access required.

Connecting to PostgreSQL from Fortran

Fortran can connect to PostgreSQL via C interoperability with libpq:

module db_connect
  use iso_c_binding
  implicit none

  ! Opaque handle to the libpq connection object
  type(c_ptr) :: conn = c_null_ptr

  interface
    ! C signature: PGconn *PQconnectdb(const char *conninfo);
    function PQconnectdb(conninfo) bind(c, name="PQconnectdb") result(pgconn)
      import :: c_ptr, c_char
      character(kind=c_char), intent(in) :: conninfo(*)
      type(c_ptr) :: pgconn
    end function
  end interface

contains

  subroutine connect_db(conn_str)
    character(len=*), intent(in) :: conn_str

    conn = PQconnectdb(conn_str // c_null_char)
    ! A full implementation would check PQstatus(conn) before proceeding
    write(*, '(A)') "Connected to PostgreSQL"
  end subroutine

end module db_connect

The DATABASE_URL environment variable set by sota.io is read with:

character(len=512) :: db_url
integer :: status

call get_environment_variable("DATABASE_URL", db_url, status=status)
if (status /= 0) stop "DATABASE_URL not set"

Performance Profile

Fortran's performance characteristics reflect more than sixty years of compiler optimisation research. GFortran, Intel oneAPI Fortran, and NAG Fortran generate highly optimised native code with automatic vectorisation and native parallel execution via coarrays (standardised in Fortran 2008, extended in 2018):

| Metric | Fortran (gfortran -O2) | Python (NumPy) | Java (JVM) | Node.js |
|---|---|---|---|---|
| Dense matrix multiply (1k×1k) | ~8 ms | ~45 ms (BLAS) | ~25 ms | ~180 ms |
| RSS memory (idle) | 10–20 MB | 80–150 MB | 250–350 MB | 60–120 MB |
| Startup time | < 50 ms | 400–800 ms | 3–8 s | 300–600 ms |
| Array operation throughput | ~BLAS | ~BLAS (with NumPy) | 3–5× slower | 15–20× slower |
| Compile time (10k lines) | 2–8 s | N/A | 8–20 s | N/A |

For EU scientific computing workloads — numerical integration, spectral transforms, finite difference schemes, Monte Carlo simulations — Fortran's native compilation produces performance that Python cannot match without dropping to compiled extensions. ECMWF runs IFS on tens of thousands of compute cores because the physics is in Fortran; there is no JVM overhead between the equation and the result.

Deployment Comparison

| Feature | sota.io | Railway | Render | Fly.io |
|---|---|---|---|---|
| EU-native infrastructure | ✓ Frankfurt | ✗ US-default | ✓ Frankfurt | ✓ EU regions |
| GDPR Article 28 DPA | | Partial | Partial | |
| Managed PostgreSQL EU | ✓ same region | | | |
| Fortran/gfortran auto-detect | ✓ via Dockerfile | ✓ via Dockerfile | | |
| Scientific computing context | ✓ EU data centres | | | |
| Data stays in EU | ✓ always | Partial | Partial | |
| Research institution compliance | | | | |

GDPR and EU Data Residency for Scientific Backends

EU research institutions face data sovereignty requirements that go beyond commercial GDPR compliance. Data from ECMWF operational forecasts, DWD observation networks, or ITER plasma diagnostics cannot freely leave EU jurisdiction. The funding frameworks for EU research projects — Horizon Europe, Copernicus, Fusion for Energy — include data management plans that require data to remain in identifiable EU infrastructure.

For EU research teams building scientific computing APIs — climate model endpoints, observational data services, simulation result databases — deploying on EU-native infrastructure is not optional. It is a condition of the grant.

sota.io's infrastructure in Frankfurt, Germany satisfies these requirements. The data stays in Germany. The computations run in Germany. The PostgreSQL instance is in the same Frankfurt region. There are no transatlantic data transfers, no EU–US data transfer agreements to negotiate, and no DPA conversations with US cloud providers.


Fortran did not survive sixty-nine years by accident. It survived because European scientific institutions — ECMWF in Bologna, DWD in Offenbach, Météo-France in Toulouse, ITER in Cadarache — built their most important computations in it and had no reason to rewrite them. The language that predicts tomorrow's weather in Frankfurt, models fusion plasma in the south of France, and runs climate simulations for Copernicus is Fortran. Deploying a Fortran scientific backend to European infrastructure is not a nostalgic choice. It is the natural home for the language that European science has always run on.

Deploy your Fortran service to Europe with sota.io →