ARTFEED — Contemporary Art Intelligence

STORM Transformer Achieves Exascale Generative Data Assimilation on Frontier Supercomputer

ai-technology · 2026-04-22

Accurate weather and climate forecasting depends on data assimilation, which estimates Earth system states by combining observations with models. Although exascale computing has advanced Earth simulation capabilities, scalable and precise inference of Earth system states remains challenging, restricting uncertainty quantification and extreme-event prediction. A new generative data assimilation framework addresses this bottleneck by reformulating assimilation as Bayesian posterior sampling, replacing traditional forecast-update cycles with compute-dense, GPU-efficient inference. At its core is STORM, a spatiotemporal transformer whose linear-complexity global-attention algorithm overcomes the quadratic scaling of standard attention. On the Frontier supercomputer, the method demonstrated 63% strong-scaling efficiency across 32,768 GPUs while sustaining 1.6 exaFLOPS, and the system scales to 20 billion spatiotemporal tokens.
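
To make the reformulation concrete, here is a toy illustration of assimilation as Bayesian posterior sampling: a Gaussian model background (the prior) is combined with a noisy observation, and the "analysis" state is obtained by drawing samples from the posterior rather than running a deterministic forecast-update cycle. All names and values are illustrative assumptions; this is not STORM's actual sampler.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: background (prior) state and one noisy observation.
mu_b, sig_b = 15.0, 2.0      # background mean / std (e.g. a temperature field value)
sig_o = 1.0                  # observation error std
y = 17.2                     # the observed value

# Conjugate Gaussian update: the posterior p(x | y) is again Gaussian.
prec = 1.0 / sig_b**2 + 1.0 / sig_o**2
mu_a = (mu_b / sig_b**2 + y / sig_o**2) / prec   # analysis (posterior) mean
sig_a = prec**-0.5                               # analysis (posterior) std

# Generative assimilation replaces the closed-form update with posterior
# sampling; in this toy conjugate case we can draw from it directly.
samples = rng.normal(mu_a, sig_a, size=100_000)

print(f"analysis mean ~ {samples.mean():.2f} (exact {mu_a:.2f})")
print(f"analysis std  ~ {samples.std():.2f} (exact {sig_a:.2f})")
```

In a real Earth-system setting the posterior has no closed form, which is where a learned generative model comes in: it is trained to produce samples consistent with both the model prior and the observations.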

Key facts

  • Generative data assimilation framework introduced for Earth system prediction
  • Reformulates assimilation as Bayesian posterior sampling
  • Uses compute-dense, GPU-efficient inference instead of forecast-update cycles
  • STORM transformer uses a linear-complexity global-attention algorithm
  • Overcomes the quadratic scaling of standard attention
  • Achieved 63% strong scaling efficiency on Frontier supercomputer
  • Sustained 1.6 exaFLOPS on 32,768 GPUs
  • Scales to 20 billion spatiotemporal tokens
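
The "linear-complexity global attention" in the list above can be sketched with the generic kernelized-attention construction: replacing softmax(QKᵀ)V with a positive feature map φ lets the KᵀV summary be formed once, giving O(n·d²) cost instead of O(n²·d). This is a standard linear-attention idea, shown here only as a sketch; the summary does not detail STORM's actual algorithm.

```python
import numpy as np

def phi(x):
    # A common positive feature map, elu(x) + 1 (an illustrative choice).
    return np.where(x > 0, x + 1.0, np.exp(np.minimum(x, 0.0)))

def linear_attention(Q, K, V):
    # Q, K, V: (n, d) arrays of queries, keys, values for n tokens.
    Qf, Kf = phi(Q), phi(K)           # feature-mapped queries and keys
    kv = Kf.T @ V                     # (d, d) summary, built once: O(n d^2)
    norm = Qf @ Kf.sum(axis=0)        # (n,) per-token normalizer
    return (Qf @ kv) / norm[:, None]  # total cost linear in n

rng = np.random.default_rng(1)
n, d = 1024, 16                       # toy sizes: n tokens, d-dim head
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
out = linear_attention(Q, K, V)
print(out.shape)                      # (1024, 16)
```

Because the (d, d) summary is independent of sequence length, attention over very long token sequences (the article cites 20 billion spatiotemporal tokens) stays tractable, whereas a dense n×n attention matrix would not fit in memory.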

Entities

Systems

  • Frontier (supercomputer at Oak Ridge National Laboratory)

Sources