GNNs and Transformers for Physics Simulations: A Unified Framework
A new arXiv paper (2605.01542) identifies a critical bottleneck in the use of Graph Neural Networks (GNNs) and Transformers as machine learning surrogates for Computational Fluid Dynamics (CFD). While architectures have advanced, training paradigms remain bound to naive assumptions such as node-wise supervision and explicit Euler time-stepping, which ignore the stiff dynamics and local flux continuity inherent to classical discretizations such as the finite element, finite difference, and finite volume methods (FEM, FDM, FVM). The authors propose a unified framework bridging geometric deep learning and numerical analysis, introducing three innovations: (1) Multi Node Prediction, a stencil-level objective enforcing spatial derivative constraints; (2) temporal awareness mechanisms; and (3) integration of rigorous numerical principles into training. The work aims to accelerate physics simulations by overcoming these legacy training choices.
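To make the contrast between node-wise supervision and a stencil-level objective concrete, here is a minimal PyTorch sketch assuming a uniform 1D grid and a central-difference stencil. The function names (node_wise_loss, stencil_level_loss), the weighting alpha, and the exact loss form are illustrative assumptions, not the paper's actual Multi Node Prediction objective.

```python
import torch

def node_wise_loss(pred, target):
    # Conventional per-node supervision: every node is penalized
    # independently, ignoring how neighboring values relate
    # through the discretization stencil.
    return torch.mean((pred - target) ** 2)

def stencil_level_loss(pred, target, dx, alpha=1.0):
    # Hypothetical stencil-level objective in the spirit of the paper's
    # Multi Node Prediction: besides node values, penalize the mismatch
    # of a central-difference approximation of du/dx, so the model is
    # also supervised on local spatial-derivative structure.
    value_term = torch.mean((pred - target) ** 2)
    d_pred = (pred[..., 2:] - pred[..., :-2]) / (2.0 * dx)
    d_true = (target[..., 2:] - target[..., :-2]) / (2.0 * dx)
    derivative_term = torch.mean((d_pred - d_true) ** 2)
    return value_term + alpha * derivative_term

# Usage on a batch of 1D fields (shapes and dx are arbitrary here)
pred = torch.randn(8, 64)
target = torch.randn(8, 64)
loss = stencil_level_loss(pred, target, dx=0.1)
```

The same idea extends to unstructured meshes by replacing the fixed central-difference stencil with derivative operators defined over each node's graph neighborhood.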
Key facts
- arXiv paper 2605.01542 identifies a critical bottleneck in ML surrogates for CFD
- GNNs and Transformers are used for accelerating physics simulations
- Current training paradigms use node-wise supervision and explicit Euler time-stepping
- These choices ignore stiff dynamics and local flux continuity inherent to FEM, FDM, and FVM discretizations (see the stiffness sketch after this list)
- Proposes a unified framework bridging geometric deep learning and numerical analysis
- Introduces Multi Node Prediction as a stencil-level objective
- The stencil-level objective enforces spatial derivative constraints over the local mesh topology
- Aims to improve accuracy and efficiency of physics simulations
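Why explicit Euler time-stepping is problematic for stiff dynamics can be seen on the standard linear test problem du/dt = -λu: the explicit update diverges whenever λ·Δt > 2, while the implicit update remains stable at any step size. The sketch below is a generic numerical-analysis illustration, not code from the paper.

```python
def explicit_euler(u0, lam, dt, steps):
    # u_{n+1} = u_n + dt * f(u_n) with f(u) = -lam * u,
    # i.e. u_{n+1} = (1 - lam*dt) * u_n.
    # The amplification factor |1 - lam*dt| exceeds 1 when lam*dt > 2,
    # so the iteration blows up on stiff problems.
    u = u0
    for _ in range(steps):
        u = u + dt * (-lam * u)
    return u

def implicit_euler(u0, lam, dt, steps):
    # u_{n+1} = u_n + dt * f(u_{n+1})  =>  u_{n+1} = u_n / (1 + lam*dt).
    # Unconditionally stable for this linear test problem.
    u = u0
    for _ in range(steps):
        u = u / (1.0 + lam * dt)
    return u

lam, dt = 100.0, 0.05                      # stiff regime: lam*dt = 5 > 2
print(explicit_euler(1.0, lam, dt, 50))    # diverges: (-4)**50, ~1e30
print(implicit_euler(1.0, lam, dt, 50))    # decays toward 0, like exp(-lam*t)
```

A surrogate trained only to imitate explicit Euler rollouts inherits this step-size restriction, which is one motivation for the paper's temporal awareness mechanisms.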