ARTFEED — Contemporary Art Intelligence

New Optimization Framework Improves Physics-Informed Neural Network Training

ai-technology · 2026-04-20

A new research paper presents a lightweight, curvature-aware optimization framework for training Physics-Informed Neural Networks (PINNs). The approach targets common failure modes of PINN training: slow convergence, instability, and reduced accuracy, which arise from the anisotropic and rapidly varying geometry of PINN loss landscapes. Using the difference between consecutive gradients as an inexpensive proxy for local geometric change, the framework adds an adaptive predictive correction derived from secant information, with a step-normalized secant curvature indicator controlling the correction's strength. The method is plug-and-play and computationally efficient: it composes with existing first-order optimizers and never forms explicit second-order matrices. Experiments on several partial differential equation benchmarks show marked gains in convergence speed, training stability, and solution accuracy over standard optimizers and strong baselines. The paper is available on arXiv under identifier 2604.15392v1, announced as a cross-listing.
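To make the idea concrete, here is a minimal sketch of a secant-based, curvature-aware correction layered on a plain gradient step. This is a hypothetical illustration of the general technique, not the paper's actual algorithm: the function name, the damping rule `alpha = 1 / (1 + kappa)`, and all constants are assumptions made for the example.

```python
import numpy as np

def secant_curvature_step(params, grad, prev_grad, prev_step, lr=0.01, eps=1e-12):
    """One illustrative curvature-aware update (hypothetical sketch).

    The difference of consecutive gradients acts as a cheap secant proxy for
    local curvature; a step-normalized indicator then scales a predictive
    correction added to an ordinary first-order step. No second-order matrix
    is ever formed.
    """
    y = grad - prev_grad                        # secant: change in the gradient
    step_norm = np.linalg.norm(prev_step) + eps
    kappa = np.linalg.norm(y) / step_norm       # step-normalized curvature indicator
    alpha = 1.0 / (1.0 + kappa)                 # damp the correction where curvature is high
    step = -lr * (grad + alpha * y)             # first-order step plus secant correction
    return params + step, step

# Usage on a simple anisotropic quadratic f(x) = 0.5 * x^T A x
A = np.diag([1.0, 10.0])
x = np.array([1.0, 1.0])
g = A @ x
prev_g, prev_step = g.copy(), -0.01 * g
for _ in range(200):
    x, prev_step = secant_curvature_step(x, g, prev_g, prev_step)
    prev_g, g = g, A @ x
```

Because the correction reuses gradients the base optimizer already computes, the per-step overhead is only a few vector operations, which is what makes this style of method plug-and-play.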

Key facts

  • Physics-Informed Neural Networks (PINNs) often suffer from slow convergence and training instability
  • The proposed framework uses consecutive gradient differences as a proxy for local geometric change
  • A step-normalized secant curvature indicator controls correction strength
  • The method is plug-and-play and computationally efficient
  • It works with existing first-order optimizers without forming second-order matrices
  • Experiments show improvements in convergence speed and solution accuracy
  • The paper addresses challenges from anisotropic loss landscapes in PINNs
  • The research was announced on arXiv with identifier 2604.15392v1

Entities

Institutions

  • arXiv

Sources