ARTFEED — Contemporary Art Intelligence

LAM-PINN: A Compositional Meta-Learning Framework for Physics-Informed Neural Networks

ai-technology · 2026-05-01

A recent preprint on arXiv (2604.26999) presents the Learning-Affinity Adaptive Modular Physics-Informed Neural Network (LAM-PINN), a meta-learning framework that addresses task heterogeneity in physics-informed neural networks (PINNs). PINNs solve partial differential equations (PDEs) by embedding the governing physical laws into their loss functions. In families of parameterized PDEs, differences in coefficients or in boundary and initial conditions define distinct tasks, making it costly to train a separate PINN for each one. Traditional meta-learning approaches typically rely on a single global initialization, which can cause negative transfer, especially when training tasks are limited and the coordinate inputs are feature-scarce. LAM-PINN addresses this by combining explicit PDE parameters with learning-affinity metrics measured during short transfer sessions, enabling more effective cross-task learning while reducing retraining cost and avoiding negative transfer.
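A minimal sketch of the general PINN idea described above (not the paper's specific method): the training loss penalizes both the PDE residual at collocation points and the boundary/initial conditions. The toy ODE u'(x) + u(x) = 0 with u(0) = 1, the closed-form ansatz standing in for a neural network, and the weight `lam` are all illustrative assumptions.

```python
import numpy as np

def pinn_style_loss(params, xs, lam=1.0):
    """Composite PINN-style loss for the toy ODE u'(x) + u(x) = 0, u(0) = 1.

    The ansatz u(x) = a * exp(b * x) stands in for a neural network; a real
    PINN would obtain the derivative via automatic differentiation of the
    network rather than analytically.
    """
    a, b = params
    u = a * np.exp(b * xs)           # candidate solution at collocation points
    du = a * b * np.exp(b * xs)      # its derivative (autodiff in a real PINN)
    residual = du + u                # physics residual: u' + u should be 0
    physics_loss = np.mean(residual ** 2)
    boundary_loss = (a * np.exp(b * 0.0) - 1.0) ** 2   # enforce u(0) = 1
    return physics_loss + lam * boundary_loss

xs = np.linspace(0.0, 1.0, 50)
exact = pinn_style_loss((1.0, -1.0), xs)   # exact solution u = exp(-x): loss 0
wrong = pinn_style_loss((1.0, -0.5), xs)   # violates the ODE: loss > 0
```

In a parameterized family, the coefficient in the ODE (here fixed at 1) would itself vary, and each setting would define a distinct task in the sense the article describes.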

Key facts

  • arXiv preprint 2604.26999 introduces LAM-PINN.
  • LAM-PINN stands for Learning-Affinity Adaptive Modular Physics-Informed Neural Network.
  • PINNs approximate solutions to PDEs by embedding physical laws into the loss function.
  • Within parameterized PDE families, variations in coefficients or boundary/initial conditions define distinct tasks.
  • Training individual PINNs for each task is computationally prohibitive.
  • Existing meta-learning methods rely on a single global initialization.
  • Negative transfer can occur under feature-scarce coordinate inputs and limited training-task availability.
  • LAM-PINN uses PDE parameters and learning-affinity metrics from brief transfer sessions.
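The last two facts can be illustrated with a toy selection loop; the quadratic task loss, the candidate pools, and the function names below are hypothetical stand-ins, not details from the preprint. Each candidate initialization is adapted for a few gradient steps on the new task (a "brief transfer session"), and the one with the lowest post-adaptation loss, i.e. the highest learning affinity, is chosen instead of a single global initialization.

```python
import numpy as np

def adapt(init, target, steps=3, lr=0.3):
    """Run a brief transfer session: a few gradient steps on a toy quadratic
    task loss L(w) = ||w - target||^2, standing in for PDE-residual training.
    Returns the post-adaptation loss."""
    w = np.array(init, dtype=float)
    for _ in range(steps):
        w -= lr * 2.0 * (w - target)   # gradient of the quadratic loss
    return float(np.sum((w - target) ** 2))

def select_initialization(candidates, target):
    """Pick the candidate with the highest learning affinity, here measured
    as the lowest loss after a short transfer session (a simplified stand-in
    for the paper's affinity metric)."""
    losses = [adapt(c, target) for c in candidates]
    return int(np.argmin(losses))

# Two hypothetical initialization pools; the new task sits closer to pool 1,
# so transferring from pool 0 alone (a single global init) would be wasteful.
candidates = [np.array([0.0, 0.0]), np.array([4.0, 4.0])]
new_task_target = np.array([5.0, 5.0])
best = select_initialization(candidates, new_task_target)
```

The design point mirrors the article's claim: measuring affinity per candidate avoids committing every new task to one global initialization, which is what produces negative transfer when tasks are heterogeneous.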

Entities

Institutions

  • arXiv
