ARTFEED — Contemporary Art Intelligence

Goldstone Modes Enable Deep Signal Propagation in Neural Networks

other · 2026-05-16

A new study posted to arXiv (2605.14685) demonstrates that deep neural networks with continuous symmetry equivariance can support Goldstone-like modes, enabling coherent signal propagation across depth and recurrent iterations without architectural stabilizers such as residual connections or normalization. The researchers show, analytically and empirically, that these degrees of freedom improve trainability and representational diversity in feedforward networks and enhance long-term memory in recurrent settings by carrying information across iterations. The work draws an analogy to spontaneous symmetry breaking in physics, where Goldstone modes transmit information over long distances and times.
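The core idea can be illustrated with a toy example (a hand-rolled sketch, not the paper's actual code or architecture): a deep stack of complex-valued layers whose nonlinearity acts only on the modulus is equivariant under a global U(1) phase rotation. The phase, the analogue of a Goldstone mode, then propagates undistorted through arbitrary depth, with no residual connections or normalization.

```python
import numpy as np

rng = np.random.default_rng(0)
depth, width = 50, 32

# Random complex weights, scaled for stability (assumption: Gaussian init).
Ws = [
    (rng.standard_normal((width, width)) + 1j * rng.standard_normal((width, width)))
    / np.sqrt(2 * width)
    for _ in range(depth)
]

def layer(z, W):
    """Phase-equivariant layer: the nonlinearity squashes only the modulus,
    so a global rotation z -> exp(i*theta) * z commutes with the layer."""
    pre = W @ z
    mod = np.abs(pre)
    return np.tanh(mod) * pre / np.maximum(mod, 1e-12)

def forward(z):
    for W in Ws:
        z = layer(z, W)
    return z

z0 = rng.standard_normal(width) + 1j * rng.standard_normal(width)
theta = 0.7

out_a = forward(z0)
out_b = forward(np.exp(1j * theta) * z0)

# The symmetry (Goldstone) direction survives all 50 layers intact:
# rotating the input phase rotates the output phase by exactly theta.
print(np.allclose(out_b, np.exp(1j * theta) * out_a))  # True
```

Even though the tanh squashes amplitudes layer after layer, the phase offset between the two runs is preserved exactly across the full depth, which is the flavor of coherent propagation the paper attributes to Goldstone-like modes.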

Key facts

  • arXiv paper 2605.14685 studies deep neural networks with continuous symmetry equivariance.
  • Goldstone-like modes enable coherent signal propagation across depth and recurrent iterations.
  • Mechanism works without residual connections or normalization.
  • Feedforward networks show improved trainability and representational diversity.
  • Recurrent settings benefit from long-term memory via Goldstone modes.
  • Analogy to spontaneous symmetry breaking in physics.
  • Both analytical and empirical evidence provided.

Entities

Institutions

  • arXiv
