ARTFEED — Contemporary Art Intelligence

First Theoretical Framework for Spiking Transformers Establishes Expressivity and Efficiency

ai-technology · 2026-04-20

A new theoretical model for spiking transformers shows that they can match the accuracy of conventional transformers while running 38 to 57 times more energy-efficiently on neuromorphic hardware. The study, posted to arXiv under identifier 2604.15769v1, presents the first comprehensive expressivity theory for spiking self-attention, proving that spiking attention built from Leaky Integrate-and-Fire (LIF) neurons can universally approximate continuous permutation-equivariant functions.

The paper supports the theory with explicit spike circuit constructions, including a novel lateral inhibition network that performs softmax normalization and converges at rate O(1/√T) in the number of time steps T. It also establishes tight lower bounds on spike counts via rate-distortion theory, showing that ε-approximation requires Ω(L_f² nd/ε²) spikes, where L_f denotes the target function's Lipschitz constant, n the sequence length, and d the embedding dimension. Complementary input-dependent bounds, governed by effective dimensions measured at 47 to 89 on datasets such as CIFAR and ImageNet, explain why as few as T=4 time steps often suffice in practice despite the pessimistic worst-case bound. Together, these results close a significant gap in the theoretical foundations of energy-efficient spiking models.
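
To make the rate-coding intuition concrete, below is a minimal, illustrative Python sketch of LIF spike encoding and spike-coincidence attention scores. It is not the paper's circuit: the hard-reset dynamics, the time constant tau, the threshold, and all function names are assumptions for illustration, and the lateral inhibition softmax network is only gestured at in a comment.

    import numpy as np

    def lif_spike_train(x, T=64, tau=2.0, v_th=1.0):
        """Rate-encode a non-negative array x as spike trains over T time steps
        using leaky integrate-and-fire dynamics with hard reset (illustrative)."""
        v = np.zeros_like(x, dtype=float)
        spikes = np.zeros((T,) + x.shape)
        for t in range(T):
            v = v * (1.0 - 1.0 / tau) + x      # leaky integration of input current
            fired = v >= v_th                  # threshold crossing
            spikes[t] = fired.astype(float)
            v = np.where(fired, 0.0, v)        # hard reset after each spike
        return spikes

    def spiking_attention_scores(q, k, T=64):
        """Approximate Q·K^T attention scores by counting coincident spikes.
        As T grows, the time-averaged coincidence count loosely tracks the
        underlying dot product, the flavor of the paper's O(1/sqrt(T)) result."""
        sq = lif_spike_train(q, T)             # (T, n, d) query spike trains
        sk = lif_spike_train(k, T)             # (T, n, d) key spike trains
        # Coincidence count per (query, key) pair, averaged over time steps.
        # A lateral inhibition network would then normalize each row to
        # approximate the softmax; here we stop at the raw scores.
        return np.einsum('tqd,tkd->qk', sq, sk) / T

    # Usage: random non-negative queries/keys, n=4 tokens, d=8 dimensions.
    rng = np.random.default_rng(0)
    q = rng.uniform(0.0, 1.0, size=(4, 8))
    k = rng.uniform(0.0, 1.0, size=(4, 8))
    print(spiking_attention_scores(q, k, T=256))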

Key facts

  • Spiking transformers achieve competitive accuracy with conventional transformers.
  • They run 38-57× more energy-efficiently on neuromorphic hardware.
  • The paper establishes the first comprehensive expressivity theory for spiking self-attention.
  • Spiking attention with Leaky Integrate-and-Fire neurons is a universal approximator of continuous permutation-equivariant functions.
  • Explicit spike circuit constructions include a novel lateral inhibition network for softmax normalization.
  • The lateral inhibition softmax circuit is proven to converge at rate O(1/√T) in the number of time steps T.
  • Tight spike-count lower bounds derived via rate-distortion theory show that ε-approximation requires Ω(L_f² nd/ε²) spikes.
  • Input-dependent effective dimensions of 47-89, measured on CIFAR and ImageNet, explain why T=4 time steps often suffice (see the worked comparison after this list).
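
As a rough illustration of these two bounds, the comparison below restates them in LaTeX. Note that substituting the effective dimension for nd in the input-dependent case is an assumption based on this summary, not a formula quoted from the paper.

    % Worst-case bound quoted in the summary:
    N_{\mathrm{spikes}} = \Omega\!\left( \frac{L_f^2 \, n d}{\varepsilon^2} \right)
    % Assumed input-dependent refinement via the effective dimension d_eff:
    N_{\mathrm{spikes}}^{\mathrm{eff}} = \Omega\!\left( \frac{L_f^2 \, d_{\mathrm{eff}}}{\varepsilon^2} \right),
    \qquad d_{\mathrm{eff}} \approx 47\text{--}89 \ll n d

For a hypothetical ViT-scale input with n = 196 tokens and d = 384 dimensions, nd exceeds 75,000, so an effective dimension below 90 would cut the required spike budget by nearly three orders of magnitude, consistent with very small T sufficing in practice.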

Entities

Institutions

  • arXiv

Sources

  • arXiv:2604.15769v1