ARTFEED — Contemporary Art Intelligence

Width Wall: Fundamental Limit for Hypergraph Neural Networks

other · 2026-05-14

A new theoretical framework identifies a fundamental limitation of hypergraph neural networks (HGNNs), termed the Width Wall. The researchers show that HGNN expressivity is governed by the ability to detect and count small structural patterns, formalized via homomorphism densities. These densities generate all continuous hypergraph invariants and organize them into a strict hierarchy indexed by hypertree width. The Width Wall is an architectural limit: no choice of hidden dimension or training procedure allows a fixed-depth HGNN to represent invariants that require patterns of greater hypertree width. The study offers a unified characterization of hypergraph expressivity, with implications for scientific, social, and biological systems modeled as hypergraphs.

Key facts

  • Hypergraph expressivity is governed by detection and counting of small patterns.
  • Homomorphism densities measure how often a structural motif appears in a hypergraph.
  • Homomorphism densities generate all continuous hypergraph invariants.
  • Invariants are organized into a strict hierarchy indexed by hypertree width.
  • The Width Wall is a fundamental architectural limit for HGNNs.
  • No hidden dimension or training procedure can overcome the Width Wall for fixed-depth HGNNs.
  • The framework provides a unified characterization of hypergraph expressivity.
  • The study appears on arXiv with ID 2605.13690.

Entities

Institutions

  • arXiv

Sources