ARTFEED — Contemporary Art Intelligence

LMNet: Dense Communication for Language Model Networks

ai-technology · 2026-05-14

A new paper on arXiv (2505.12741v2) introduces LMNet, a framework for language model networks where pre-trained LLMs act as nodes connected by trainable seq2seq modules. Unlike existing systems that communicate through discrete natural language, LMNet enables dense vector exchanges between intermediate nodes, allowing end-to-end gradient optimization and efficient information transfer. The approach bypasses intermediate embedding and de-embedding steps, preserving natural-language input and output only at system boundaries. This design addresses inefficiencies in current multi-model collaboration and test-time reasoning systems, making communication differentiable and learnable from end-task supervision.
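The description above maps naturally onto a small amount of code. Below is a minimal, hypothetical PyTorch sketch of the idea rather than the paper's implementation: frozen transformer blocks stand in for the stripped vertex LLMs, a small trainable module plays the role of a communication edge, and the message passed between nodes stays as dense hidden states instead of decoded text. All class names, dimensions, and hyperparameters are illustrative assumptions.

    # Illustrative sketch only; not the authors' code or architecture details.
    import torch
    import torch.nn as nn

    class FrozenVertexLM(nn.Module):
        """Stand-in for a stripped, frozen LLM node: it can consume either
        token ids or dense vectors and returns hidden states (no de-embedding)."""
        def __init__(self, vocab_size=32000, d_model=512, n_layers=2):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, d_model)
            layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
            self.body = nn.TransformerEncoder(layer, num_layers=n_layers)
            for p in self.parameters():      # vertex LMs stay frozen
                p.requires_grad_(False)

        def forward(self, token_ids=None, dense_inputs=None):
            x = self.embed(token_ids) if dense_inputs is None else dense_inputs
            return self.body(x)              # dense hidden states, never text

    class EdgeSeq2Seq(nn.Module):
        """Trainable edge: maps the upstream node's hidden states into dense
        vectors the downstream node consumes directly."""
        def __init__(self, d_model=512):
            super().__init__()
            self.proj = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)

        def forward(self, upstream_hidden):
            return self.proj(upstream_hidden)

    # Wiring: text enters node A; dense vectors flow A -> edge -> B.
    node_a, node_b, edge = FrozenVertexLM(), FrozenVertexLM(), EdgeSeq2Seq()
    tokens = torch.randint(0, 32000, (1, 16))        # toy input at the boundary
    hidden_a = node_a(token_ids=tokens)
    hidden_b = node_b(dense_inputs=edge(hidden_a))
    print(hidden_b.shape)                            # torch.Size([1, 16, 512])

In a full system, text would be tokenized only at the first node and decoded back to text only after the last one; everything in between remains differentiable.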

Key facts

  • arXiv:2505.12741v2
  • Title: Language Model Networks: Supervision-Efficient Learning through Dense Communication
  • LMNet uses stripped LLMs as vertex modules
  • Trainable seq2seq modules serve as communication edges
  • Enables dense vector exchange between intermediate nodes
  • Preserves natural-language input and output at system boundary
  • Bypasses intermediate embedding and de-embedding
  • Allows end-to-end gradient optimization (see the sketch after this list)
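Because the edges are the only trainable parts and every step is differentiable, an end-task loss computed on the final node's output can update the edges directly. The self-contained toy example below (plain linear layers as stand-ins; all names and values are assumptions, not taken from the paper) shows a gradient passing through a frozen vertex while only the edge receives an update.

    # Toy illustration of end-to-end supervision; not the paper's training code.
    import torch
    import torch.nn as nn

    d = 512
    edge = nn.Linear(d, d)                            # trainable communication edge
    vertex = nn.Linear(d, d).requires_grad_(False)    # frozen stand-in for an LLM node
    optimizer = torch.optim.AdamW(edge.parameters(), lr=1e-4)

    upstream_hidden = torch.randn(1, 16, d)           # hidden states from a previous node
    target = torch.randn(1, 16, d)                    # toy end-task supervision

    prediction = vertex(edge(upstream_hidden))        # dense message -> downstream node
    loss = nn.functional.mse_loss(prediction, target)
    loss.backward()                                   # gradient passes through the frozen node
    optimizer.step()

    print(edge.weight.grad is not None)               # True: the edge gets a learning signal
    print(vertex.weight.grad)                         # None: the vertex LM stays frozen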

Entities

Institutions

  • arXiv

Sources

  • arXiv:2505.12741v2 (https://arxiv.org/abs/2505.12741v2)