MambaCSP: Hybrid-Attention SSM for Efficient Channel State Prediction
Researchers propose MambaCSP, a hybrid-attention state space model (SSM) for hardware-efficient channel state prediction (CSP) in wireless communications. The model replaces transformer and large language model (LLM) backbones, which scale quadratically with sequence length, with a linear-time Mamba backbone. To compensate for the local-only dependency modeling of pure SSMs, lightweight patch-mixer attention layers periodically inject cross-token attention. The aim is to cut computational cost, memory consumption, and inference latency, making CSP feasible for real-time and resource-constrained deployments. The paper is available on arXiv under reference 2604.21957.
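For intuition, the sketch below shows one plausible way such a hybrid stack could be wired in PyTorch: linear-time sequence-mixing blocks interleaved with a lightweight patch-level attention layer every few blocks. All names (`DiagonalSSM`, `PatchMixerAttention`, `HybridCSPModel`), dimensions, and the interleaving period are illustrative assumptions; the diagonal gated recurrence is only a stand-in for a full selective-SSM (Mamba) block, not the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DiagonalSSM(nn.Module):
    """Toy linear-time sequence mixer: a diagonal gated recurrence.
    Stands in for a full selective-SSM (Mamba) block; the real block
    adds input-dependent (selective) state dynamics."""
    def __init__(self, d_model):
        super().__init__()
        self.in_proj = nn.Linear(d_model, d_model)
        self.gate = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x):                       # x: (batch, seq, d_model)
        u = self.in_proj(x)
        a = torch.sigmoid(self.gate(x))         # per-step decay in (0, 1)
        h = torch.zeros_like(u[:, 0])
        outs = []
        for t in range(u.size(1)):              # O(L) sequential scan
            h = a[:, t] * h + (1 - a[:, t]) * u[:, t]
            outs.append(h)
        return self.out_proj(torch.stack(outs, dim=1))

class PatchMixerAttention(nn.Module):
    """Lightweight attention over non-overlapping sequence patches,
    injecting cross-token mixing at coarse granularity (assumed design)."""
    def __init__(self, d_model, patch_len=4, n_heads=2):
        super().__init__()
        self.patch_len = patch_len
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x):                       # x: (batch, seq, d_model)
        b, L, d = x.shape
        p = self.patch_len
        pad = (-L) % p                          # pad seq to a patch multiple
        if pad:
            x = F.pad(x, (0, 0, 0, pad))
        patches = x.reshape(b, -1, p, d).mean(dim=2)   # one token per patch
        mixed, _ = self.attn(patches, patches, patches)
        mixed = mixed.repeat_interleave(p, dim=1)      # broadcast to tokens
        return (x + mixed)[:, :L]

class HybridCSPModel(nn.Module):
    """Interleaves linear-time SSM blocks with a patch-mixer attention
    layer every `period` blocks; depth and period are illustrative."""
    def __init__(self, d_model=64, n_blocks=6, period=3):
        super().__init__()
        layers = []
        for i in range(n_blocks):
            layers.append(DiagonalSSM(d_model))
            if (i + 1) % period == 0:
                layers.append(PatchMixerAttention(d_model))
        self.layers = nn.ModuleList(layers)
        self.head = nn.Linear(d_model, d_model)  # one-step CSI prediction

    def forward(self, x):
        for layer in self.layers:
            x = x + layer(x)                    # residual connections
        return self.head(x[:, -1])              # predict from last token
```

Feeding a `(batch, seq_len, d_model)` CSI embedding through `HybridCSPModel()` returns a one-step prediction. The key property the sketch illustrates is that the SSM blocks cost grows linearly with `seq_len`, while the attention cost is kept small by operating on patches rather than individual tokens.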
Key facts
- MambaCSP is a hybrid-attention SSM architecture for channel state prediction.
- It replaces LLM-based prediction backbones with a linear-time Mamba model.
- Lightweight patch-mixer attention layers periodically inject cross-token attention.
- The model addresses the quadratic scaling of transformers and LLMs in sequence length (see the complexity sketch after this list).
- It targets real-time and resource-constrained wireless deployments.
- The paper is published on arXiv with ID 2604.21957.
- Selective state space models are investigated as hardware-efficient alternatives.
- The work focuses on capturing long-range temporal dependencies in CSI sequences.
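As a rough illustration of the quadratic-versus-linear point above, the snippet below compares back-of-envelope operation counts for self-attention, O(L² · d), against an SSM scan, O(L · d · n). The width `d`, state size `n`, and sequence lengths are assumed for illustration, not taken from the paper.

```python
# Back-of-envelope sequence-mixing operation counts (illustrative only).
d, n = 256, 16                    # model width and SSM state size (assumed)
for L in (128, 512, 2048, 8192):
    attn_ops = L * L * d          # pairwise token interactions: O(L^2 * d)
    ssm_ops = L * d * n           # one state update per token: O(L * d * n)
    print(f"L={L:5d}  attention≈{attn_ops:.2e}  ssm≈{ssm_ops:.2e}  "
          f"ratio={attn_ops / ssm_ops:.0f}x")
```

The gap grows linearly with L, which is the basis for the latency and memory claims on long CSI sequences.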