Fast-Slow Recurrent Model Improves Long-Horizon Sequential Learning
A new machine learning method interleaves fast recurrent latent updates with slow observation updates to improve long-horizon sequential modeling. The approach enables stable internal structures that evolve with the input, producing coherent and clustered representations over extended sequences. It outperforms LSTM, state space models, and Transformer variants on reinforcement learning and algorithmic tasks, particularly in out-of-distribution generalization. The work extends latent recurrent modeling to sequential input streams, leveraging self-organization across the two update frequencies.
Key facts
- Method interleaves fast recurrent latent updates with slow observation updates.
- Facilitates learning of stable internal structures that evolve alongside input.
- Maintains coherent and clustered representations over long horizons.
- Improves out-of-distribution generalization in reinforcement learning and algorithmic tasks.
- Outperforms LSTM, state space models, and Transformer variants.
- Extends latent recurrent modeling to sequential input streams.
- Leverages self-organization between the two update frequencies.
- Published on arXiv under Computer Science > Machine Learning.
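The fast-slow interleaving described above can be sketched as a nested loop: each incoming observation triggers one slow update of the latent state, followed by several fast latent-only refinement steps. This is a minimal illustrative sketch, not the paper's actual architecture; the function names, update rules (simple `tanh` recurrences), and the fast-step count `k_fast` are all assumptions.

```python
import numpy as np

def fast_slow_step(h, x, W_fast, W_slow, k_fast=4):
    """One slow step: absorb observation x, then run k_fast fast latent updates.

    Hypothetical sketch; the paper's actual update rules and
    parameterization are not specified here.
    """
    # Slow update: incorporate the new observation into the latent state.
    h = np.tanh(W_slow @ np.concatenate([h, x]))
    # Fast updates: refine the latent state with no new input,
    # letting internal structure self-organize between observations.
    for _ in range(k_fast):
        h = np.tanh(W_fast @ h)
    return h

rng = np.random.default_rng(0)
d_h, d_x = 8, 3  # latent and observation dimensions (arbitrary choices)
W_slow = rng.normal(scale=0.3, size=(d_h, d_h + d_x))
W_fast = rng.normal(scale=0.3, size=(d_h, d_h))

h = np.zeros(d_h)
for x in rng.normal(size=(5, d_x)):  # a short observation stream
    h = fast_slow_step(h, x, W_fast, W_slow)
```

The key design point is the frequency separation: the slow loop runs once per observation, while the fast loop runs several latent-only iterations in between, which is what allows internal representations to settle into stable, clustered configurations over long horizons.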
Entities
Institutions
- arXiv