Cumulative Memory Recurrent Unit Improves RNN Performance for Low-Power AI
A team of researchers has introduced the Cumulative Memory Recurrent Unit (CMRU) and its relaxed variant, αCMRU, which build upon the Bistable Memory Recurrent Unit (BMRU) to improve ultra-low-power RNNs. Although BMRU was designed for hardware-software co-design with quantized states and hysteresis, it struggled on complex sequential tasks. The authors identify gradient blocking during state updates as a key limitation and address it with a cumulative update formulation that restores gradient flow by acting as a skip-connection through time. Experiments show improved performance and more stable learning. The work targets ultra-low-power applications, aiming to balance power efficiency against sequence-learning capability. The paper is available as arXiv:2605.11855.
Key facts
- CMRU and αCMRU are new RNN variants based on BMRU.
- BMRU was introduced for ultra-low power hardware-software co-design.
- Gradient blocking during state updates was identified as a key limitation of BMRU.
- Cumulative update formulation restores gradient flow via skip-connections.
- Experiments show improved performance on complex sequential tasks.
- The work targets ultra-low power applications.
- Paper available on arXiv:2605.11855.
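The intuition behind a cumulative update can be illustrated with a toy scalar recurrence. The sketch below is an assumption, not the paper's exact equations: it contrasts an additive update h_t = h_{t-1} + tanh(w·x_t + u·h_{t-1}), whose Jacobian contains an identity term at every step (the skip-connection through time), with a conventional replacing update h_t = tanh(w·x_t + u·h_{t-1}), whose gradient can shrink toward zero over long sequences.

```python
import numpy as np

# Hypothetical illustration (not the paper's formulation): compare how the
# gradient of the final state with respect to the initial state behaves
# under a cumulative (additive) update versus a replacing update.

def cumulative_step(h, x, w, u):
    # additive update: the candidate is ADDED to the running state,
    # so dh_t/dh_{t-1} = 1 + (something small) -- an identity skip path
    return h + np.tanh(w * x + u * h)

def replacing_step(h, x, w, u):
    # conventional update: the candidate REPLACES the state,
    # so dh_t/dh_{t-1} = sech^2(.) * u, which can be far below 1
    return np.tanh(w * x + u * h)

def state_gradient(step, T=20, w=0.5, u=0.1, h0=0.0, eps=1e-6):
    # central finite-difference estimate of d h_T / d h_0
    xs = np.linspace(-1.0, 1.0, T)
    def run(h):
        for x in xs:
            h = step(h, x, w, u)
        return h
    return (run(h0 + eps) - run(h0 - eps)) / (2 * eps)

g_cum = state_gradient(cumulative_step)   # stays at or above roughly 1
g_rep = state_gradient(replacing_step)    # decays toward zero with depth
```

In this toy setting the additive path preserves gradient magnitude over 20 steps while the replacing update drives it to nearly zero, which mirrors the gradient-blocking problem the cumulative formulation is said to resolve.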