Multi-Timescale Conductance Spiking Networks for Enhanced Temporal Processing
Researchers have introduced multi-timescale conductance spiking networks, a gradient-trainable framework that addresses limitations of spiking neural networks (SNNs) on temporal tasks. The framework shapes the neuron's current-voltage (I-V) curve by tuning fast, slow, and ultra-slow conductances, giving systematic control over excitability and producing rich firing dynamics. The approach can be implemented efficiently in analog circuits and exhibits high activity sparsity, overcoming the trade-off among gradient-based trainability, dynamical richness, and sparsity that constrains state-of-the-art SNNs. The work targets regression tasks, where approximation error, noise, and spike discretization degrade continuous-valued outputs. The paper is available on arXiv under ID 2605.11835.
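To make the core idea concrete, the following is a minimal sketch of a conductance-based neuron whose I-V relation is shaped by fast, slow, and ultra-slow conductances. All constants, the shared reversal potential, and the function name are illustrative assumptions, not the paper's actual parametrization.

```python
import numpy as np

def simulate_neuron(I_ext, dt=1e-3, T=1.0):
    """Euler simulation of an illustrative multi-timescale conductance neuron.

    Hypothetical parametrization for illustration only; the paper's exact
    model and constants are not reproduced here.
    """
    tau_v, v_th = 20e-3, 1.0                 # membrane time constant, spike threshold
    tau_g = np.array([5e-3, 50e-3, 500e-3])  # fast, slow, ultra-slow conductance timescales
    w = np.array([1.0, 0.5, 0.25])           # spike-triggered conductance increments
    e_rev = -1.0                             # shared reversal potential (assumption)
    v = 0.0
    g = np.zeros(3)
    spikes = []
    for step in range(int(T / dt)):
        # The I-V relation is shaped by the summed conductance terms:
        # each conductance pulls the voltage toward e_rev on its own timescale.
        dv = (I_ext - v + g.sum() * (e_rev - v)) / tau_v
        v += dt * dv
        g -= dt * g / tau_g                  # each conductance decays at its own rate
        if v >= v_th:                        # spike: reset voltage, bump all conductances
            spikes.append(step * dt)
            v = 0.0
            g += w
    return np.array(spikes), g
```

Because the slow and ultra-slow conductances accumulate across spikes, a constant input produces progressively longer inter-spike intervals (spike-frequency adaptation), one example of the richer firing regimes that multi-timescale conductances enable.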
Key facts
- Multi-timescale conductance spiking networks are introduced as a gradient-trainable framework.
- Neural dynamics emerge from shaping the I-V curve by tuning fast, slow, and ultra-slow conductances.
- The parametrization allows systematic control over excitability and yields rich firing regimes.
- The framework can be implemented efficiently in analog circuits.
- It addresses limitations in SNNs for temporal processing, particularly in regression tasks.
- State-of-the-art SNNs often rely on simple phenomenological dynamics with surrogate gradients.
- The paper is published on arXiv with ID 2605.11835.
- The approach offers high activity sparsity.
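The surrogate-gradient training that the summary contrasts with can be sketched minimally: the forward pass keeps the non-differentiable spike threshold, while the backward pass substitutes a smooth derivative so gradients can propagate. The function names, the fast-sigmoid form, and the sharpness parameter beta are illustrative assumptions, not details from the paper.

```python
def spike_forward(v, v_th=1.0):
    """Hard spike nonlinearity (Heaviside step): non-differentiable at v_th."""
    return 1.0 if v >= v_th else 0.0

def spike_backward(v, v_th=1.0, beta=10.0):
    """Fast-sigmoid surrogate derivative used in place of the step's gradient.

    Peaked at the threshold and decaying away from it, so backpropagation
    assigns credit to voltages near the firing boundary.
    """
    return beta / (1.0 + beta * abs(v - v_th)) ** 2
```

In practice these two functions are paired inside an autodiff framework's custom-gradient mechanism; the sketch only shows the mismatch between the forward step and its smooth stand-in derivative.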
Entities
Institutions
- arXiv