New Algorithm for Training Neural Networks with Linear Constraints
Researchers have introduced an efficient algorithm for end-to-end training of deep neural networks with linear constraints imposed on selected layer outputs. The key innovation is the HS-Jacobian, an efficiently computable generalized derivative for layers that project onto polyhedral sets; it is proven to be a conservative mapping, which overcomes the nonsmoothness bottleneck in backpropagation through such projection layers. This enables standard optimizers such as Adam to train constrained networks end to end. The work provides rigorous convergence guarantees and is detailed in arXiv preprint 2605.11526.
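To make the idea concrete, here is a minimal sketch of a projection layer with a hand-picked generalized Jacobian, written in PyTorch. It uses a box constraint, the simplest polyhedral set, in place of a general polyhedron {z : Az <= b}; the BoxProjection class and its 0/1 diagonal backward pass are illustrative assumptions for this sketch, not the paper's HS-Jacobian construction.

```python
import torch

# Illustrative sketch (an assumption, not the paper's HS-Jacobian): a layer
# projecting onto the box {z : lo <= z <= hi}, the simplest polyhedral set,
# with a generalized Jacobian supplied for backpropagation.
class BoxProjection(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, lo, hi):
        # Euclidean projection onto the box: clip each coordinate.
        y = x.clamp(min=lo, max=hi)
        # Coordinates strictly inside the box, where the projection is
        # locally the identity map (the nonsmoothness sits on the boundary).
        inactive = (x > lo) & (x < hi)
        ctx.save_for_backward(inactive)
        return y

    @staticmethod
    def backward(ctx, grad_out):
        (inactive,) = ctx.saved_tensors
        # One valid element of the projection's generalized Jacobian:
        # a diagonal matrix with 1 on inactive coordinates, 0 on active ones.
        return grad_out * inactive, None, None
```

For a general polyhedron the projection has no closed form and the Jacobian structure is more involved, which is exactly the gap the HS-Jacobian is introduced to fill.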
Key facts
- The method enables training deep neural networks with linear constraints imposed on selected layer outputs.
- The HS-Jacobian is introduced as an efficiently computable generalized derivative for projection layers.
- The HS-Jacobian is proven to be a conservative mapping for the projection onto polyhedral sets.
- This allows the projection layer to be integrated into a nonsmooth automatic-differentiation framework for backpropagation.
- Standard optimizers such as Adam can then be applied for end-to-end training (see the training sketch after this list).
- The research is published as arXiv:2605.11526v1.
- The work addresses a key difficulty in the theory and algorithms of backpropagation through nonsmooth layers.
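To illustrate the end-to-end training claim, a hypothetical usage sketch: it reuses the illustrative BoxProjection layer from the sketch above to constrain a small network's output to [0, 1], and optimizes with Adam through the projection's generalized Jacobian. The network shape, data, and hyperparameters are all assumptions made for this example.

```python
import torch
import torch.nn.functional as F

# Hypothetical training loop (all names and hyperparameters are illustrative):
# the final output passes through the BoxProjection layer defined above, so it
# always lies in the polyhedral set [0, 1]^2, and Adam trains end to end
# because the projection exposes a usable Jacobian to autograd.
torch.manual_seed(0)
net = torch.nn.Sequential(
    torch.nn.Linear(4, 16),
    torch.nn.ReLU(),
    torch.nn.Linear(16, 2),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x, target = torch.randn(32, 4), torch.rand(32, 2)

for step in range(200):
    out = BoxProjection.apply(net(x), 0.0, 1.0)  # constrained layer output
    loss = F.mse_loss(out, target)
    opt.zero_grad()
    loss.backward()
    opt.step()
```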