Delta Variances: Efficient Epistemic Uncertainty for Neural Networks
Delta Variances is a new family of algorithms for efficient epistemic uncertainty quantification in large neural networks. The approach requires only a single additional gradient computation, demands no changes to architecture or training, and applies to general functions composed of neural networks, such as weather simulators with neural-network-based step functions. Empirical results show competitive performance. The paper presents multiple theoretical derivations and unifies popular techniques under a common framework, with several of them recovered as special cases.
Key facts
- Delta Variances is a family of algorithms for epistemic uncertainty quantification.
- It is computationally efficient and convenient to implement.
- Requires no changes to neural network architecture or training procedure.
- Can be applied to neural networks and more general functions composed of neural networks.
- Example application: weather simulator with neural-network-based step function.
- Empirically obtains competitive results at the cost of a single gradient computation.
- Paper discusses multiple theoretical derivations.
- Special cases recover popular techniques.
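The "single gradient computation" in the facts above suggests a delta-method-style estimate: propagate an (approximate) posterior covariance over parameters through the network's first-order Taylor expansion, so that Var[f(θ)] ≈ gᵀ Σ g with g = ∇f(θ). The sketch below illustrates that idea only; the toy function, the diagonal covariance, and all names are illustrative assumptions, not the paper's exact algorithm.

```python
import math

def f(theta):
    # Toy stand-in for a scalar network output; illustrative assumption.
    return math.sin(theta[0]) + theta[1] ** 2

def gradient(func, theta, eps=1e-6):
    # One (numerical) gradient at the trained parameters theta.
    g = []
    for i in range(len(theta)):
        plus, minus = list(theta), list(theta)
        plus[i] += eps
        minus[i] -= eps
        g.append((func(plus) - func(minus)) / (2 * eps))
    return g

theta = [0.3, -1.2]            # trained parameter values (toy)
sigma_diag = [0.05, 0.02]      # assumed per-parameter posterior variances

g = gradient(f, theta)
# Delta-method variance: g^T diag(sigma) g for a diagonal covariance.
delta_variance = sum(gi ** 2 * si for gi, si in zip(g, sigma_diag))
print(delta_variance)
```

With a diagonal covariance the estimate reduces to a weighted sum of squared gradient entries, which is why the per-input cost stays at one gradient evaluation regardless of how the parameter covariance was obtained.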