Mahalanobis Distance Variance for OOD Detection Under Neural Collapse
A recent arXiv paper (2605.14413) proposes the variance of class-wise Mahalanobis distances as an out-of-distribution (OOD) detection score. The authors observe that an in-distribution (ID) sample lies sharply closer to its own class than to any other, so its distances across classes have high variance, while OOD samples, lacking a single nearby class, show lower variance. They ground this empirical observation in Neural Collapse geometry, showing that under relaxed assumptions of within-class compactness and inter-class separation, high cross-class distance variance is structurally expected for ID samples. The result is a theoretically motivated OOD detection approach.
Key facts
- arXiv paper 2605.14413 proposes OOD detection via class-wise Mahalanobis distance variance.
- ID samples exhibit a sharp minimum: the distance to the true class is much smaller than to the others, yielding high variance across classes.
- OOD samples lack such a sharp minimum and exhibit lower variance across classes.
- Theoretical analysis grounds the observation in Neural Collapse geometry.
- Relaxed Neural Collapse assumptions on within-class compactness and inter-class separation are used.
- High class-wise distance variance is structurally expected for ID samples under these assumptions.
- The method is motivated by empirical observation of distance distributions.
- OOD detection is critical for safety-critical deep neural network applications.
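The score described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function name, the shared-precision-matrix assumption, and the toy class means are all hypothetical choices made here for clarity.

```python
import numpy as np

def mahalanobis_variance_score(z, class_means, precision):
    """Hypothetical sketch of the variance-based score.

    z           : feature vector, shape (d,)
    class_means : per-class mean features, shape (K, d)
    precision   : shared inverse covariance, shape (d, d)
                  (a common simplification; the paper may estimate it differently)
    Returns the variance of the K squared Mahalanobis distances.
    A high value suggests ID (one class is sharply closer); a low value suggests OOD.
    """
    diffs = class_means - z  # (K, d): offset of z from each class mean
    # Squared Mahalanobis distance to each class: diff^T @ precision @ diff
    d2 = np.einsum("kd,de,ke->k", diffs, precision, diffs)
    return float(np.var(d2))

# Toy example with two well-separated class means and identity precision.
means = np.array([[0.0, 0.0], [10.0, 0.0]])
prec = np.eye(2)

id_score = mahalanobis_variance_score(np.array([0.1, 0.0]), means, prec)
ood_score = mahalanobis_variance_score(np.array([5.0, 50.0]), means, prec)
```

In the toy setup, the ID-like point near one class mean has one tiny distance and one large distance (high variance), while the OOD-like point is roughly equidistant from both means (near-zero variance), matching the paper's observed structure.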
Entities
Institutions
- arXiv