ARTFEED — Contemporary Art Intelligence

VAE-Inf: A generative framework for imbalanced classification

other · 2026-04-30

A new two-stage framework called VAE-Inf integrates variational autoencoders with hypothesis testing to address imbalanced classification when minority samples are extremely scarce. In the first stage, a VAE is trained exclusively on majority-class data to model the reference distribution, and the latent posteriors are aggregated via a Wasserstein barycenter into a single global Gaussian reference. The second stage casts minority detection as statistical inference against this reference, enabling reliable error control and stable decision boundaries. The approach bridges generative modeling and discriminative classification, offering a statistically interpretable solution for extreme class imbalance.
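The first-stage aggregation can be sketched concretely. The article does not give the paper's exact formulation, so the following is a minimal illustration under one common assumption: the per-sample latent posteriors are diagonal Gaussians N(mu_i, diag(sigma_i^2)), in which case the 2-Wasserstein barycenter has a closed form (mean of the means, elementwise mean of the standard deviations), since diagonal covariances commute. The function name and inputs are hypothetical.

```python
def gaussian_w2_barycenter(mus, sigmas):
    """Closed-form 2-Wasserstein barycenter of diagonal Gaussians.

    For commuting (here: diagonal) covariances, the barycenter of
    N(mu_i, diag(sigma_i^2)) is N(mean of mu_i, diag((mean of sigma_i)^2)).
    mus, sigmas: lists of equal-length lists (per-sample posterior params
    from the VAE encoder on majority-class data).
    """
    n, d = len(mus), len(mus[0])
    mu_bar = [sum(m[j] for m in mus) / n for j in range(d)]
    sigma_bar = [sum(s[j] for s in sigmas) / n for j in range(d)]
    return mu_bar, sigma_bar

# Toy example: three 2-D posteriors from majority-class encodings.
mus = [[0.0, 1.0], [2.0, 1.0], [1.0, 1.0]]
sigmas = [[1.0, 0.5], [1.0, 0.5], [1.0, 0.5]]
mu_bar, sigma_bar = gaussian_w2_barycenter(mus, sigmas)
# mu_bar == [1.0, 1.0], sigma_bar == [1.0, 0.5]
```

For non-diagonal covariances there is no such closed form and the barycenter covariance must be found by a fixed-point iteration; the diagonal case above is only the simplest instance.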

Key facts

  • VAE-Inf is a two-stage framework for imbalanced classification
  • First stage trains VAE on majority-class data only
  • Latent posteriors aggregated via Wasserstein barycenter
  • Global Gaussian reference model constructed
  • Second stage transforms representation for hypothesis testing
  • Aims to provide reliable error control
  • Bridges generative modeling and discriminative classification
  • Published on arXiv with ID 2604.25334
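The second-stage test described above can be sketched as well. The article does not specify the paper's test statistic, so this is one natural instance, not the authors' method: under a diagonal Gaussian reference in d = 2 latent dimensions, the squared Mahalanobis distance of a test encoding is chi-square with 2 degrees of freedom under the null, whose survival function is exp(-x/2), so thresholding the p-value at alpha controls the type-I (false-minority) error. All names here are illustrative.

```python
import math

def minority_p_value(z, mu_bar, sigma_bar):
    """P-value for H0: latent code z was drawn from the global reference.

    Assumes a diagonal reference N(mu_bar, diag(sigma_bar^2)) in d = 2
    latent dimensions: the squared Mahalanobis distance is chi^2 with
    2 degrees of freedom under H0, whose survival function is exp(-x/2).
    """
    d2 = sum(((zj - m) / s) ** 2 for zj, m, s in zip(z, mu_bar, sigma_bar))
    return math.exp(-d2 / 2.0)

# Global reference from stage 1, and a significance level.
mu_bar, sigma_bar = [0.0, 0.0], [1.0, 1.0]
alpha = 0.05

p = minority_p_value([3.0, 3.0], mu_bar, sigma_bar)
is_minority = p < alpha  # reject H0 -> flag as a minority-class candidate
```

A point near the reference mean yields p close to 1 and is kept as majority; rejecting only when p < alpha is what gives the scheme its stated error-control guarantee under the reference model.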

Entities

Institutions

  • arXiv
