Variational Bayesian Framework for Joint Posterior-Predictive Inference
The paper proposes a variational Bayesian framework that directly targets the posterior-predictive distribution, jointly learning variational approximations of both the posterior and the predictive distribution. The approach introduces a variational upper bound on the Kullback-Leibler divergence, augmented with moment-based regularization terms. Training is amortized, which aims to reduce the computational demands of traditional two-stage procedures (first approximate the posterior, then propagate it through the forward model to obtain the predictive), especially for high-fidelity models such as those governed by partial differential equations.
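For reference, the posterior-predictive distribution the framework targets is the standard Bayesian quantity below; the joint objective that follows is only an illustrative sketch of how a KL bound with moment regularization might look, since the abstract does not spell out the exact form.

```latex
% Posterior-predictive distribution over a new output y^* at input x^*,
% given data \mathcal{D} (standard definition):
p(y^* \mid x^*, \mathcal{D})
  = \int p(y^* \mid x^*, \theta)\, p(\theta \mid \mathcal{D})\, \mathrm{d}\theta

% Illustrative joint objective (assumed form, not the paper's exact bound):
% q_\phi approximates the posterior, r_\psi the predictive; the last term
% penalizes mismatch between the first K moments m_k of r_\psi and Monte
% Carlo estimates \hat{m}_k of the implied predictive's moments.
\min_{\phi,\psi}\;
  \mathrm{KL}\!\left( q_\phi(\theta) \,\middle\|\, p(\theta \mid \mathcal{D}) \right)
  + \mathrm{KL}\!\left( \mathbb{E}_{q_\phi(\theta)}\!\left[ p(y^* \mid x^*, \theta) \right]
      \,\middle\|\, r_\psi(y^* \mid x^*) \right)
  + \lambda \sum_{k=1}^{K} \left\| m_k(r_\psi) - \hat{m}_k \right\|^2
```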
Key facts
- arXiv:2605.03710v1
- Announce Type: cross
- Proposes variational Bayesian framework for posterior-predictive distribution
- Jointly learns variational approximations of posterior and predictive distribution
- Introduces variational upper bound on Kullback-Leibler divergence
- Uses moment-based regularization terms
- Trained in an amortized manner (see the code sketch after this list)
- Aims to reduce computational demands for high-fidelity models
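Since the abstract gives only the outline of the method, the sketch below shows one plausible way to wire up amortized joint training of a posterior and a predictive network with a moment-matching penalty in PyTorch. The network sizes, the Gaussian variational families, the `simulate_y` forward model, and the weight `lam` are all illustrative assumptions, not the paper's specification.

```python
# Sketch of amortized joint posterior/predictive training with a
# moment-based regularizer. All architectural and loss details here are
# assumptions for illustration.
import torch
import torch.nn as nn


class AmortizedPosterior(nn.Module):
    """Maps a dataset summary to Gaussian parameters over theta (assumed family)."""

    def __init__(self, summary_dim, theta_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(summary_dim, 64), nn.ReLU(), nn.Linear(64, 2 * theta_dim)
        )

    def forward(self, summary):
        mu, log_sigma = self.net(summary).chunk(2, dim=-1)
        return mu, log_sigma.exp()


class AmortizedPredictive(nn.Module):
    """Maps (summary, x_star) to Gaussian parameters over y_star (assumed family)."""

    def __init__(self, summary_dim, x_dim, y_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(summary_dim + x_dim, 64), nn.ReLU(), nn.Linear(64, 2 * y_dim)
        )

    def forward(self, summary, x_star):
        mu, log_sigma = self.net(torch.cat([summary, x_star], -1)).chunk(2, -1)
        return mu, log_sigma.exp()


def training_step(post, pred, summary, x_star, simulate_y, lam=1.0, n_mc=32):
    """One joint step: fit the predictive to Monte Carlo samples pushed
    through the forward model, plus a moment-matching penalty."""
    mu_t, sig_t = post(summary)
    # Reparameterized posterior samples, so gradients reach `post`.
    theta = mu_t + sig_t * torch.randn(n_mc, mu_t.shape[-1])
    y_samples = simulate_y(theta, x_star)  # stand-in for p(y*|x*, theta)
    mu_p, sig_p = pred(summary, x_star)
    # Cross-entropy of implied-predictive samples under the predictive
    # network: a tractable surrogate for the forward KL term (up to a
    # constant in the predictive's parameters).
    ce = -torch.distributions.Normal(mu_p, sig_p).log_prob(y_samples).mean()
    # Moment-based regularization: match predictive mean and variance to
    # the Monte Carlo moments of the implied predictive.
    moment_loss = (
        (mu_p - y_samples.mean(0)) ** 2 + (sig_p**2 - y_samples.var(0)) ** 2
    ).mean()
    return ce + lam * moment_loss


# Toy usage with a cheap linear forward model standing in for a PDE solve.
post = AmortizedPosterior(summary_dim=8, theta_dim=3)
pred = AmortizedPredictive(summary_dim=8, x_dim=2, y_dim=1)
opt = torch.optim.Adam([*post.parameters(), *pred.parameters()], lr=1e-3)
W = torch.randn(3, 1)
summary, x_star = torch.randn(8), torch.randn(2)
loss = training_step(post, pred, summary, x_star, lambda t, x: t @ W + x.sum())
loss.backward()
opt.step()
```

The amortization is the point of the design: one trained pair of networks can serve new observations without re-running inference from scratch, which is the claimed saving over two-stage procedures when each forward-model evaluation is expensive.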