Exact Stiefel Optimization Improves Probabilistic PLS for Two-View Learning
A recent preprint on arXiv presents a comprehensive framework for probabilistic partial least squares (PPLS) that addresses persistent difficulties in the model-fitting procedure. The work builds on the identifiable parameterization established by Bouhaddani et al. (2018) and the fixed-noise scalar-likelihood approach of Hu et al. (2025). The authors replace full-spectrum noise averaging with noise-subspace estimation and swap interior-point penalty handling for exact Stiefel-manifold optimization. The noise-subspace estimator attains a leading finite-sample rate that is independent of signal strength and matches a minimax bound. The framework integrates noise pre-estimation, constrained likelihood optimization, and prediction calibration, yielding closed-form updates, error bounds, and calibrated uncertainty for two-view learning tasks.
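The noise-subspace idea can be illustrated with a minimal sketch (not the authors' code): for one view, the isotropic noise variance is estimated from the trailing eigenvalues of the sample covariance, i.e. the part of the spectrum orthogonal to the assumed latent signal, rather than by averaging over the full spectrum. The function name `estimate_noise_variance` and the assumption that the latent dimension `r` is known are illustrative choices, not details taken from the preprint.

```python
import numpy as np

def estimate_noise_variance(X, r):
    """Average the p - r smallest eigenvalues of the sample covariance of X (n x p)."""
    n, p = X.shape
    Xc = X - X.mean(axis=0, keepdims=True)        # center the view
    S = Xc.T @ Xc / n                             # sample covariance (p x p)
    eigvals = np.linalg.eigvalsh(S)               # eigenvalues in ascending order
    return eigvals[: p - r].mean()                # average over the noise subspace only

# Usage on synthetic data with a known noise level
rng = np.random.default_rng(0)
Z = rng.normal(size=(500, 2))                     # latent scores, r = 2
W = np.linalg.qr(rng.normal(size=(10, 2)))[0]     # orthonormal loadings, p = 10
X = Z @ W.T + 0.5 * rng.normal(size=(500, 10))    # view with isotropic noise, variance 0.25
print(estimate_noise_variance(X, r=2))            # close to 0.25
```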
Key facts
- arXiv:2605.11607v1, cross-listed
- Probabilistic partial least squares (PPLS) is a likelihood-based model for two-view learning
- Builds on identifiable parameterization of Bouhaddani et al. (2018)
- Builds on fixed-noise scalar-likelihood line of Hu et al. (2025)
- Replaces full-spectrum noise averaging with noise-subspace estimation
- Replaces interior-point penalty handling with exact Stiefel-manifold optimization (see the sketch after this list)
- Noise-subspace estimator attains signal-strength-independent leading finite-sample rate
- Matches a minimax bound
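Below is a minimal, hedged sketch of what exact Stiefel-manifold handling of an orthonormal loading constraint might look like: one gradient step in which the Euclidean gradient is projected onto the tangent space of St(p, r) = {W : W^T W = I} and then retracted back onto the manifold with a QR factorization, so the constraint holds exactly at every iterate instead of being enforced through an interior-point penalty. The step size and the `stiefel_step` helper are assumptions for illustration, not the paper's update rule.

```python
import numpy as np

def stiefel_step(W, euclid_grad, step=1e-2):
    """One ascent step on the Stiefel manifold: project the gradient, then retract."""
    WtG = W.T @ euclid_grad
    riem_grad = euclid_grad - W @ (WtG + WtG.T) / 2   # tangent-space projection
    Q, R = np.linalg.qr(W + step * riem_grad)         # QR retraction onto the manifold
    return Q * np.sign(np.diag(R))                    # sign fix keeps the retraction unique

# The orthonormality constraint W^T W = I holds exactly after the step
rng = np.random.default_rng(1)
W = np.linalg.qr(rng.normal(size=(10, 3)))[0]         # a point on St(10, 3)
G = rng.normal(size=(10, 3))                          # stand-in for a likelihood gradient
W_new = stiefel_step(W, G)
print(np.allclose(W_new.T @ W_new, np.eye(3)))        # True
```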
Entities
Institutions
- arXiv