ARTFEED — Contemporary Art Intelligence

Epistemic Nearest Neighbors: A Scalable Alternative to Gaussian Processes in Bayesian Optimization

other · 2026-05-07

A new method, Epistemic Nearest Neighbors (ENN), is proposed as a lightweight alternative to Gaussian processes (GPs) for Bayesian optimization (BO). Traditional BO relies on GP surrogates, which scale poorly with large datasets: hyperparameter fitting costs O(N^3), or roughly O(N^2) in modern implementations, and is repeated at every BO iteration. ENN instead estimates function values and uncertainty (both epistemic and aleatoric) from the K nearest neighboring observations, achieving O(N) scaling for both fitting and acquisition. This removes the GP-fitting bottleneck in regimes where function evaluations are cheap and observations are plentiful. The method is introduced in a paper on arXiv (2506.12818).
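A minimal sketch of how a nearest-neighbor surrogate of this kind could work. This is illustrative only, not the paper's exact estimator: the function names, the distance-based form of the epistemic term, and the UCB acquisition are all assumptions made for the example.

```python
import numpy as np

def enn_predict(X_obs, y_obs, x_query, k=5):
    """KNN surrogate in the spirit of ENN (illustrative, not the paper's
    exact estimator): mean from the k nearest observations, aleatoric
    noise from their spread, epistemic uncertainty from their distance."""
    d = np.linalg.norm(X_obs - x_query, axis=1)
    idx = np.argpartition(d, k)[:k]      # k nearest neighbors in O(N)
    mu = y_obs[idx].mean()               # point estimate of f(x_query)
    aleatoric = y_obs[idx].var()         # spread among neighbors (noise)
    epistemic = d[idx].mean() ** 2       # grows away from data (assumed form)
    return mu, epistemic, aleatoric

def ucb_acquire(X_obs, y_obs, X_cand, k=5, beta=2.0):
    """Pick the candidate maximizing an upper confidence bound built from
    the surrogate: one O(N) prediction per candidate, and no O(N^3)
    hyperparameter fit at any iteration."""
    scores = [mu + beta * np.sqrt(epi)
              for mu, epi, _ in (enn_predict(X_obs, y_obs, x, k)
                                 for x in X_cand)]
    return X_cand[int(np.argmax(scores))]
```

Because there are no kernel hyperparameters to refit, each BO iteration only appends the new observation and re-runs the O(N) neighbor search, which is the scaling claim the summary highlights.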

Key facts

  • Bayesian optimization traditionally solves black-box problems with expensive function evaluations.
  • Recent work applies BO to problems with cheaper evaluations and larger numbers of observations.
  • GP hyperparameter fitting scales as O(N^3) or roughly O(N^2) in modern implementations.
  • GP fitting is repeated at every BO iteration, making it the bottleneck.
  • Epistemic Nearest Neighbors (ENN) is a lightweight alternative to GPs.
  • ENN estimates function values and uncertainty from K-nearest-neighbor observations.
  • ENN scales as O(N) for both fitting and acquisition.
  • The paper is available on arXiv with identifier 2506.12818.

Entities

Institutions

  • arXiv

Sources

  • arXiv:2506.12818