ARTFEED — Contemporary Art Intelligence

Anon Optimizer Bridges SGD and Adam with Tunable Adaptivity

ai-technology · 2026-05-06

Researchers propose Anon, a novel optimizer with continuously tunable adaptivity that interpolates between SGD-like and Adam-like behaviors and extrapolates beyond both. The key innovation is the incremental delay update (IDU), a mechanism that is more flexible than AMSGrad's hard max-tracking and improves robustness to gradient noise. The method targets the generalization gap of adaptive optimizers such as Adam on classical architectures such as CNNs, and theoretical convergence is established across the entire adaptivity spectrum. The work is published on arXiv under identifier 2605.02317.
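The article does not reproduce Anon's update rule, but the idea of a continuously tunable adaptivity knob can be illustrated with a partially adaptive update in the style of prior work: raise the second-moment preconditioner to an exponent p, so p = 0 recovers SGD with momentum and p = 0.5 recovers an Adam-like step. The function name and the exact parameterization below are illustrative assumptions, not the paper's method.

```python
import numpy as np

def partially_adaptive_step(theta, grad, state, lr=1e-3,
                            beta1=0.9, beta2=0.999, p=0.5, eps=1e-8):
    """One step of an illustrative partially adaptive update.

    The adaptivity exponent p interpolates between regimes:
      p = 0.0 -> SGD with (bias-corrected) momentum,
      p = 0.5 -> Adam-like fully adaptive step,
      other p -> interpolation / extrapolation between the two.
    """
    m, v, t = state
    t += 1
    m = beta1 * m + (1 - beta1) * grad           # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2      # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                 # bias corrections
    v_hat = v / (1 - beta2 ** t)
    # The exponent p controls how strongly the step is preconditioned
    # by the gradient's second moment (p = 0 means not at all).
    theta = theta - lr * m_hat / (v_hat ** p + eps)
    return theta, (m, v, t)
```

With p = 0 the denominator collapses to 1 + eps, leaving a plain momentum step; sweeping p trades off SGD's generalization against Adam's per-coordinate scaling.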

Key facts

  • Anon allows continuous tuning of adaptivity over the real line, interpolating between SGD-like and Adam-like behaviors and extrapolating beyond both.
  • The incremental delay update (IDU) mechanism replaces AMSGrad's hard max-tracking and enhances robustness to gradient noise.
  • Targets the generalization gap: adaptive optimizers such as Adam tend to generalize worse than SGD on classical architectures such as CNNs.
  • Theoretical convergence is proven across the full adaptivity spectrum.
  • Published on arXiv:2605.02317.
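For context on what IDU replaces: AMSGrad stabilizes Adam by preconditioning with the running maximum of the second-moment estimate, so the effective step size can never grow back after a gradient spike. The sketch below shows only this AMSGrad baseline; the summary does not give IDU's actual rule, so none is attempted here.

```python
import numpy as np

def amsgrad_second_moment(v_hat, v, grad, beta2=0.999):
    """AMSGrad's hard max-tracking of the second-moment estimate.

    Because v_hat is a running maximum, a single large (possibly noisy)
    gradient permanently inflates the denominator of the update; the
    paper's IDU is described as a more flexible alternative to this.
    """
    v = beta2 * v + (1 - beta2) * grad ** 2  # usual second-moment EMA
    v_hat = np.maximum(v_hat, v)             # hard max: never decreases
    return v_hat, v
```

The monotone v_hat is exactly the rigidity IDU is said to relax: a one-off noise spike throttles all future steps under hard max-tracking.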

Entities

Institutions

  • arXiv

Sources