Quasi-Quadratic Gradient Accelerates BFGS Optimization Method
A recent publication on arXiv presents the Quasi-Quadratic Gradient (QQG), a novel search direction aimed at enhancing the BFGS method within the quasi-Newton optimization framework. QQG is defined as the product of the inverse Hessian approximation and the current gradient, exploiting local second-order curvature to refine the search trajectory. Both theoretical analysis and experiments indicate that QQG converges faster than standard BFGS while preserving computational efficiency. The work falls under the Optimization and Control category.
Key facts
- The paper introduces Quasi-Quadratic Gradient (QQG) for BFGS acceleration.
- QQG is defined as the product of the inverse Hessian approximation and the current gradient.
- QQG leverages local second-order curvature to rectify the search path.
- Theoretical analysis shows QQG outperforms vanilla BFGS in convergence speed.
- Empirical results confirm computational efficiency is maintained.
- The paper is submitted to arXiv under Optimization and Control.
- The arXiv ID is 2604.23922.
- The method is within the quasi-Newton framework.
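The facts above place QQG inside the standard quasi-Newton loop, where the search direction at each step is formed from the inverse Hessian approximation and the current gradient. The sketch below shows that baseline BFGS framework only; the backtracking line search and all parameter values are illustrative choices, not taken from the paper, and the QQG-specific refinement itself is not reproduced here.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    """Minimal BFGS sketch (illustrative, not the paper's implementation).

    The search direction is -H @ g: the (negated) product of the inverse
    Hessian approximation H and the current gradient g -- the quantity
    that QQG is described as refining.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                  # inverse Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                 # quasi-Newton search direction
        # Backtracking line search with an Armijo condition
        # (an assumption for this sketch, not from the paper).
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:             # curvature condition guard
            rho = 1.0 / sy
            I = np.eye(n)
            # Standard BFGS update of the inverse Hessian approximation.
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x
```

For example, minimizing the convex quadratic f(x) = ½xᵀAx − bᵀx with this loop recovers the solution of Ax = b, which is the kind of problem on which curvature-aware directions pay off.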