Adaptive Quasi-Newton and Anderson Acceleration with Global Convergence Rates

Event details

Date 08.05.2024
Hour 11:15 – 12:15
Speaker Damien Scieur
Category Conferences - Seminars

Despite the impressive numerical performance of quasi-Newton and Anderson/nonlinear acceleration methods, their global convergence rates have remained elusive for over 50 years. This study addresses this long-standing issue by introducing a framework that derives novel, adaptive quasi-Newton and nonlinear/Anderson acceleration schemes. Under mild assumptions, the proposed iterative methods exhibit explicit, non-asymptotic convergence rates that blend those of gradient descent and the cubic regularized Newton method. The proposed approach also includes an accelerated version for convex functions. Notably, these rates are achieved adaptively, without prior knowledge of the function's parameters. The framework presented in this study is generic, and its special cases include algorithms such as Newton's method with random subspaces, finite differences, or lazy Hessian updates. Numerical experiments demonstrate the efficiency of the proposed framework, even compared to the L-BFGS algorithm with Wolfe line search.
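For readers unfamiliar with Anderson acceleration, the following is a minimal sketch of the classical (non-adaptive) scheme for a fixed-point iteration x = g(x), not the speaker's proposed method: keep a window of recent residuals f_i = g(x_i) - x_i, find mixing weights that minimize the norm of the combined residual, and form the next iterate from the matching combination of g-evaluations. The function name, window size, and solver choice here are illustrative assumptions.

```python
import numpy as np

def anderson_acceleration(g, x0, m=5, max_iter=50, tol=1e-10):
    """Classical Anderson acceleration for the fixed-point problem x = g(x).

    A minimal sketch (illustrative, not the adaptive scheme from the talk):
    the mixing weights are obtained from an unconstrained least-squares
    problem on residual differences (the usual "Type II" formulation).
    """
    x = np.asarray(x0, dtype=float)
    X, G = [], []                       # recent iterates and their images g(x_i)
    for _ in range(max_iter):
        gx = g(x)
        X.append(x)
        G.append(gx)
        if len(X) > m:                  # keep only the last m pairs
            X.pop(0)
            G.pop(0)
        F = np.array(G) - np.array(X)   # residuals f_i = g(x_i) - x_i (rows)
        if np.linalg.norm(F[-1]) < tol:
            return gx
        if len(X) == 1:                 # not enough history yet: plain step
            x = gx
            continue
        # Solve min_gamma || f_k - dF @ gamma || over the residual differences.
        dF = (F[1:] - F[:-1]).T
        gamma, *_ = np.linalg.lstsq(dF, F[-1], rcond=None)
        dG = (np.array(G)[1:] - np.array(G)[:-1]).T
        x = G[-1] - dG @ gamma          # mixed iterate from past g-evaluations
    return x

# Usage: accelerate the fixed-point iteration x = cos(x),
# whose fixed point is approximately 0.7390851.
root = anderson_acceleration(np.cos, np.array([1.0]))
```

Compared to the plain iteration x_{k+1} = g(x_k), the least-squares mixing typically cuts the iteration count dramatically on smooth problems, which is the behavior the talk's framework equips with explicit global rates.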

Practical information

  • General public
  • Free

Contact

  • Nicolas Boumal, Séverine Eggli
