Universal Complexity Bounds for Universal Gradient Methods in Nonlinear Optimization
Yurii Nesterov
Published: 2025/9/25
Abstract
In this paper, we provide universal first-order methods for Composite Optimization together with a new complexity analysis. This analysis delivers universal convergence guarantees that are not linked directly to any parametric problem class. However, these guarantees can easily be transformed into rates of convergence for particular problem classes by substituting the corresponding upper estimates of the Global Curvature Bound of the objective function. In this way, we analyze the simple gradient method for nonconvex minimization, gradient methods for convex composite optimization, and their accelerated variant. For these methods, the only input parameter is the required accuracy of the approximate solution. The accelerated variant of our scheme automatically ensures the best possible rate of convergence simultaneously for all parametric problem classes containing the smooth part of the objective function.
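To give a concrete feel for the kind of method the abstract describes, the following is a minimal illustrative sketch of a universal gradient step with backtracking, in the spirit of Nesterov's earlier universal gradient methods. It is not the exact scheme of this paper: the function names, the stopping rule, and the test problem are assumptions made only for illustration. The point it demonstrates is that the target accuracy eps is the only accuracy-related input; the curvature estimate L is adjusted automatically, so no Lipschitz or Holder constants need to be known in advance.

```python
import numpy as np

def universal_gradient_method(f, grad_f, x0, eps, L0=1.0, max_iter=1000):
    """Illustrative universal gradient method with backtracking on the
    curvature estimate L (a sketch, not the algorithm of the paper)."""
    x, L = x0, L0
    for _ in range(max_iter):
        fx, gx = f(x), grad_f(x)
        while True:
            x_new = x - gx / L  # gradient step with step size 1/L
            step = x_new - x
            # Acceptance test with eps/2 slack: the ingredient that makes the
            # step valid for any Holder-continuous gradient, without knowing
            # the corresponding smoothness constants.
            if f(x_new) <= fx + gx @ step + 0.5 * L * (step @ step) + 0.5 * eps:
                break
            L *= 2.0            # curvature estimate too small: increase it
        x, L = x_new, 0.5 * L   # allow L to decrease again at the next iteration
        if np.linalg.norm(gx) <= eps:  # simple illustrative stopping rule
            return x
    return x

# Example usage on a smooth convex quadratic f(x) = 0.5 * ||A x - b||^2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
grad_f = lambda x: A.T @ (A @ x - b)
print(universal_gradient_method(f, grad_f, np.zeros(2), eps=1e-6))
```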