CaCuTe: Casual Cubic-Model Technique for Faster Optimization
Nazarii Tupitsa
Published: September 23, 2025
Abstract
We establish a local $\mathcal{O}(k^{-2})$ convergence rate for the gradient update $x^{k+1}=x^k-\nabla f(x^k)/\sqrt{H\|\nabla f(x^k)\|}$ under a $2H$-Hessian--Lipschitz assumption. Regime detection relies only on Hessian--vector products, so no Hessian is ever formed or factorized. Incorporating this certificate into cubic-regularized Newton (CRN) and an accelerated variant enables per-iterate switching between the cubic and gradient steps while preserving CRN's global guarantees; in our experiments, the technique achieves the lowest wall-clock time among the compared baselines. In the purely first-order setting, the technique yields a monotone, adaptive, parameter-free method that inherits the local $\mathcal{O}(k^{-2})$ rate and, despite relying on backtracking, still delivers superior wall-clock performance. We further cover smoothness relaxations beyond classical gradient Lipschitzness, enabling tighter bounds, including global $\mathcal{O}(k^{-2})$ rates. Finally, we generalize the technique to the stochastic setting.
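The gradient update stated in the abstract can be sketched directly. Below is a minimal, hedged illustration: the test function $f(x)=\tfrac{1}{4}\|x\|^4$, the choice of $H$, and the iteration budget are assumptions for demonstration only, not the paper's experimental setup.

```python
import numpy as np

def cubic_style_gradient_method(grad, x0, H, iters=100):
    """Iterate x^{k+1} = x^k - grad f(x^k) / sqrt(H * ||grad f(x^k)||).

    `H` plays the role of the Hessian-Lipschitz parameter from the
    2H-Hessian-Lipschitz assumption; `grad` returns the gradient of f.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad(x)
        gn = np.linalg.norm(g)
        if gn == 0.0:  # stationary point reached exactly
            break
        # Step size 1/sqrt(H * ||g||): it grows as the gradient shrinks,
        # which is what yields the fast local behavior.
        x = x - g / np.sqrt(H * gn)
    return x

# Hypothetical example: f(x) = ||x||^4 / 4, so grad f(x) = ||x||^2 * x,
# whose Hessian is Lipschitz on bounded sets.
grad_quartic = lambda x: np.dot(x, x) * x
x_final = cubic_style_gradient_method(grad_quartic, x0=[1.0, 1.0], H=6.0)
```

On this toy problem the iterates contract toward the minimizer at the origin; the constants above are illustrative rather than tuned.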