Convexity of Optimization Curves: Local Sharp Thresholds, Robustness Impossibility, and New Counterexamples

Le Duc Hieu

Published: 2025/9/10

Abstract

We study when the \emph{optimization curve} of first-order methods -- the sequence $\{f(x_n)\}_{n\ge 0}$ produced by constant-stepsize iterations -- is convex, equivalently when the forward differences $f(x_n)-f(x_{n+1})$ are nonincreasing. For gradient descent (GD) on convex $L$-smooth functions, the curve is convex for all stepsizes $\eta \le 1.75/L$, and this threshold is tight. Moreover, gradient norms are nonincreasing for all $\eta \le 2/L$, and in continuous time (gradient flow) the curve is always convex. These results complement and refine the classical smooth convex optimization toolbox, connecting discrete and continuous dynamics as well as worst-case analyses.
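A minimal numerical sketch (not from the paper) illustrating the stated property: run constant-stepsize GD on a convex $L$-smooth quadratic and check that the forward differences $f(x_n)-f(x_{n+1})$ are nonincreasing for $\eta \le 1.75/L$. The objective, dimension, seed, and iteration count are illustrative assumptions, not the paper's worst-case constructions.

```python
import numpy as np

# Illustrative convex L-smooth quadratic f(x) = 0.5 * x^T A x (assumed example).
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M.T @ M                          # symmetric PSD, so f is convex
L = np.linalg.eigvalsh(A).max()      # smoothness constant = largest eigenvalue

f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

eta = 1.75 / L                       # stepsize at the threshold from the abstract
x = rng.standard_normal(5)
values = [f(x)]
for _ in range(200):                 # constant-stepsize gradient descent
    x = x - eta * grad(x)
    values.append(f(x))

# Forward decreases d_n = f(x_n) - f(x_{n+1}); the curve is convex iff d_n is nonincreasing.
d = -np.diff(values)
print("optimization curve convex:", np.all(np.diff(d) <= 1e-12))
```

On this quadratic the check prints `True`; per the abstract, the threshold $1.75/L$ is tight, so larger stepsizes can break convexity of the curve even while $f(x_n)$ still decreases.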
