A Proximal Descent Method for Minimizing Weakly Convex Functions
Feng-Yi Liao, Yang Zheng
Published: 2025/9/2
Abstract
We study the problem of minimizing an $m$-weakly convex and possibly nonsmooth function. Weak convexity provides a broad framework that subsumes convex, smooth, and many composite nonconvex functions. In this work, we propose a \emph{proximal descent method}, a simple and efficient first-order algorithm that combines the inexact proximal point method with classical convex bundle techniques. Our analysis establishes explicit non-asymptotic convergence rates in terms of $(\eta,\epsilon)$-inexact stationarity. In particular, the method finds an $(\eta,\epsilon)$-inexact stationary point using at most $\mathcal{O}\!\left( \Big(\tfrac{1}{\eta^2} + \tfrac{1}{\epsilon}\Big) \max\!\left\{\tfrac{1}{\eta^2}, \tfrac{1}{\epsilon}\right\} \right)$ function value and subgradient evaluations. Consequently, the algorithm also achieves the best-known complexity of $\mathcal{O}(1/\delta^4)$ for finding an approximate Moreau stationary point satisfying $\|\nabla f_{2m}(x)\|\leq \delta$, where $f_{2m}$ denotes the Moreau envelope. A distinctive feature of our method is its \emph{automatic adaptivity}: with no parameter tuning or algorithmic modification, it accelerates to $\mathcal{O}(1/\delta^2)$ complexity under smoothness and further achieves linear convergence under quadratic growth. Overall, this work bridges convex bundle methods and weakly convex optimization while providing accelerated guarantees under structural assumptions.
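For concreteness, the Moreau-envelope stationarity measure referenced above can be written as follows; the scaling of the proximal parameter as $2m$ shown here is one standard convention for $m$-weakly convex functions and reflects our reading of the notation $f_{2m}$, not a definition taken from the paper:
\[
  f_{2m}(x) \;=\; \min_{y}\big\{\, f(y) + m\,\|y-x\|^2 \,\big\},
  \qquad
  \hat{x}(x) \;=\; \operatorname*{argmin}_{y}\big\{\, f(y) + m\,\|y-x\|^2 \,\big\}.
\]
Under this convention, $m$-weak convexity makes the inner problem $m$-strongly convex, so $\hat{x}(x)$ is unique and $f_{2m}$ is differentiable with $\nabla f_{2m}(x) = 2m\big(x - \hat{x}(x)\big)$. Consequently, $\|\nabla f_{2m}(x)\| \leq \delta$ certifies that $x$ lies within $\delta/(2m)$ of a point $\hat{x}(x)$ at which $\partial f$ contains an element of norm at most $\delta$, which is the usual notion of near-stationarity in weakly convex optimization.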