Improved Rates for Stochastic Variance-Reduced Difference-of-Convex Algorithms
Anh Duc Nguyen, Alp Yurtsever, Suvrit Sra, Kim-Chuan Toh
Published: 2025/9/15
Abstract
In this work, we propose and analyze DCA-PAGE, a novel algorithm that integrates the difference-of-convex algorithm (DCA) with the ProbAbilistic Gradient Estimator (PAGE) to solve structured nonsmooth difference-of-convex programs. In the finite-sum setting with sample size $N$, our method achieves a gradient computation complexity of $O(N + N^{1/2}\varepsilon^{-2})$, surpassing the previous best-known complexity of $O(N + N^{2/3}\varepsilon^{-2})$ for stochastic variance-reduced (SVR) DCA methods. Furthermore, DCA-PAGE extends readily to the online setting, attaining an analogous optimal gradient computation complexity of $O(b + b^{1/2}\varepsilon^{-2})$ with batch size $b$, a significant advantage over existing SVR DCA approaches, which apply only to the finite-sum setting. We further refine our analysis via a gap function, which yields comparable convergence guarantees under milder assumptions.
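The abstract does not spell out the DCA-PAGE update rule, but the PAGE estimator it builds on is standard: with some probability recompute the full finite-sum gradient, otherwise correct the running estimate with a minibatch of gradient differences. The sketch below, under stated assumptions, illustrates that estimator on a toy smooth finite-sum component; in DCA-PAGE such an estimator would stand in for the exact gradient of the smooth part inside each convex subproblem. All names here (`page_step`, `eta`, the toy data) are hypothetical and not taken from the paper.

```python
import random

# Illustrative sketch only: this demonstrates the generic PAGE
# gradient estimator, not the paper's exact DCA-PAGE method.

def page_step(x_new, x_old, g_old, grads, p, batch):
    """PAGE estimator: with probability p, recompute the full gradient
    at x_new; otherwise, correct the previous estimate g_old using a
    minibatch of gradient differences (cheap but variance-reduced)."""
    n = len(grads)
    if random.random() < p:
        return sum(g(x_new) for g in grads) / n
    idx = random.sample(range(n), batch)
    corr = sum(grads[i](x_new) - grads[i](x_old) for i in idx) / batch
    return g_old + corr

# Toy finite sum: f(x) = (1/N) * sum_i 0.5 * (x - a_i)^2, whose
# component gradients are x - a_i; the minimizer is mean(data).
data = [1.0, 2.0, 3.0, 4.0, 5.0]
grads = [lambda x, a=a: x - a for a in data]

random.seed(0)
x = 0.0
g = sum(gr(x) for gr in grads) / len(grads)  # initialize with a full gradient
eta = 0.1                                    # hypothetical step size
for _ in range(200):
    x_new = x - eta * g                      # gradient step using the estimate
    g = page_step(x_new, x, g, grads, p=0.5, batch=2)
    x = x_new

x_final = x  # converges toward mean(data) = 3.0 on this toy problem
```

For quadratic components the minibatch correction here is exact, so the iterates match full-gradient descent; on general smooth components PAGE instead trades a small, controlled variance for far fewer per-step gradient evaluations, which is the source of the $N^{1/2}$ (resp. $b^{1/2}$) factors in the stated complexities.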