A Riemannian Accelerated Proximal Gradient Method
Shuailing Feng, Yuhang Jiang, Wen Huang, Shihui Ying
Published: 2025/9/26
Abstract
Riemannian accelerated gradient methods have been well studied for smooth optimization, typically treating the geodesically convex and geodesically strongly convex cases separately. However, their extension to nonsmooth problems on manifolds with provable acceleration remains underexplored. To address this gap, we propose a unified Riemannian accelerated proximal gradient method for problems of the form $F(x) = f(x) + h(x)$ on manifolds, where $f$ can be either geodesically convex or geodesically strongly convex, and $h$ is $\rho$-retraction-convex and possibly nonsmooth. We rigorously establish an accelerated convergence rate under reasonable conditions. Additionally, we introduce a safeguard mechanism that ensures global convergence in nonconvex settings. Numerical results validate the theoretical acceleration of the proposed method.
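For orientation only (this is generic retraction-based notation, not the update proposed in the paper), a standard Riemannian proximal gradient step solves a subproblem in the tangent space $T_{x_k}\mathcal{M}$ and maps the result back to the manifold with a retraction $R_{x_k}$; here $\operatorname{grad} f$ denotes the Riemannian gradient and $t_k > 0$ a step size, all assumed notation:
$$\eta_k = \operatorname*{arg\,min}_{\eta \in T_{x_k}\mathcal{M}} \; \langle \operatorname{grad} f(x_k), \eta \rangle + \frac{1}{2t_k}\,\|\eta\|_{x_k}^2 + h\big(R_{x_k}(\eta)\big), \qquad x_{k+1} = R_{x_k}(\eta_k).$$
An accelerated method of the kind the abstract describes would augment such a step with extrapolation (momentum); the paper's exact accelerated update and its safeguard mechanism are not reproduced here.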