A Riemannian conjugate subgradient method for nonconvex and nonsmooth optimization on manifolds
Chunming Tang, Shaohui Liang, Huangyue Chen
Published: September 7, 2025
Abstract
Conjugate gradient (CG) methods are widely acknowledged as efficient for minimizing continuously differentiable functions in Euclidean spaces. In recent years, various CG methods have been extended to Riemannian manifold optimization, but existing Riemannian CG methods are confined to smooth objective functions and cannot handle nonsmooth ones. This paper proposes a Riemannian conjugate subgradient method for a class of nonconvex, nonsmooth optimization problems on manifolds. Specifically, we first select a Riemannian subgradient from the convex hull of two directionally active subgradients. The search direction is then defined as a convex combination of the negative of this subgradient and the previous search direction transported to the current tangent space. Additionally, a Riemannian line search with an interval reduction procedure is integrated to generate an appropriate step size, ensuring the objective function values form a monotonically nonincreasing sequence. We establish the global convergence of the algorithm under mild assumptions. Numerical experiments on three classes of Riemannian optimization problems show that the proposed method takes significantly less computational time than related existing methods. To our knowledge, this is the first CG-type method developed for Riemannian nonsmooth optimization.
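The update structure described in the abstract can be sketched numerically. The following is a minimal illustration on the unit sphere, not the paper's algorithm: it uses a fixed convex-combination weight `beta` in place of the paper's selection from the convex hull of two directionally active subgradients, a simple backtracking loop in place of the interval-reduction line search, and orthogonal projection as the vector transport. The test objective `f(x) = ||x||_1` on the sphere and all parameter values are illustrative assumptions.

```python
import numpy as np

def sphere_proj(x, v):
    # Project v onto the tangent space of the unit sphere at x.
    return v - np.dot(x, v) * x

def retract(x, v):
    # Retraction on the sphere: step in the tangent space, then renormalize.
    y = x + v
    return y / np.linalg.norm(y)

def f(x):
    # Illustrative nonsmooth objective: the l1 norm, restricted to the sphere.
    return np.linalg.norm(x, 1)

def euclidean_subgrad(x):
    # A Euclidean subgradient of ||x||_1 (sign vector; 0 maps to 0).
    return np.sign(x)

def conjugate_subgradient_sketch(x0, beta=0.3, iters=200):
    """Simplified Riemannian conjugate-subgradient iteration (illustrative)."""
    x = x0 / np.linalg.norm(x0)
    d_prev = np.zeros_like(x)
    for _ in range(iters):
        # Riemannian subgradient: project the Euclidean subgradient onto T_x M.
        g = sphere_proj(x, euclidean_subgrad(x))
        # Search direction: convex combination of the negated subgradient
        # and the previous direction transported (here: projected) to T_x M.
        d = -(1.0 - beta) * g + beta * sphere_proj(x, d_prev)
        # Backtracking stand-in for the paper's interval-reduction line search:
        # shrink the step until the objective does not increase.
        t, fx = 1.0, f(x)
        while t > 1e-10 and f(retract(x, t * d)) > fx:
            t *= 0.5
        x_new = retract(x, t * d)
        if f(x_new) <= fx:  # keep the sequence of values nonincreasing
            x, d_prev = x_new, d
    return x
```

Starting from an asymmetric point, the iterates drive `f` monotonically downward toward a sparse point of the sphere (a signed coordinate vector, where the l1 norm attains its minimum value of 1 on the sphere).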