A Globalized Semismooth Newton Method for Prox-regular Optimization Problems
Yuqia Wu, Pengcheng Wu, Yaohua Hu, Shaohua Pan, Xiaoqi Yang
Published: 2025/9/6
Abstract
We are concerned with a class of nonconvex and nonsmooth composite optimization problems whose objective is the sum of a twice differentiable function and a prox-regular function. We establish a sufficient condition for the proximal mapping of a prox-regular function to be single-valued and locally Lipschitz continuous. By virtue of this property, we propose a hybrid of proximal gradient and semismooth Newton methods for solving these composite optimization problems, which constitutes a globalized semismooth Newton method. The whole sequence is shown to converge to an $L$-stationary point under a Kurdyka-{\L}ojasiewicz exponent assumption. Under an additional error bound condition and some other mild conditions, we prove that the sequence converges to a nonisolated $L$-stationary point at a superlinear rate. Numerical comparisons with several existing second-order methods reveal that our approach performs comparably well in solving both the $\ell_q\,(0<q<1)$ quasi-norm regularized problems and the fused zero-norm regularized problems.
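To make the algorithmic idea concrete, the following is a minimal, hypothetical Python sketch of a hybrid proximal gradient / semismooth Newton loop of the kind the abstract describes. It is not the paper's algorithm: the toy problem uses an $\ell_1$ regularizer (chosen only because its proximal mapping and generalized Jacobian have simple closed forms) in place of a prox-regular nonconvex term such as the $\ell_q$ quasi-norm, and the switching and acceptance rules below are generic placeholders rather than the conditions analyzed in the paper.

```python
import numpy as np

# Toy smooth part f(x) = 0.5*||Ax - b||^2 and nonsmooth part g(x) = lam*||x||_1.
# The l1 norm is only a convenient stand-in with a closed-form prox; the paper
# targets general prox-regular (possibly nonconvex) g such as the l_q quasi-norm.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.sign(rng.standard_normal(100)) * (rng.random(100) < 0.05)
b = A @ x_true + 0.01 * rng.standard_normal(40)
lam = 0.1

f      = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_f = lambda x: A.T @ (A @ x - b)
hess_f = lambda x: A.T @ A                       # constant Hessian for this toy f
g      = lambda x: lam * np.sum(np.abs(x))
prox_g = lambda u, t: np.sign(u) * np.maximum(np.abs(u) - t * lam, 0.0)

def hybrid_pg_ssn(x, gamma, tol=1e-8, max_iter=500):
    """Generic hybrid loop: try a semismooth Newton step on the natural residual
    R(x) = x - prox_{gamma g}(x - gamma*grad_f(x)); if the step does not give a
    sufficient decrease of F = f + g, fall back to a proximal gradient step."""
    F = lambda z: f(z) + g(z)
    n = x.size
    for _ in range(max_iter):
        u = x - gamma * grad_f(x)
        p = prox_g(u, gamma)
        R = x - p                                # natural (prox-gradient) residual
        if np.linalg.norm(R) <= tol:
            break
        # An element of the generalized Jacobian of prox_g for l1: a 0/1 diagonal.
        D = (np.abs(u) > gamma * lam).astype(float)
        # Corresponding generalized Jacobian of R: I - diag(D) (I - gamma*hess_f).
        J = np.eye(n) - D[:, None] * (np.eye(n) - gamma * hess_f(x))
        try:
            d = np.linalg.solve(J + 1e-8 * np.eye(n), -R)   # semismooth Newton step
        except np.linalg.LinAlgError:
            d = None
        if d is not None and F(x + d) <= F(x) - 1e-4 * np.linalg.norm(d) ** 2:
            x = x + d                            # accept the Newton step
        else:
            x = p                                # otherwise take a prox-gradient step
    return x

# Step size below the inverse Lipschitz constant of grad_f for the toy problem.
gamma = 0.9 / np.linalg.norm(A, 2) ** 2
x_star = hybrid_pg_ssn(np.zeros(100), gamma)
```

The design point the sketch illustrates is the globalization strategy: the proximal gradient iteration guarantees progress everywhere, while the semismooth Newton step on the residual $R(x)=x-\mathrm{prox}_{\gamma g}(x-\gamma\nabla f(x))$ is taken whenever it is well defined and decreases the objective, which is what can yield fast local convergence near a stationary point.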