An Efficient ADMM Method for Ratio-Type Nonconvex and Nonsmooth Minimization in Sparse Recovery
Lang Yu, Nanjing Huang
Published: 2025/9/26
Abstract
Sparse signal recovery via nonconvex and nonsmooth optimization has significant applications and demonstrates strong performance in signal processing and machine learning. This work studies a scale-invariant $\ell_{1/2}/\ell_{2}$ sparse minimization model with a nonconvex, nonseparable, ratio-type regularizer designed to enhance the accuracy and stability of sparse recovery. Within the framework of the null space property, we analyze conditions for exact and stable recovery for the constrained minimization problem. For the unconstrained regularized minimization problem, we develop an alternating direction method of multipliers (ADMM) based on a splitting strategy and rigorously establish its global convergence and linear convergence rate under reasonable assumptions. Numerical experiments show that the proposed method consistently outperforms existing approaches across diverse noise levels and measurement settings. Furthermore, experiments on neural network sparsification show that the method improves generalization and prediction accuracy.
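A hedged sketch of the models referred to above (the abstract does not state them explicitly, so the notation $A$, $b$, $\lambda$ and the exact objective are assumptions): with measurement matrix $A$, data $b$, regularization parameter $\lambda > 0$, and $\|x\|_{1/2} = \bigl(\sum_{i}|x_{i}|^{1/2}\bigr)^{2}$, the constrained and regularized $\ell_{1/2}/\ell_{2}$ problems would presumably read
\[
\min_{x \neq 0}\; \frac{\|x\|_{1/2}}{\|x\|_{2}} \quad \text{s.t. } Ax = b,
\qquad\qquad
\min_{x \neq 0}\; \frac{1}{2}\|Ax - b\|_{2}^{2} \;+\; \lambda\,\frac{\|x\|_{1/2}}{\|x\|_{2}}.
\]
Both ratios are unchanged under rescaling $x \mapsto c\,x$ with $c \neq 0$, which is the scale invariance mentioned above; an ADMM splitting would then introduce an auxiliary variable to separate the ratio term from the data-fidelity term.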