FastPart: Over-Parameterized Stochastic Gradient Descent for Sparse Optimisation on Measures
Yohann De Castro, Sébastien Gadat, Clément Marteau
Published: 2023/12/10
Abstract
This paper presents a novel algorithm that combines Stochastic Gradient Descent strategies with Random Features to improve the scalability of Conic Particle Gradient Descent (CPGD), specifically tailored to sparse optimisation problems on measures. By formulating the CPGD steps within a variational framework, we provide rigorous mathematical proofs of the following key findings: $\mathrm{(i)}$ the total variation norms of the solution measures along the descent trajectory remain bounded, ensuring stability and preventing undesirable divergence; $\mathrm{(ii)}$ we establish a global convergence guarantee with a convergence rate of $O(\log(K)/\sqrt{K})$ over $K$ iterations, demonstrating the efficiency and effectiveness of our algorithm; $\mathrm{(iii)}$ we establish local control over the first-order condition discrepancy, contributing to a deeper understanding of the algorithm's behaviour and reliability in practical applications.
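To make the scheme concrete, the following is a minimal, illustrative sketch of one possible over-parameterized conic particle loop driven by random-feature stochastic gradients. The toy problem instance, the feature map `feature`, and all parameters (`lam`, `alpha`, `beta`, the particle count, and the feature distribution) are assumptions chosen for illustration, not the paper's exact construction or guarantees.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sparse recovery instance: the target measure has two spikes.
d, m = 1, 50                          # position dimension, number of particles
true_pos = np.array([[0.3], [0.7]])   # ground-truth spike locations
true_w = np.array([1.0, 0.5])         # ground-truth spike masses

def feature(theta, omegas):
    """Random Fourier feature phi_omega(theta) = cos(theta . omega) (an assumption)."""
    return np.cos(theta @ omegas)

def sample_features(n):
    """Draw fresh feature frequencies; each batch gives an unbiased gradient estimate."""
    return rng.standard_normal((d, n)) * 5.0

def stochastic_potential(pos, w, lam, omegas):
    """Monte-Carlo estimate of the first-order potential J'(theta_j) and its gradient."""
    n = omegas.shape[1]
    phi = feature(pos, omegas)                      # (m, n) features at particles
    phi_true = feature(true_pos, omegas)            # (2, n) features at true spikes
    resid = w @ phi - true_w @ phi_true             # (n,) residual per sampled feature
    pot = phi @ resid / n + lam                     # (m,) potential incl. TV penalty
    # d/dtheta cos(theta . omega) = -sin(theta . omega) * omega
    dphi = -np.sin(pos @ omegas)[..., None] * omegas.T[None, ...]   # (m, n, d)
    grad = np.einsum('mnd,n->md', dphi, resid) / n  # (m, d) position gradient
    return pot, grad

# Over-parameterized initialization: many small particles spread over the domain.
pos = rng.uniform(0.0, 1.0, size=(m, d))
w = np.full(m, 1.0 / m)
lam, alpha, beta = 1e-3, 0.5, 0.05    # TV weight, mass step, position step (assumed)

for k in range(2000):
    omegas = sample_features(64)       # fresh random features = stochastic gradient
    pot, grad = stochastic_potential(pos, w, lam, omegas)
    w = w * np.exp(-alpha * pot)       # multiplicative (Fisher-Rao) mass update
    pos = pos - beta * grad            # gradient (Wasserstein) position update
```

The multiplicative weight step keeps the masses nonnegative, which is the conic/mirror part of the descent, while the position step moves particles along the potential's gradient; resampling the random features at every iteration is what makes the scheme stochastic in the sense sketched here.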