Hanke-Raus heuristic rule for iteratively regularized stochastic gradient descent

Harshit Bajpai, Gaurav Mittal, Ankik Kumar Giri

Published: December 3, 2024

Abstract

Over the past decade, stochastic algorithms have emerged as scalable and efficient tools for solving large-scale ill-posed inverse problems by randomly selecting subsets of equations at each iteration. However, due to the ill-posedness and measurement noise, these methods often suffer from oscillations and semi-convergence behavior, posing challenges in achieving stable and accurate reconstructions. This study proposes a novel variant of the stochastic gradient descent (SGD) approach, the iteratively regularized stochastic gradient descent (IRSGD) method, to address nonlinear ill-posed problems in Hilbert spaces. Under standard assumptions, we demonstrate that the mean square iteration error of the method tends to zero for exact data. In the presence of noisy data, we first propose a heuristic parameter choice rule (HPCR) and then apply the IRSGD method in combination with HPCR. Precisely, HPCR selects the regularization parameter without requiring any a priori information about the noise level. We show that the method terminates in finitely many steps in the case of noisy data and has regularizing features. Finally, some numerical experiments are performed to demonstrate the practical efficacy of the method.
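The abstract does not spell out the iteration or the heuristic rule, but the ingredients it names (SGD over randomly selected equations, an iteratively decaying regularization term, and a Hanke-Raus-type parameter choice that needs no noise level) can be illustrated on a toy linear ill-posed problem. The sketch below is an assumption-laden illustration, not the paper's method: the step size `eta`, the decay schedule `alpha_k`, and the specific functional `psi(k) = sqrt(k+1) * ||residual_k||` are hypothetical choices in the spirit of such rules.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear ill-posed problem A x = y with rapidly decaying singular values.
n = 50
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 0.9 ** np.arange(n)                       # decaying spectrum -> ill-conditioning
A = U @ np.diag(s) @ V.T
x_true = V[:, 0] + 0.5 * V[:, 1]
y_delta = A @ x_true + 1e-3 * rng.standard_normal(n)  # noise level unknown to the rule

# IRSGD-style iteration (sketch): random equation, stochastic gradient step,
# plus a decaying Tikhonov-type pull toward the initial guess x0.
x0 = np.zeros(n)
x = x0.copy()
eta = 0.5                                     # hypothetical step size
iters = 2000
residuals = np.empty(iters)
trajectory = []
for k in range(iters):
    i = rng.integers(n)                       # randomly selected equation
    g = (A[i] @ x - y_delta[i]) * A[i]        # stochastic gradient of equation i
    alpha_k = 1.0 / (k + 1) ** 1.1            # decaying regularization parameter
    x = x - eta * (g + alpha_k * (x - x0))
    residuals[k] = np.linalg.norm(A @ x - y_delta)
    trajectory.append(x.copy())

# Hanke-Raus-type heuristic stopping: minimize psi(k) over the iteration
# history, using only the computable residual (no a priori noise level).
psi = np.sqrt(np.arange(1, iters + 1)) * residuals
k_star = int(np.argmin(psi))
x_star = trajectory[k_star]
print(k_star, np.linalg.norm(x_star - x_true))
```

The heuristic trades the unknown noise level for a computable surrogate: early iterates keep `psi` large through the residual, very late iterates keep it large through the growing factor, and the minimizer sits near the semi-convergence turning point.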
