Monotone Lipschitz-Gradient Denoiser: Explainability of Operator Regularization Approaches Free From Lipschitz Constant Control

Masahiro Yukawa, Isao Yamada

Published: June 7, 2024

Abstract

This paper addresses the explainability of the operator-regularization approach when a monotone Lipschitz-gradient (MoL-Grad) denoiser is used -- an operator that can be expressed as the Lipschitz continuous gradient of a differentiable convex function. We prove that an operator is a MoL-Grad denoiser if and only if it is the ``single-valued'' proximity operator of a weakly convex function. An extension of Moreau's decomposition is also shown with respect to a weakly convex function and the conjugate of its convexified function. Building on these results, two specific algorithms, the forward-backward splitting algorithm and the primal-dual splitting algorithm, are considered, both employing MoL-Grad denoisers. Under suitable conditions, these algorithms generate sequences of vectors converging weakly to a minimizer of a certain cost function that involves an ``implicit regularizer'' induced by the denoiser. Unlike previous studies of operator regularization, our framework requires no control of the Lipschitz constant when learning the denoiser. The theoretical findings are supported by simulations.
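To make the role of the denoiser concrete, the following is an illustrative sketch (the notation is assumed here, not taken verbatim from the paper) of how a MoL-Grad denoiser $T$ can enter a forward-backward-type iteration in place of the usual proximity operator:

$$x_{k+1} = T\bigl(x_k - \gamma \nabla f(x_k)\bigr), \qquad k = 0, 1, 2, \ldots,$$

where $f$ denotes a smooth data-fidelity term and $\gamma > 0$ a step size. If, as the paper's characterization states, $T$ coincides with the single-valued proximity operator of some weakly convex function $\varphi$, then $\varphi$ plays the role of the implicit regularizer in the cost being minimized; the classical forward-backward iteration $x_{k+1} = \mathrm{prox}_{\gamma g}\bigl(x_k - \gamma \nabla f(x_k)\bigr)$ with a convex $g$ is the familiar special case.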