A preconditioned third-order implicit-explicit algorithm with a difference of varying convex functions and extrapolation

Kelin Wu, Hongpeng Sun

Published: 2025/9/11

Abstract

This paper proposes a novel preconditioned implicit-explicit algorithm enhanced with an extrapolation technique for non-convex optimization problems. The algorithm employs a third-order Adams-Bashforth scheme for the nonlinear, explicit parts and a third-order backward differentiation formula for the implicit part of the gradient flow of the variational functional. The proposed algorithm, akin to a generalized difference-of-convex (DC) approach, employs a varying pair of convex functions in each iteration. Under the Kurdyka-Łojasiewicz (KL) property, the global convergence of the algorithm is guaranteed with finitely many preconditioned iterations. Our numerical experiments, including least squares problems with SCAD regularization and the graphical Ginzburg-Landau model, demonstrate that the proposed algorithm is highly efficient compared with conventional DC algorithms.
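For reference, the two classical multistep building blocks named above can be written, for a generic flow $\dot{u} = F(u)$ with step size $\tau$, as follows. This is only a sketch of the standard schemes; the paper's actual preconditioned IMEX combination, splitting, and extrapolation weights may differ.

Third-order backward differentiation formula (BDF3), implicit in $u^{n+1}$:
\[
\frac{1}{\tau}\Big(\tfrac{11}{6}\,u^{n+1} - 3\,u^{n} + \tfrac{3}{2}\,u^{n-1} - \tfrac{1}{3}\,u^{n-2}\Big) = F(u^{n+1}).
\]

Third-order Adams-Bashforth scheme (AB3), fully explicit:
\[
u^{n+1} = u^{n} + \frac{\tau}{12}\big(23\,F(u^{n}) - 16\,F(u^{n-1}) + 5\,F(u^{n-2})\big).
\]

In an implicit-explicit (IMEX) splitting of the gradient flow, the implicit (convex) part of the objective is treated in the BDF3 fashion while the nonlinear, explicit part is handled in the AB3 fashion, as described in the abstract.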
