A Monte Carlo Approach to Nonsmooth Convex Optimization via Proximal Splitting Algorithms

Nicholas Di, Eric C. Chi, Samy Wu Fung

Published: 2025/9/9

Abstract

Operator splitting algorithms are a cornerstone of modern first-order optimization, relying critically on proximal operators as their fundamental building blocks. However, explicit formulas for proximal operators are available only for limited classes of functions, restricting the applicability of these methods. Recent work introduced HJ-Prox, a zeroth-order Monte Carlo approximation of the proximal operator derived from Hamilton-Jacobi PDEs, which circumvents the need for closed-form solutions. In this work, we extend the scope of HJ-Prox by establishing that it can be seamlessly incorporated into operator splitting schemes while preserving convergence guarantees. In particular, we show that replacing exact proximal steps with HJ-Prox approximations in algorithms such as proximal gradient descent, Douglas-Rachford splitting, Davis-Yin splitting, and the primal-dual hybrid gradient method still ensures convergence under mild conditions.
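To make the idea concrete, here is a minimal sketch of a zeroth-order Monte Carlo proximal approximation in the spirit of HJ-Prox, together with a proximal gradient loop in which it replaces the exact proximal step. The sampling scheme shown (Gaussian samples `y_i ~ N(x, delta*t*I)` with softmax weights proportional to `exp(-f(y_i)/delta)`), the parameter names, and the smoothing level `delta` are assumptions for illustration, not the paper's exact construction; the soft-thresholding reference is the standard closed-form prox of the l1 norm.

```python
import numpy as np

def hj_prox(f, x, t, delta=0.2, n_samples=50_000, seed=0):
    """Hypothetical zeroth-order Monte Carlo sketch of prox_{t f}(x).

    Assumed scheme (for illustration only): draw y_i ~ N(x, delta * t * I)
    and return the softmax-weighted average of the samples with weights
    proportional to exp(-f(y_i) / delta). Only evaluations of f are used,
    so no closed-form prox is required; `delta` trades approximation bias
    against sampling difficulty.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    # Samples centered at x with covariance delta * t * I.
    y = x + np.sqrt(delta * t) * rng.standard_normal((n_samples, x.size))
    # Log-weights -f(y_i)/delta, shifted for a numerically stable softmax.
    logw = -np.apply_along_axis(f, 1, y) / delta
    w = np.exp(logw - logw.max())
    # Softmax-weighted sample average approximates the proximal point.
    return (w[:, None] * y).sum(axis=0) / w.sum()

def soft_threshold(x, t):
    """Exact prox of t * ||.||_1, used only as a reference for comparison."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient(grad_f, g, x0, step, n_iter=50, prox=hj_prox):
    """Proximal gradient descent with the exact prox swapped for an
    approximate one, mirroring the substitution studied in the paper."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        x = prox(g, x - step * grad_f(x), step)
    return x
```

For instance, with `g(v) = ||v||_1`, `hj_prox(g, x, t)` should land near `soft_threshold(x, t)`; the same one-line substitution applies to the proximal steps of Douglas-Rachford, Davis-Yin, or PDHG iterations.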