A Gradient Flow Approach to Solving Inverse Problems with Latent Diffusion Models

Tim Y. J. Wang, O. Deniz Akyildiz

Published: 2025/9/23

Abstract

Solving ill-posed inverse problems requires powerful and flexible priors. We propose leveraging pretrained latent diffusion models for this task through a new training-free approach, termed Diffusion-regularized Wasserstein Gradient Flow (DWGF). Specifically, we formulate the posterior sampling problem as a regularized Wasserstein gradient flow of the Kullback-Leibler divergence in the latent space. We demonstrate the performance of our method on standard benchmarks using Stable Diffusion (Rombach et al., 2022) as the prior.
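For background only (this is the standard unregularized setting, not the paper's specific DWGF formulation), the Wasserstein-2 gradient flow of KL(q || pi), where pi denotes a latent posterior of the form pi(z) proportional to p(y | z) p(z) (notation introduced here for illustration), is the Fokker-Planck equation, and its particle realization is Langevin dynamics:

    \partial_t q_t = \nabla \cdot \Big( q_t \, \nabla \log \frac{q_t}{\pi} \Big),
    \qquad
    \mathrm{d}Z_t = \nabla \log \pi(Z_t)\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}B_t.

The paper's contribution, per the abstract, is a regularized variant of this flow in the latent space of a pretrained diffusion model, used as a training-free posterior sampler.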
