Transport alpha divergences
Wuchen Li
Published: 2025/4/18
Abstract
We derive a class of divergences measuring the difference between probability density functions on a one-dimensional sample space. This divergence is a one-parameter variation of the Itakura--Saito divergence between quantile density functions. We prove that the proposed divergence is a one-parameter variation of the transport Kullback--Leibler divergence and of the Hessian distance of the negative Boltzmann entropy with respect to the Wasserstein-2 metric. From Taylor expansions, we also formulate the $3$-symmetric tensor in Wasserstein space, which is given by an iterative Gamma three operator. The alpha-geodesic in Wasserstein space is also derived. From these properties, we name the proposed information measures transport alpha divergences. We provide several examples of transport alpha divergences for generative models in machine learning applications.
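For orientation, the base object referenced above is the classical Itakura--Saito divergence evaluated on quantile densities; a minimal sketch, under the assumption that $q_\mu(u) = \frac{d}{du} F_\mu^{-1}(u)$ denotes the quantile density function of a one-dimensional measure $\mu$ on $[0,1]$ (notation introduced here for illustration, not taken from the paper), reads
$$\mathrm{D}_{\mathrm{IS}}(\mu \,\|\, \nu) = \int_0^1 \Big( \frac{q_\mu(u)}{q_\nu(u)} - \log \frac{q_\mu(u)}{q_\nu(u)} - 1 \Big)\, du .$$
The transport alpha divergences of the paper form a one-parameter family around this quantity; the exact parameterization is given in the body of the paper.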