General Divergence Regularized Optimal Transport: Sample Complexity and Central Limit Theorems
Jiaping Yang, Yunxin Zhang
Published: 2025/10/2
Abstract
Optimal transport has emerged as a fundamental methodology with applications spanning numerous research areas in recent years. However, the convergence rate of the empirical estimator to its population counterpart suffers from the curse of dimensionality, which prevents its application in high-dimensional settings. While entropic regularization has been shown to effectively mitigate the curse of dimensionality and achieve a parametric convergence rate under mild conditions, these statistical guarantees have not been extended to general regularizers. Our work bridges this gap by establishing analogous results for a broader family of regularizers. Specifically, under boundedness constraints, we prove a convergence rate of order $n^{-1/2}$ with respect to the sample size $n$. Furthermore, we derive several central limit theorems for divergence regularized optimal transport.
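As a minimal illustration of the setting the abstract describes (not the paper's method), the sketch below computes entropic regularized OT between two empirical measures via Sinkhorn iterations; entropic regularization is the special case of divergence regularization whose $n^{-1/2}$ rate the abstract cites as the known baseline. The dimension, regularization strength `eps`, and sample sizes are illustrative assumptions.

```python
import numpy as np

def sinkhorn(x, y, eps=0.1, n_iter=500):
    """Entropic OT cost between two empirical measures with uniform weights.

    A standard Sinkhorn sketch under assumed defaults, not the paper's algorithm.
    """
    n, m = len(x), len(y)
    # Squared Euclidean cost matrix C[i, j] = ||x_i - y_j||^2
    C = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    K = np.exp(-C / eps)                      # Gibbs kernel
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iter):                   # alternating marginal projections
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]           # regularized optimal coupling
    return np.sum(P * C)                      # transport cost under P

rng = np.random.default_rng(0)
d = 5
# The abstract's result says the empirical estimator approaches its
# population counterpart at the parametric rate n^{-1/2} as n grows.
for n in (100, 400, 1600):
    x = rng.normal(size=(n, d))
    y = rng.normal(loc=0.5, size=(n, d))
    print(n, sinkhorn(x, y))
```

Running the loop with increasing $n$ shows the estimates stabilizing; the paper's contribution is that this parametric behavior extends, under boundedness constraints, beyond the entropic case to a broader family of divergence regularizers.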