Entropy and Learning of Lipschitz Functions under Log-Concave Measures
Pierre Bizeul, Boaz Klartag
Published: September 12, 2025
Abstract
We study regression of $1$-Lipschitz functions under a log-concave measure $\mu$ on $\mathbb{R}^d$. We focus on the high-dimensional regime where the sample size $n$ is subexponential in $d$, in which distribution-free estimators are ineffective. We analyze two polynomial-based procedures: the projection estimator, which relies on knowledge of an orthogonal polynomial basis of $\mu$, and the least-squares estimator over low-degree polynomials, which requires no knowledge of $\mu$ whatsoever. Their risk is governed by the rate of polynomial approximation of Lipschitz functions in $L^2(\mu)$. When this rate matches the Gaussian one, we show that both estimators achieve minimax bounds over a wide range of parameters. A key ingredient is sharp entropy estimates for the class of $1$-Lipschitz functions in $L^2(\mu)$, which are new even in the Gaussian setting.
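The least-squares estimator over low-degree polynomials mentioned above can be illustrated with a minimal sketch: fit a noisy 1-Lipschitz target from samples by ordinary least squares over all monomials up to a fixed total degree. This is a hypothetical illustration, not the paper's implementation; the function names (`poly_features`, `least_squares_poly`), the Gaussian sampling distribution, the target function, and the chosen degree are all assumptions for the toy example.

```python
import numpy as np
from itertools import combinations_with_replacement

def poly_features(X, degree):
    # Build all multivariate monomials of total degree <= degree
    # (including the constant term) evaluated at the rows of X.
    n, d = X.shape
    feats = [np.ones(n)]
    for k in range(1, degree + 1):
        for idx in combinations_with_replacement(range(d), k):
            feats.append(np.prod(X[:, list(idx)], axis=1))
    return np.column_stack(feats)

def least_squares_poly(X, y, degree):
    # Empirical least-squares fit over polynomials of total degree <= degree;
    # requires no knowledge of the sampling measure.
    Phi = poly_features(X, degree)
    coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return lambda Xnew: poly_features(Xnew, degree) @ coef

# Toy usage: the standard Gaussian is a log-concave measure, and
# x -> min(|x_1|, 1) is a 1-Lipschitz target (both are assumptions
# of this example, not choices from the paper).
rng = np.random.default_rng(0)
d, n = 3, 500
X = rng.standard_normal((n, d))                   # samples from the Gaussian measure
f = lambda Z: np.minimum(np.abs(Z[:, 0]), 1.0)    # a 1-Lipschitz target
y = f(X) + 0.1 * rng.standard_normal(n)           # noisy observations
fhat = least_squares_poly(X, y, degree=3)
```

The degree plays the role of the tuning parameter: the paper's risk bounds are governed by how fast such low-degree polynomial approximations converge to Lipschitz functions in $L^2(\mu)$.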