Simple and Sharp Generalization Bounds via Lifting
Jingbo Liu
Published: 2025/8/26
Abstract
We develop an information-theoretic framework for bounding the expected supremum and tail probabilities of stochastic processes, offering a simpler and sharper alternative to classical chaining and slicing arguments for generalization bounds. The key idea is a lifting argument that produces information-theoretic analogues of empirical process bounds, such as Dudley's entropy integral. The lifting introduces symmetry, yielding sharp bounds even when the classical Dudley integral is loose. As a by-product, we obtain a concise proof of the majorizing measure theorem, providing explicit constants. The information-theoretic approach provides a soft version of classical localized complexity bounds in generalization theory, but is more concise and does not require the slicing argument. We apply this approach to empirical risk minimization over Sobolev ellipsoids and weak $\ell_q$ balls, obtaining sharper convergence rates or extensions to settings not covered by classical methods.
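For context, the classical Dudley entropy integral referenced in the abstract takes the following standard form; the notation here, including the covering number $N(T, d, \varepsilon)$, is supplied for the reader's reference and is not drawn from the paper itself. For a stochastic process $(X_t)_{t \in T}$ with sub-Gaussian increments with respect to a metric $d$,
\[
\mathbb{E} \sup_{t \in T} X_t \;\le\; C \int_0^{\operatorname{diam}(T)} \sqrt{\log N(T, d, \varepsilon)} \, \mathrm{d}\varepsilon,
\]
where $N(T, d, \varepsilon)$ is the minimal number of $\varepsilon$-balls needed to cover $T$ and $C$ is a universal constant. The abstract's claim is that the lifted, information-theoretic analogue of this bound remains sharp in regimes where the integral above is loose.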