Learning the Maximum of a Hölder Function from Inexact Data
Simon Foucart
Published: 2025/9/20
Abstract
Within the theoretical framework of Optimal Recovery, this article determines the {\em best} procedures to learn a quantity of interest depending on a H\"older function acquired via inexact point evaluations at fixed datasites. {\em Best} here refers to procedures minimizing the worst-case error. The elementary arguments hint at the possibility of tackling nonlinear quantities of interest, with a particular focus on the function maximum. In a local setting, i.e., for a fixed data vector, the optimal procedure (outputting the so-called Chebyshev center) is described precisely relative to a general model of inexact evaluations. Relative to a slightly more restricted model and in a global setting, i.e., uniformly over all data vectors, another optimal procedure is put forward, showing how to correct the natural underestimate that simply returns the maximum of the data vector. Jittered data are also briefly discussed as a by-product of evaluating the minimal worst-case error optimized over the datasites.
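To make the local picture concrete, here is a small Python sketch, not taken from the paper: it assumes the domain $[0,1]$, a H\"older model $|f(s)-f(t)| \le H|s-t|^\alpha$, and evaluations $y_i$ with $|f(x_i)-y_i| \le \varepsilon$. Under these assumptions, every admissible $f$ is squeezed between a lower and an upper envelope (both themselves admissible), so the maximum of $f$ lies in a computable interval; the midpoint of that interval is a Chebyshev-center-style estimate, and it generally exceeds the naive answer $\max_i y_i$. The function name and the grid-based evaluation are illustrative choices, not the article's construction.

```python
import numpy as np

def max_interval(x, y, eps, H, alpha, grid_size=10001):
    """Bracket max f over [0, 1] for Hoelder(H, alpha) functions f
    satisfying |f(x_i) - y_i| <= eps, and return a midpoint estimate.

    Returns (lo, hi, estimate) with lo <= max f <= hi for every
    admissible f, and estimate = (lo + hi) / 2.
    """
    t = np.linspace(0.0, 1.0, grid_size)
    # Pointwise upper envelope: f(t) <= min_i (y_i + eps + H |t - x_i|^alpha)
    upper = np.min(y[None, :] + eps + H * np.abs(t[:, None] - x[None, :]) ** alpha, axis=1)
    # Pointwise lower envelope: f(t) >= max_i (y_i - eps - H |t - x_i|^alpha)
    lower = np.max(y[None, :] - eps - H * np.abs(t[:, None] - x[None, :]) ** alpha, axis=1)
    # Both envelopes are admissible, so max f ranges exactly over [lo, hi]
    lo, hi = lower.max(), upper.max()
    return lo, hi, (lo + hi) / 2
```

With exact data ($\varepsilon = 0$) at $x = (0, 1)$, $y = (0, 0)$, $H = 1$, $\alpha = 1$, the upper envelope peaks at $\min(t, 1-t)$, so the maximum lies in $[0, 1/2]$ and the midpoint estimate $1/4$ already corrects the underestimate $\max_i y_i = 0$.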