Conformal prediction without knowledge of labeled calibration data
Jonas Flechsig, Maximilian Pilz
Published: 2025/9/12
Abstract
We extend conformal prediction beyond the setting that relies on labeled calibration data. By replacing the calibration scores with suitable estimates, we construct conformity sets $C$ for classification and regression models that require only unlabeled calibration data. For a classification model with accuracy $1-\beta$, we prove that these conformity sets guarantee a coverage of $P(Y \in C) \geq 1-\alpha-\beta$ for an arbitrary parameter $\alpha \in (0,1)$. The same coverage guarantee holds for regression models if the accuracy is replaced by an analogous exactness measure. Finally, we describe how to apply the theoretical results in practice.
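
The sketch below illustrates, under stated assumptions, one way the idea could look in code for classification: a standard split-conformal construction in which the calibration scores are estimated from the model's own predicted labels (pseudo-labels) on the unlabeled calibration set, so that a model with accuracy $1-\beta$ misestimates at most a $\beta$-fraction of scores. The function name `conformity_sets_unlabeled` and the pseudo-label score $1-\hat{p}$ are illustrative assumptions, not necessarily the paper's exact construction.

```python
# Minimal sketch (assumed construction, not the authors' exact method):
# split conformal classification where the calibration scores are estimated
# from unlabeled data using the model's own predicted labels as pseudo-labels.
# With accuracy 1 - beta, at most a beta-fraction of the estimated scores can
# differ from the true ones, which matches the intuition behind the stated
# guarantee P(Y in C) >= 1 - alpha - beta.

import numpy as np

def conformity_sets_unlabeled(probs_cal, probs_test, alpha):
    """probs_cal, probs_test: arrays of predicted class probabilities
    (each row sums to 1). Returns a boolean matrix whose rows are the
    conformity sets C for the test points."""
    n = probs_cal.shape[0]

    # Estimated nonconformity score: 1 - probability of the *predicted*
    # class, since true labels are unknown on the calibration set.
    pseudo_labels = probs_cal.argmax(axis=1)
    scores = 1.0 - probs_cal[np.arange(n), pseudo_labels]

    # Standard split-conformal quantile with finite-sample correction.
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q_hat = np.quantile(scores, q_level, method="higher")

    # Include every class whose score 1 - p does not exceed the threshold.
    return (1.0 - probs_test) <= q_hat
```

Under this reading, the construction is identical to ordinary split conformal prediction except that the score estimates come from pseudo-labels; the coverage then degrades by at most the model's error rate $\beta$.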