Protected Probabilistic Classification Library

Ivan Petej

Published: 2025/9/14

Abstract

This paper introduces a new Python package designed to address the calibration of probabilistic classifiers under dataset shift. The method is demonstrated in binary and multi-class settings, and its effectiveness is compared against a number of existing post-hoc calibration methods. The empirical results are promising and suggest that the technique can be helpful in a variety of batch and online classification problems where the underlying data distribution changes between the training and test sets.
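To make the problem setting concrete, the following is a minimal, self-contained sketch of the kind of post-hoc calibration baseline the abstract refers to. It is not the package's own method or API: it fits ordinary Platt scaling (a sigmoid map from raw classifier scores to probabilities) on synthetic training data, then evaluates it on a test set whose score distribution has shifted. All names, data-generating parameters, and the choice of Platt scaling are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic training data: raw scores s with true P(y=1 | s) = sigmoid(2s - 1).
n = 5000
s_train = rng.normal(0.0, 1.0, n)
y_train = rng.random(n) < sigmoid(2.0 * s_train - 1.0)

# Platt scaling: fit a, b to minimise the log loss of sigmoid(a*s + b)
# by plain gradient descent on the convex objective.
a, b = 1.0, 0.0
for _ in range(2000):
    p = sigmoid(a * s_train + b)
    a -= 0.5 * np.mean((p - y_train) * s_train)
    b -= 0.5 * np.mean(p - y_train)

# Shifted test set: the score distribution moves (covariate shift),
# while the conditional label probability stays the same.
s_test = rng.normal(1.0, 1.0, n)
y_test = rng.random(n) < sigmoid(2.0 * s_test - 1.0)

def brier(p, y):
    return np.mean((p - y) ** 2)

raw = brier(sigmoid(s_test), y_test)          # uncalibrated probabilities
cal = brier(sigmoid(a * s_test + b), y_test)  # Platt-calibrated probabilities
print(f"Brier score, raw:        {raw:.4f}")
print(f"Brier score, calibrated: {cal:.4f}")
```

In this toy example the sigmoid family contains the true conditional, so calibration transfers under the shift; the harder case the paper targets is when the shift itself invalidates the calibration map fitted on training data.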
