Bridging Control Variates and Regression Adjustment in A/B Testing: From Design-Based to Model-Based Frameworks

Yu Zhang, Bokui Wan, Yongli Qin

Published: 2025/9/17

Abstract

A/B testing serves as the gold standard for large-scale, data-driven decision making in online businesses. To mitigate metric variability and enhance testing sensitivity, control variates and regression adjustment have emerged as prominent variance-reduction techniques, leveraging pre-experiment data to improve estimator performance. Over the past decade, these methods have spawned numerous derivatives, yet their theoretical connections and comparative properties remain underexplored. In this paper, we conduct a comprehensive analysis of their statistical properties, establish a formal bridge between the two frameworks in practical implementations, and extend the investigation from design-based to model-based frameworks. Through simulation studies and real-world experiments at ByteDance, we validate our theoretical insights across both frameworks. Our work aims to provide rigorous guidance for practitioners in online controlled experiments, addressing critical considerations of internal and external validity. The recommended method, control variates with group-specific coefficient estimates, has been fully implemented and deployed on ByteDance's experimental platform.
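To make the recommended estimator concrete, the following is a minimal sketch of a control-variates (CUPED-style) adjustment with group-specific coefficient estimates. The simulated data, variable names, and the linear outcome model are illustrative assumptions for this sketch, not the paper's actual setup or ByteDance's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated experiment: pre-experiment covariate x,
# random treatment assignment, and in-experiment metric y.
n = 10_000
x = rng.normal(size=n)                    # pre-experiment metric
arm = rng.integers(0, 2, size=n)          # 0 = control, 1 = treatment
y = 0.1 * arm + 0.8 * x + rng.normal(scale=0.5, size=n)  # true effect = 0.1

x_bar = x.mean()  # pooled pre-experiment mean, shared across arms

def cv_adjust(y_g, x_g, x_bar):
    # Control-variates adjustment within one group: subtract
    # theta * (x - x_bar), with theta = cov(x, y) / var(x)
    # estimated from that group's data alone (group-specific).
    theta = np.cov(x_g, y_g, ddof=1)[0, 1] / np.var(x_g, ddof=1)
    return y_g - theta * (x_g - x_bar)

# Estimate theta separately in each arm, then adjust that arm's metric.
y_adj = np.empty_like(y)
for g in (0, 1):
    m = arm == g
    y_adj[m] = cv_adjust(y[m], x[m], x_bar)

naive = y[arm == 1].mean() - y[arm == 0].mean()
adjusted = y_adj[arm == 1].mean() - y_adj[arm == 0].mean()
# Both estimate the treatment effect; the adjusted metric has much
# lower variance because x explains most of the noise in y.
```

The shared centering by the pooled mean `x_bar` lets the adjustment also correct chance imbalance in `x` between arms, while the per-arm `theta` is what distinguishes the group-specific variant from a single pooled coefficient.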