Characterizing and Minimizing Divergent Delivery in Meta Advertising Experiments
Gordon Burtch, Robert Moakler, Brett R. Gordon, Poppy Zhang, Shawndra Hill
Published: August 28, 2025
Abstract
Many digital platforms offer advertisers experimentation tools, such as Meta's Lift and A/B tests, to optimize their ad campaigns. Lift tests compare outcomes between users eligible to see ads and users in a no-ad control group. In contrast, A/B tests compare users exposed to alternative ad configurations, absent any control group. The latter setup raises the prospect of divergent delivery: ad delivery algorithms may target different ad variants to different audience segments. This complicates causal interpretation, because results may reflect both the effectiveness of ad content and differences in audience composition. We offer three key contributions. First, we make clear that divergent delivery is specific to A/B tests and is intentional, informing advertisers about how their ads perform in practice. Second, we measure divergent delivery at scale, considering 3,204 Lift tests and 181,890 A/B tests. Lift tests show no meaningful audience imbalance, confirming their causal validity, while A/B tests show clear imbalance, as expected. Third, we demonstrate that campaign configuration choices can reduce divergent delivery in A/B tests, lessening algorithmic influence on results. While no configuration fully eliminates divergent delivery, we offer evidence-based guidance for advertisers seeking more generalizable insights about ad content from A/B tests.