A theoretical basis for model collapse in recursive training
Vivek Shripad Borkar
Published: 2025/6/11
Abstract
It is known that recursive training from generative models can lead to the so-called `collapse' of the simulated probability distribution. This note shows that one in fact obtains two different asymptotic behaviours, depending on whether an external source, however minor, also contributes samples.
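To make the dichotomy described in the abstract concrete, the following is a minimal simulation sketch (not taken from the paper) using the standard toy setting of recursively refitting a one-dimensional Gaussian to its own samples. The function name `recursive_fit` and the parameter `external_frac` (the assumed fraction of training samples drawn from the true distribution rather than the current model) are illustrative choices, not the author's notation. Without external data the fitted variance drifts towards zero over generations; with even a small external contribution it remains bounded away from zero.

```python
import numpy as np

def recursive_fit(n_generations=200, n_samples=100, external_frac=0.0, seed=0):
    """Recursively fit a 1-D Gaussian to its own samples.

    external_frac: fraction of each generation's training set drawn from
    the true distribution N(0, 1) instead of the current model.
    Returns the fitted standard deviation after each generation.
    """
    rng = np.random.default_rng(seed)
    mu, sigma = 0.0, 1.0                    # start at the true distribution
    sigmas = []
    for _ in range(n_generations):
        n_ext = int(external_frac * n_samples)
        model_samples = rng.normal(mu, sigma, n_samples - n_ext)
        true_samples = rng.normal(0.0, 1.0, n_ext)   # external source
        data = np.concatenate([model_samples, true_samples])
        mu, sigma = data.mean(), data.std()           # maximum-likelihood refit
        sigmas.append(sigma)
    return np.array(sigmas)

# Pure self-training: the fitted variance collapses towards zero.
print("no external data :", recursive_fit(external_frac=0.0)[-1])
# A small external contribution keeps the variance from collapsing.
print("5% external data :", recursive_fit(external_frac=0.05)[-1])
```

This sketch only illustrates the qualitative contrast between the two regimes; the paper's actual analysis is theoretical and does not depend on this particular toy model.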