Parametric convergence rate of a non-parametric estimator in multivariate mixtures of power series distributions under conditional independence
Fadoua Balabdaoui, Harald Besdziek, Yong Wang
Published: 2025/9/5
Abstract
The conditional independence assumption has recently appeared in a growing body of literature on the estimation of multivariate mixtures. We consider here conditionally independent multivariate mixtures of power series distributions with infinite support, a class that includes Poisson, geometric, and negative binomial mixtures. We show that for all these mixtures, the non-parametric maximum likelihood estimator converges to the truth at the rate $(\log (nd))^{1+d/2} n^{-1/2}$ in the Hellinger distance, where $n$ denotes the size of the observed sample and $d$ the dimension of the mixture. Using this result, we then construct a new non-parametric estimator, based on the maximum likelihood estimator, that converges at the parametric rate $n^{-1/2}$ in all $\ell_p$-distances for $p \ge 1$. These convergence rates are supported by simulations, and the theory is illustrated on the well-known V\'{e}lib dataset of the Paris bike-sharing system. We also introduce a procedure for testing whether the conditional independence assumption is satisfied for a given sample. Applied to several multivariate mixtures with varying levels of dependence, this procedure is shown to distinguish well between conditionally independent and dependent mixtures. Finally, we use it to investigate whether conditional independence holds for the V\'{e}lib dataset.