Hausdorff consistency of the MLE in folded normal and Gaussian mixtures

Koustav Mallik

Published: August 25, 2025

Abstract

We develop a constant-tracking likelihood theory for two nonregular models: the folded normal and finite Gaussian mixtures. For the folded normal, we prove boundary coercivity of the profiled likelihood, show via an implicit-function argument that the profile path of the location parameter exists and is strictly decreasing, and establish a unique profile maximizer in the scale parameter. Deterministic envelopes for the log-likelihood, the score, and the Hessian yield elementary uniform laws of large numbers with finite-sample bounds, avoiding covering numbers. Identification and Kullback-Leibler separation then deliver consistency. A sixth-order expansion of the log hyperbolic cosine yields a quadratic-minus-quartic contrast around zero, leading to a nonstandard fourth-root rate, n^{1/4}, for the location estimator at the kink and a standard square-root rate, n^{1/2}, for the scale estimator, with a uniform remainder bound. For finite Gaussian mixtures with distinct components and positive weights, we give a short identifiability proof up to label permutations via Fourier and Vandermonde arguments, derive two-sided Gaussian envelopes and responsibility-based gradient bounds on compact sieves, and obtain almost-sure and high-probability uniform laws with explicit constants. Using a minimum-matching distance on permutation orbits, we prove Hausdorff consistency on fixed and growing sieves. We quantify variance-collapse spikes via an explicit spike-bonus bound and show that a quadratic penalty in location and log-scale dominates this bonus, making the penalized likelihood coercive; when the penalty shrinks but the sample size times the penalty diverges, penalized estimators remain consistent. All proofs are constructive, track constants, and verify measurability of maximizers; we also give practical guidance on tuning sieves, penalties, and EM-style optimization.
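For readers who want the mechanism behind the quadratic-minus-quartic contrast, here is a minimal sketch using the standard folded normal density, writing mu for the location and sigma for the scale (the paper's exact parameterization and remainder constants are not reproduced here):

\[
f_{\mu,\sigma}(x) \;=\; \sqrt{\frac{2}{\pi\sigma^{2}}}\,
\exp\!\Big(-\frac{x^{2}+\mu^{2}}{2\sigma^{2}}\Big)\,
\cosh\!\Big(\frac{\mu x}{\sigma^{2}}\Big), \qquad x \ge 0,
\]
so each log-likelihood term contains \(\log\cosh(\mu x/\sigma^{2})\), and
\[
\log\cosh t \;=\; \frac{t^{2}}{2} \;-\; \frac{t^{4}}{12} \;+\; \frac{t^{6}}{45} \;+\; O(t^{8})
\qquad (t \to 0).
\]

At the kink \(\mu = 0\) one has \(\mathbb{E}[X^{2}] = \sigma^{2}\), so in expectation the quadratic term \(t^{2}/2\) exactly offsets the \(-\mu^{2}/(2\sigma^{2})\) factor of the Gaussian exponent; the first nonvanishing term of the population contrast is quartic, \(\mu^{4}/(4\sigma^{4}) + O(\mu^{6})\), and a quartic contrast against a root-n empirical process is what produces the n^{1/4} rate quoted above.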
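Likewise, a one-line version of the variance-collapse spike that the penalty must dominate, sketched for a two-component mixture with \(\varphi_{\mu,\sigma}\) the normal density and the first location pinned at an observation, \(\mu_{1} = x_{1}\) (the paper's spike-bonus bound presumably tracks these constants explicitly):

\[
\log L_n \;\ge\; \log\frac{w}{\sqrt{2\pi}\,\sigma_{1}}
\;+\; \sum_{i \ne 1} \log\!\big((1-w)\,\varphi_{\mu_{2},\sigma_{2}}(x_i)\big)
\;=\; \log\frac{1}{\sigma_{1}} \;+\; O(1) \qquad (\sigma_{1} \downarrow 0),
\]
so the unpenalized likelihood is unbounded, while a quadratic log-scale penalty satisfies
\[
\lambda\big(\log\sigma_{1}\big)^{2} \;\gg\; \log\frac{1}{\sigma_{1}} \qquad (\sigma_{1} \downarrow 0)
\]
for any \(\lambda > 0\), driving the penalized objective to \(-\infty\) along collapsing sequences and restoring coercivity, consistent with the penalized-likelihood claim above.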