Latent-Space Mean-Field Theory for Deep BitNet-like Training: Constrained Gradient Flows with Smooth Quantization and STE Limits

Dongwon Kim, Dongseok Lee

Published: 2025/8/29

Abstract

This work develops a mean-field analysis of the asymptotic behavior of deep BitNet-like architectures as the smooth quantization parameter approaches zero. We establish that the empirical measures of latent weights converge weakly to solutions of constrained continuity equations as the quantization smoothing vanishes. Our main theoretical contribution shows that the exponential decay inherent in smooth quantization cancels the apparent singularities, yielding bounds on the mean-field dynamics that are uniform in the smoothing parameter. Under standard regularity assumptions, we prove convergence to a well-defined limit, providing, via distributional analysis, a mathematical foundation for gradient-based training of quantized neural networks.
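For orientation, here is a minimal sketch in LaTeX of the kind of objects the abstract refers to; the notation ($Q_\tau$, $\mu_t^N$, $v[\mu_t]$, $\mathcal{W}$) is assumed for illustration and may differ from the paper's own definitions. A standard smooth quantizer with smoothing parameter $\tau$ is

\[
% Assumed smooth quantizer: tanh-smoothing of the sign function,
% converging pointwise to sign(w) as the smoothing parameter tau -> 0+.
Q_\tau(w) = \tanh\!\left(\frac{w}{\tau}\right) \;\longrightarrow\; \operatorname{sign}(w) \quad \text{as } \tau \to 0^{+},
\]

the empirical measure of $N$ latent weights and its weak limit are

\[
% Empirical measure of the latent weights; weak convergence as N -> infinity.
\mu_t^{N} = \frac{1}{N}\sum_{i=1}^{N} \delta_{w_i(t)} \;\rightharpoonup\; \mu_t,
\]

and a constrained continuity equation of the type the abstract describes reads

\[
% Mean-field continuity equation with support constrained to a set W
% (e.g., a box constraint on the latent weights).
\partial_t \mu_t + \nabla_w \cdot \bigl( \mu_t \, v[\mu_t] \bigr) = 0, \qquad \operatorname{supp}(\mu_t) \subseteq \mathcal{W},
\]

where $v[\mu_t]$ denotes the loss-induced mean-field velocity. In this picture, the straight-through-estimator (STE) limit corresponds to replacing the gradient of $Q_\tau$ by a surrogate as $\tau \to 0^{+}$, which is where the exponential decay of $Q_\tau'$ away from the quantization thresholds plays the cancellation role claimed above.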
