Self-organized learning emerges from coherent coupling of critical neurons

Chuanbo Liu, Jin Wang

Published: 2025/8/28

Abstract

Deep artificial neural networks have surpassed human-level performance across a diverse array of complex learning tasks, establishing themselves as indispensable tools in both social applications and scientific research. Despite these advances, the underlying mechanisms of training in artificial neural networks remain elusive. Here, we propose that artificial neural networks function as adaptive, self-organizing information processing systems in which training is mediated by the coherent coupling of strongly activated, task-specific critical neurons. We demonstrate that such neuronal coupling gives rise to Hebbian-like neural correlation graphs, which undergo a dynamic, second-order connectivity phase transition during the initial stages of training. Concurrently, the connection weights among critical neurons are consistently reinforced while being simultaneously redistributed in a stochastic manner. As a result, a precise balance of neuronal contributions is established, inducing a local concentration within the random loss landscape, which provides a theoretical explanation for generalization capacity. We further identify a later convergence phase transition characterized by a phase boundary in hyperparameter space, driven by the nonequilibrium probability flux through weight space. The critical computational graphs resulting from coherent coupling also decode the predictive rules learned by artificial neural networks, drawing analogies to avalanche-like dynamics observed in biological neural circuits. Our findings suggest that the coherent coupling of critical neurons and the ensuing local concentration within the loss landscapes may represent universal learning mechanisms shared by both artificial and biological neural computation.
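The abstract's central construction, a Hebbian-like correlation graph built from strongly activated neurons, whose largest connected component can be tracked for a connectivity (percolation-like) phase transition, can be sketched in a minimal form. The sketch below is illustrative only and does not reproduce the paper's method: the activation data is synthetic, and the activation and correlation thresholds (`0.2`) are arbitrary placeholders, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for hidden-layer activations: 200 inputs x 50 neurons.
acts = rng.normal(size=(200, 50))

# "Critical" neurons: here, illustratively, those whose mean absolute
# activation exceeds the layer-wide average (a placeholder criterion).
critical = np.flatnonzero(np.abs(acts).mean(axis=0) > np.abs(acts).mean())

# Hebbian-like correlation graph among the critical neurons: an edge
# connects two neurons whose activations co-vary strongly across inputs.
corr = np.corrcoef(acts[:, critical].T)
adj = (np.abs(corr) > 0.2) & ~np.eye(len(critical), dtype=bool)

def largest_component(adj):
    """Size of the largest connected component (depth-first search).

    Tracking this quantity as training progresses is one way to probe
    a connectivity phase transition in the correlation graph.
    """
    n = adj.shape[0]
    seen = np.zeros(n, dtype=bool)
    best = 0
    for s in range(n):
        if seen[s]:
            continue
        stack, size = [s], 0
        seen[s] = True
        while stack:
            u = stack.pop()
            size += 1
            for v in np.flatnonzero(adj[u]):
                if not seen[v]:
                    seen[v] = True
                    stack.append(v)
        best = max(best, size)
    return best

print(largest_component(adj))
```

In an actual experiment one would recompute this graph at successive training checkpoints and look for the component size growing sharply at a critical point, the second-order transition the abstract describes.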
