Effective continuous equations for adaptive SGD: a stochastic analysis view

Luca Callisti, Marco Romito, Francesco Triggiano

Published: 2025/9/25

Abstract

We present a theoretical analysis of some popular adaptive Stochastic Gradient Descent (SGD) methods in the small learning rate regime. Using the stochastic modified equations framework introduced by Li et al., we derive effective continuous stochastic dynamics for these methods. Our key contribution is to show that the sampling-induced noise in SGD manifests in the limit as independent Brownian motions driving the evolution of the parameters and of the second moment of the gradient. Furthermore, extending the approach of Malladi et al., we investigate scaling rules between the learning rate and the key hyperparameters of adaptive methods, characterising all non-trivial limiting dynamics.
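For orientation, the display below sketches the kind of effective equation this framework produces; it is an illustration based on Li et al.'s first-order stochastic modified equation for plain SGD, not a formula quoted from this paper. Here $\eta$ is the learning rate, $f$ the loss, $\Sigma$ the covariance of the minibatch gradient noise, and $W_t$ a Brownian motion:

\[
  \mathrm{d}\Theta_t \;=\; -\nabla f(\Theta_t)\,\mathrm{d}t \;+\; \sqrt{\eta}\,\Sigma(\Theta_t)^{1/2}\,\mathrm{d}W_t .
\]

For an adaptive method such as Adam, the result summarised above suggests an analogous system in which the parameter process and the second-moment process are driven by independent Brownian motions, provided the hyperparameters are rescaled with $\eta$; a scaling of the form $1-\beta_2 = c\,\eta$ with $c$ fixed as $\eta \to 0$, in the spirit of Malladi et al., is one assumption under which such a limit remains non-degenerate.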
