Small-Covariance Noise-to-State Stability of Stochastic Systems and Its Applications to Stochastic Gradient Dynamics
Leilei Cui, Zhong-Ping Jiang, Eduardo D. Sontag
Published: 2025/9/29
Abstract
This paper studies gradient dynamics subject to additive random noise, which may arise from sources such as stochastic gradient estimation, measurement noise, or stochastic sampling errors. To analyze the robustness of such stochastic gradient systems, the concept of small-covariance noise-to-state stability (NSS) is introduced, along with a Lyapunov-based characterization. Furthermore, the classical Polyak-Łojasiewicz (PL) condition on the objective function is generalized to the $\mathcal{K}$-PL condition via comparison functions, thereby extending its applicability to a broader class of optimization problems. It is shown that the stochastic gradient dynamics exhibit small-covariance NSS if the objective function satisfies the $\mathcal{K}$-PL condition and possesses a globally Lipschitz continuous gradient. This result implies that the trajectories of stochastic gradient dynamics converge to a neighborhood of the optimum with high probability, with the size of the neighborhood determined by the noise covariance. Moreover, if the $\mathcal{K}$-PL condition is strengthened to a $\mathcal{K}_\infty$-PL condition, the dynamics are NSS; whereas if it is weakened to a general positive-definite-PL condition, the dynamics exhibit integral NSS. The results further extend to objectives without globally Lipschitz gradients through appropriate step-size tuning. The proposed framework is further applied to the robustness analysis of policy optimization for the linear quadratic regulator (LQR) and logistic regression.
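The qualitative behavior described in the abstract can be illustrated numerically. Below is a minimal sketch (not from the paper) of gradient descent with additive noise on the quadratic objective $f(x) = \tfrac{1}{2}x^2$, which satisfies the classical PL condition $|f'(x)|^2 \ge 2\mu\,(f(x) - f^*)$ with $\mu = 1$ and $f^* = 0$. The function name and parameters are illustrative choices, not the authors' notation; the point is that the iterates settle in a neighborhood of the optimum whose size scales with the noise level, consistent with a noise-to-state-stability-style bound.

```python
import random

def noisy_gradient_descent(x0, step, sigma, iters, seed=0):
    """Run x_{k+1} = x_k - step * (f'(x_k) + sigma * w_k) for f(x) = 0.5*x^2,
    where w_k is standard Gaussian noise (so the noise covariance is sigma^2)."""
    rng = random.Random(seed)
    x = x0
    traj = [x]
    for _ in range(iters):
        grad = x  # gradient of 0.5 * x^2
        x = x - step * (grad + sigma * rng.gauss(0.0, 1.0))
        traj.append(x)
    return traj

# After a transient, the iterates fluctuate inside a small neighborhood
# of the optimum x* = 0; its radius shrinks with sigma.
tail = noisy_gradient_descent(x0=10.0, step=0.1, sigma=0.1, iters=2000)[-500:]
max_dev = max(abs(x) for x in tail)

# For comparison, the noiseless run (sigma = 0) converges to x* itself.
noiseless = noisy_gradient_descent(x0=10.0, step=0.1, sigma=0.0, iters=2000)[-1]
```

Here `max_dev` stays small but nonzero (it reflects the stationary fluctuation driven by the noise covariance), while the noiseless iterate converges to the optimum up to numerical precision.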