Nesterov acceleration for strongly convex-strongly concave bilinear saddle point problems: discrete and continuous-time approaches
Xin He, Ya-Ping Fang
Published: 2025/9/10
Abstract
In this paper, we study a bilinear saddle point problem of the form $\min_{x}\max_{y} F(x) + \langle Ax, y \rangle - G(y)$, where $F$ and $G$ are $\mu_F$- and $\mu_G$-strongly convex functions, respectively. By incorporating Nesterov acceleration for strongly convex optimization, we first propose an optimal first-order discrete primal-dual gradient algorithm. We show that it achieves the optimal convergence rate $\mathcal{O}\left(\left(1 - \min\left\{\sqrt{\frac{\mu_F}{L_F}}, \sqrt{\frac{\mu_G}{L_G}}\right\}\right)^k\right)$ for both the primal-dual gap and the iterates, where $L_F$ and $L_G$ denote the smoothness constants of $F$ and $G$, respectively. We further develop a continuous-time accelerated primal-dual dynamical system with constant damping. Using a Lyapunov analysis, we establish the existence and uniqueness of a global solution, as well as the linear convergence rate $\mathcal{O}(e^{-\min\{\sqrt{\mu_F},\sqrt{\mu_G}\}t})$. Notably, when $A = 0$, our methods recover the classical Nesterov accelerated methods for strongly convex unconstrained problems in both discrete and continuous time. Numerical experiments are presented to support the theoretical convergence rates.
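As a rough numerical illustration of the problem class (not the authors' exact algorithm), the sketch below runs a generic Nesterov-style extrapolated primal-dual gradient iteration on a toy instance with quadratic $F$ and $G$; the step sizes, momentum weight, and toy data are hypothetical choices made only for this demo.

```python
# Illustrative sketch only: a Nesterov-style extrapolated primal-dual gradient
# iteration for the bilinear saddle point problem
#     min_x max_y  F(x) + <Ax, y> - G(y)
# on a toy strongly convex quadratic instance. Parameters below are
# hypothetical placeholders, not the tuned constants from the paper.
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 40

# Strongly convex quadratics: F(x) = 0.5 x^T diag(d_F) x, G(y) = 0.5 y^T diag(d_G) y
mu_F, L_F = 1.0, 10.0
mu_G, L_G = 1.0, 10.0
d_F = rng.uniform(mu_F, L_F, n)
d_G = rng.uniform(mu_G, L_G, m)
A = rng.standard_normal((m, n)) / np.sqrt(n)   # bilinear coupling matrix

grad_F = lambda x: d_F * x
grad_G = lambda y: d_G * y

# Hypothetical parameters in the spirit of Nesterov acceleration
tau, sigma = 1.0 / L_F, 1.0 / L_G                      # primal / dual step sizes
q = min(np.sqrt(mu_F / L_F), np.sqrt(mu_G / L_G))
theta = (1.0 - q) / (1.0 + q)                          # momentum / extrapolation weight

x = x_prev = np.zeros(n)
y = y_prev = np.zeros(m)
for k in range(1000):
    # Extrapolated (momentum) points
    xh = x + theta * (x - x_prev)
    yh = y + theta * (y - y_prev)
    x_prev, y_prev = x, y
    # Gradient descent in x, gradient ascent in y, evaluated at the momentum points
    x = xh - tau * (grad_F(xh) + A.T @ yh)
    y = yh + sigma * (A @ xh - grad_G(yh))

# The unique saddle point of this toy instance is (0, 0)
print("distance to saddle point:", np.linalg.norm(x) + np.linalg.norm(y))
```

The printed distance decays roughly geometrically in the iteration count, which is the qualitative behavior the abstract's rate statement describes; the exact contraction factor of this sketch is not the paper's optimal one.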