A Proximal Gradient Method With Probabilistic Multi-Gossip Communications for Decentralized Composite Optimization
Luyao Guo, Luqing Wang, Xinli Shi, Jinde Cao
Published: 2023/12/19
Abstract
Decentralized optimization methods with local updates have recently gained attention for their provable ability to accelerate communication. In these methods, nodes perform several local computation steps between communication rounds. Nevertheless, this capability is effective only when the network is sufficiently well-connected and the loss function is smooth. In this paper, we propose MG-Skip, a communication-efficient method with probabilistic local updates and multi-gossip communications for decentralized composite (smooth + nonsmooth) optimization, whose stepsize is independent of the number of local updates and the network topology. For any undirected and connected network, MG-Skip allows the multi-gossip communications to be skipped in most iterations in the strongly convex setting, while its computation complexity is $\mathcal{O}\left(\kappa \log \frac{1}{\epsilon}\right)$ and its communication complexity is only $\mathcal{O}\left(\sqrt{\frac{\kappa}{1-\rho}} \log \frac{1}{\epsilon}\right)$, where $\kappa$ is the condition number of the loss function, $\rho$ reflects the connectivity of the network topology, and $\epsilon$ is the target accuracy. These theoretical results show that MG-Skip achieves provable communication acceleration, thereby validating the benefit of local updates in the nonsmooth setting.
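To make the communication pattern concrete, below is a minimal, illustrative Python sketch of probabilistic multi-gossip combined with proximal-gradient local updates. It is not the MG-Skip algorithm itself (the exact recursions, correction terms, and stepsize rules are given in the paper); the skipping probability `p`, the number of gossip rounds `K`, the mixing matrix `W`, and the soft-thresholding prox are all assumed placeholders.

```python
import numpy as np

# Illustrative sketch only: probabilistic multi-gossip with proximal-gradient
# local updates, in the spirit of the abstract. The exact MG-Skip recursions
# are in the paper; prox_g, grad_f, W, p, and K below are assumed placeholders.

def prox_g(x, alpha):
    """Proximal operator of the nonsmooth term g; here soft-thresholding
    (i.e., g = alpha * ||x||_1) as a stand-in example."""
    return np.sign(x) * np.maximum(np.abs(x) - alpha, 0.0)

def probabilistic_multi_gossip_sketch(grad_f, X0, W, stepsize,
                                      p=0.1, K=5, iters=1000, seed=0):
    """X0: (n_nodes, dim) stacked local iterates; W: gossip (mixing) matrix.
    Each iteration: every node takes a local proximal-gradient step; with
    probability p, K gossip rounds (multiplication by W, K times) are run,
    so communication is skipped in most iterations when p is small."""
    rng = np.random.default_rng(seed)
    X = X0.copy()
    for _ in range(iters):
        # Local computation at every node (performed every iteration).
        X = prox_g(X - stepsize * grad_f(X), stepsize)
        # Probabilistic multi-gossip communication (usually skipped).
        if rng.random() < p:
            for _ in range(K):
                X = W @ X
    return X
```

Under this sketch, the expected number of gossip rounds per iteration is $pK$, which illustrates how skipping communication in most iterations can make the communication complexity much smaller than the computation complexity.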