An Efficient Subspace Algorithm for Federated Learning on Heterogeneous Data

Jiaojiao Zhang, Yuqi Xu, Kun Yuan

Published: 2025/9/5

Abstract

This work addresses key challenges in applying federated learning to large-scale deep neural networks, particularly client drift caused by data heterogeneity across clients and the high costs of communication, computation, and memory. We propose FedSub, an efficient subspace algorithm for federated learning on heterogeneous data. Specifically, FedSub uses subspace projection to constrain each client's local updates to low-dimensional subspaces, thereby reducing communication, computation, and memory costs. It also incorporates low-dimensional dual variables to mitigate client drift. We provide a convergence analysis that reveals how key factors, such as the step size and the subspace projection matrices, affect convergence. Experimental results demonstrate the efficiency of FedSub.
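To make the core idea concrete, the following is a minimal sketch of one client's subspace-projected, dual-corrected gradient step. It assumes a fixed column-orthonormal projection matrix `P` and plain SGD; all variable names and the specific update rule are illustrative, not the paper's actual FedSub algorithm.

```python
import numpy as np

# Illustrative sketch of a subspace-projected local update (not the
# paper's exact method). P (d x k) has orthonormal columns; only the
# k-dimensional coordinates would need to be communicated.

rng = np.random.default_rng(0)
d, k = 100, 10                                # full and subspace dimensions (illustrative)
P, _ = np.linalg.qr(rng.normal(size=(d, k)))  # column-orthonormal subspace basis

x = rng.normal(size=d)    # client model parameters
g = rng.normal(size=d)    # stochastic gradient at x
dual = np.zeros(k)        # low-dimensional dual variable (drift correction)

g_low = P.T @ g                # project gradient into the subspace (k numbers)
step = 0.1 * (g_low - dual)    # dual-corrected low-dimensional step
x = x - P @ step               # lift back; the update stays within span(P)
```

Because the lifted update `P @ step` lies entirely in the column span of `P`, each client's local drift is confined to the shared low-dimensional subspace, which is what keeps the per-round communication at k rather than d numbers.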
