Towards Communication-Efficient Decentralized Federated Graph Learning over Non-IID Data
Shilong Wang, Jianchun Liu, Hongli Xu, Chenxia Tang, Qianpiao Ma, Liusheng Huang
Published: 2025/9/10
Abstract
Decentralized Federated Graph Learning (DFGL) overcomes potential bottlenecks of the parameter server in Federated Graph Learning (FGL) by establishing a peer-to-peer (P2P) communication network among workers. However, while extensive cross-worker exchange of graph node embeddings is crucial for DFGL training, it incurs substantial communication costs. Most existing works construct sparse network topologies or apply graph neighbor sampling to alleviate this communication overhead. Intuitively, integrating the two approaches promises to improve communication efficiency on both fronts. However, our preliminary experiments indicate that directly combining these methods without joint optimization leads to significant degradation in training performance. To address this issue, we propose Duplex, a unified framework that jointly optimizes network topology and graph sampling by accounting for their coupled relationship, thereby significantly reducing communication cost while enhancing training performance in DFGL. To cope with practical DFGL challenges, e.g., statistical heterogeneity and dynamic network environments, Duplex introduces a learning-driven algorithm that adaptively determines the optimal network topology and graph sampling ratio for each worker. Experimental results demonstrate that, compared to baselines, Duplex reduces completion time by 20.1%--48.8% and communication cost by 16.7%--37.6% to reach target accuracy, while improving accuracy by 3.3%--7.9% under identical resource budgets.