Federated Foundation Models in Harsh Wireless Environments: Prospects, Challenges, and Future Directions
Evan Chen, Seyyedali Hosseinalipour, Christopher G. Brinton, David J. Love
Published: 2025/9/2
Abstract
Foundation models (FMs) have shown remarkable capabilities in generalized intelligence, multimodal understanding, and adaptive learning across a wide range of domains. However, their deployment in harsh or austere environments -- characterized by intermittent connectivity, limited computation, noisy data, and dynamically changing network topologies -- remains an open challenge. Existing distributed learning methods such as federated learning (FL) struggle in such settings due to their reliance on stable infrastructure, synchronized updates, and resource-intensive training. In this work, we explore the potential of Federated Foundation Models (FFMs) as a promising paradigm for addressing these limitations. By integrating the scalability and generalization power of FMs with novel decentralized, communication-aware FL frameworks, we aim to enable robust, energy-efficient, and adaptive intelligence under extreme and adversarial conditions. We present a detailed breakdown of the system-level constraints of harsh environments and discuss open research challenges in communication design, model robustness, and energy-efficient personalization for these settings.