RSU-Assisted Resource Allocation for Collaborative Perception
Guowei Liu, Le Liang, Chongtao Guo, Hao Ye, Shi Jin
Published: 2025/9/22
Abstract
As a pivotal technology for autonomous driving, collaborative perception enables vehicular agents to exchange perceptual data through vehicle-to-everything (V2X) communications, thereby enhancing the perception accuracy of all collaborators. However, existing collaborative perception frameworks often assume ample communication resources, an assumption that rarely holds in real-world vehicular networks. To address this challenge, this paper investigates the problem of communication resource allocation for collaborative perception and proposes RACooper, a novel roadside unit (RSU)-assisted resource allocation framework that maximizes perception accuracy under constrained communication resources. RACooper leverages a hierarchical reinforcement learning model to dynamically allocate communication resources while accounting for real-time sensing data and the channel dynamics induced by vehicular mobility. By jointly optimizing over spatial confidence metrics and channel state information, our approach ensures efficient feature transmission, thereby enhancing the effectiveness of collaborative perception. Simulation results demonstrate that RACooper achieves significant improvements in perception accuracy over conventional baseline algorithms, especially in bandwidth-constrained scenarios.
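To make the allocation problem concrete, the following is a minimal, illustrative sketch of the kind of decision the framework must make each scheduling round: choosing which collaborators' features to transmit given per-vehicle spatial confidence and channel quality under a bandwidth budget. This toy greedy heuristic is an assumption for illustration only; all names (confidence, channel_gain, feature_bits, total_bw) are hypothetical, and the paper's actual RACooper framework uses a hierarchical reinforcement learning policy rather than this heuristic.

```python
# Hypothetical sketch: greedy bandwidth allocation weighing spatial
# confidence against channel quality. NOT the paper's method (RACooper
# uses hierarchical RL); this only illustrates the decision structure.
import numpy as np

def greedy_allocate(confidence, channel_gain, feature_bits, total_bw):
    """Select which vehicles' features to transmit under a bandwidth budget.

    confidence   : per-vehicle spatial confidence (higher = more useful)
    channel_gain : per-vehicle channel quality (higher = cheaper to send)
    feature_bits : size of each vehicle's feature payload in bits
    total_bw     : total bit budget for this scheduling round
    """
    # Effective bandwidth cost of each payload given its channel quality.
    cost = feature_bits / channel_gain
    # Rank vehicles by expected perception gain per unit of bandwidth.
    order = np.argsort(-confidence / cost)
    selected, used = [], 0.0
    for v in order:
        if used + cost[v] <= total_bw:
            selected.append(int(v))
            used += cost[v]
    return selected

# Toy example: 4 collaborating vehicles competing for a limited budget.
conf = np.array([0.9, 0.4, 0.7, 0.2])
gain = np.array([1.0, 0.5, 2.0, 1.5])
bits = np.array([8.0, 8.0, 8.0, 8.0])
print(greedy_allocate(conf, gain, bits, total_bw=12.0))  # -> [2, 0]
```

In this sketch the vehicle with strong confidence and a good channel (index 2) is scheduled first; a learned policy such as RACooper's would instead adapt this trade-off online to mobility-induced channel dynamics.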