To Remember, To Adapt, To Preempt: A Stable Continual Test-Time Adaptation Framework for Remote Physiological Measurement in Dynamic Domain Shifts
Shuyang Chu, Jingang Shi, Xu Cheng, Haoyu Chen, Xin Liu, Jian Xu, Guoying Zhao
Published: 2025/9/30
Abstract
Remote photoplethysmography (rPPG) aims to extract non-contact physiological signals from facial videos and has shown great potential. However, existing rPPG approaches struggle to bridge the gap between source and target domains. Recent test-time adaptation (TTA) solutions typically optimize the rPPG model for incoming test videos with a self-training loss, under the unrealistic assumption that the target domain remains stationary. In practice, time-varying factors such as weather and lighting in dynamic environments cause continual domain shifts. The erroneous gradients accumulated under these shifts may corrupt the model's key parameters for physiological information, leading to catastrophic forgetting. We therefore propose a physiology-related parameter freezing strategy to retain such knowledge: it separates physiology-related from domain-related parameters by assessing the model's uncertainty on the current domain, and freezes the physiology-related parameters during adaptation to prevent catastrophic forgetting. Moreover, dynamic domain shifts with varied non-physiological characteristics may induce conflicting optimization objectives during TTA, manifesting as an over-adapted model that loses its adaptability to future domains. To counteract over-adaptation, we propose a preemptive gradient modification strategy: it preemptively adapts to future domains and uses the resulting gradients to modify the current adaptation, thereby preserving the model's adaptability. In summary, we propose a stable continual test-time adaptation (CTTA) framework for rPPG measurement, called \textbf{PhysRAP}, which \textbf{R}emembers the past, \textbf{A}dapts to the present, and \textbf{P}reempts the future. Extensive experiments show that PhysRAP achieves state-of-the-art performance, especially under continual domain shifts. The code is available at https://github.com/xjtucsy/PhysRAP.
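For illustration, here is a minimal PyTorch-style sketch of what a physiology-related parameter freezing step could look like. The abstract does not specify the uncertainty criterion, so this sketch assumes a hypothetical sensitivity score: parameters whose gradients respond weakly to a domain-uncertainty loss are treated as physiology-related and frozen. The function name `freeze_physiology_params` and the `freeze_ratio` parameter are illustrative, not from the paper.

```python
import torch

def freeze_physiology_params(model, domain_uncertainty_loss, freeze_ratio=0.5):
    """Hypothetical sketch: rank parameter tensors by their gradient
    sensitivity to a domain-uncertainty loss and freeze the least-sensitive
    fraction, which this sketch assumes encodes physiological knowledge."""
    model.zero_grad()
    domain_uncertainty_loss.backward(retain_graph=True)

    # Assumed sensitivity proxy: mean gradient magnitude w.r.t. the
    # uncertainty objective for each parameter tensor.
    scores = {name: p.grad.abs().mean().item()
              for name, p in model.named_parameters() if p.grad is not None}

    # Parameters least sensitive to domain uncertainty are treated as
    # physiology-related and excluded from subsequent adaptation updates.
    cutoff = sorted(scores.values())[int(len(scores) * freeze_ratio)]
    for name, p in model.named_parameters():
        if name in scores and scores[name] <= cutoff:
            p.requires_grad_(False)
```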
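Likewise, a hedged sketch of preemptive gradient modification, assuming a PCGrad-style projection rule (the paper's actual modification rule is not given in the abstract): gradients from a preemptive adaptation loss on an anticipated future domain are used to remove the conflicting component of the current gradient before the update. Here `loss_future` stands in for however the method obtains a future-domain objective.

```python
import torch

def preemptive_gradient_step(model, loss_current, loss_future, optimizer):
    """Hypothetical sketch: compute gradients for the current domain and a
    preemptive (anticipated future) objective, then project away the part of
    the current gradient that conflicts with the future one."""
    params = [p for p in model.parameters() if p.requires_grad]

    g_cur = torch.autograd.grad(loss_current, params, retain_graph=True)
    g_fut = torch.autograd.grad(loss_future, params, retain_graph=True)

    g_cur_flat = torch.cat([g.flatten() for g in g_cur])
    g_fut_flat = torch.cat([g.flatten() for g in g_fut])

    # If the two objectives conflict (negative inner product), remove the
    # component of the current gradient along the future gradient.
    dot = torch.dot(g_cur_flat, g_fut_flat)
    if dot < 0:
        g_cur_flat = g_cur_flat - dot / g_fut_flat.norm().pow(2) * g_fut_flat

    # Write the modified gradient back and take the adaptation step.
    offset = 0
    for p in params:
        n = p.numel()
        p.grad = g_cur_flat[offset:offset + n].view_as(p).clone()
        offset += n
    optimizer.step()
    optimizer.zero_grad()
```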