Towards Self-Supervised Foundation Models for Critical Care Time Series

Katja Naasunnguaq Jagd, Rachael DeVries, Ole Winther

Published: September 24, 2025

Abstract

Domain-specific foundation models for healthcare have expanded rapidly in recent years, yet foundation models for critical care time series remain relatively underexplored due to the limited size and availability of datasets. In this work, we introduce an early-stage pre-trained foundation model for critical care time series based on the Bi-Axial Transformer (BAT), trained on pooled electronic health record datasets. We demonstrate effective transfer learning by fine-tuning the model on a dataset distinct from the training sources for mortality prediction, where it outperforms supervised baselines, particularly for small datasets ($<5,000$). These contributions highlight the potential of self-supervised foundation models for critical care time series to support generalizable and robust clinical applications in resource-limited settings.
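The abstract describes a transfer-learning setup: a self-supervised, pre-trained time-series backbone is fine-tuned on a held-out EHR dataset for mortality prediction. The paper's code and the BAT architecture are not reproduced here; the sketch below is a minimal, hypothetical illustration of that fine-tuning pattern, with a generic Transformer encoder standing in for the pre-trained backbone and all shapes, names, and hyperparameters chosen for illustration only.

```python
# Hypothetical sketch of fine-tuning a pre-trained time-series encoder for
# in-hospital mortality prediction. A generic Transformer encoder stands in
# for the BAT backbone; in practice the encoder would be loaded from a
# self-supervised checkpoint.
import torch
import torch.nn as nn

class MortalityClassifier(nn.Module):
    def __init__(self, encoder: nn.Module, d_model: int = 64):
        super().__init__()
        self.encoder = encoder              # pre-trained backbone
        self.head = nn.Linear(d_model, 1)   # binary mortality head

    def forward(self, x):                   # x: (batch, time, d_model)
        h = self.encoder(x)                 # contextualized representations
        pooled = h.mean(dim=1)              # pool over the time axis
        return self.head(pooled).squeeze(-1)

# Stand-in "pre-trained" encoder (illustrative only).
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True),
    num_layers=2,
)
model = MortalityClassifier(encoder)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = nn.BCEWithLogitsLoss()

# Toy batch: 8 ICU stays, 48 time steps, 64 features (shapes are assumptions).
x = torch.randn(8, 48, 64)
y = torch.randint(0, 2, (8,)).float()       # mortality labels

logits = model(x)
loss = criterion(logits, y)
loss.backward()
optimizer.step()
```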
