The Good, the Bad, and the Ugly of Atomistic Learning for "Clusters-to-Bulk" Generalization

Mikołaj J. Gawkowski, Mingjia Li, Benjamin X. Shi, Venkat Kapil

Published: 2025/9/20

Abstract

Training machine learning interatomic potentials (MLIPs) on total energies of molecular clusters using differential or transfer learning is becoming a popular route to extend the accuracy of correlated wave-function theory to condensed phases. A key challenge, however, lies in validation, because benchmark observables for finite-temperature ensembles are not available at the reference level of theory. Here, we construct synthetic reference data from pretrained MLIPs and evaluate the generalizability of cluster-trained models on ice Ih, considering scenarios in which both energies and forces, or only energies, are available for training. We assess the accuracy and data efficiency of differential, single-fidelity transfer, and multi-fidelity transfer learning against ground-truth thermodynamic observables. We find that transferring accuracy from clusters to bulk requires regularization, which is best achieved through multi-fidelity transfer learning when training on both energies and forces. By contrast, training only on energies introduces artefacts: stable trajectories and low energy errors conceal large force errors, leading to inaccurate microscopic observables. More broadly, we show that accurate reproduction of microscopic structure correlates strongly with low force errors but only weakly with energy errors, whereas global properties such as energies and densities correlate with low energy errors. This underscores the need to incorporate forces during training, or to validate models carefully before production runs. Our results highlight the promise and pitfalls of cluster-trained MLIPs for condensed phases and provide guidelines for developing, and critically validating, robust and data-efficient MLIPs.
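To make the differential (Δ-)learning idea referenced in the abstract concrete, the sketch below illustrates the general scheme: a model is fitted to the difference between high-level cluster energies and a cheaper baseline, and predictions add the learned correction back onto the baseline. This is a minimal, self-contained toy, not the authors' models or data; the functions `baseline_energy` and `featurize`, the random "clusters", and the Ridge regressor are all illustrative assumptions.

```python
# Minimal sketch of differential (Delta-)learning for cluster energies.
# All names and data here are placeholders (assumptions), not the paper's method.
import numpy as np
from sklearn.linear_model import Ridge

def baseline_energy(cluster):
    # Stand-in for a pretrained MLIP / low-level energy (assumption).
    return float(np.sum(cluster))

def featurize(cluster):
    # Toy descriptor; a real workflow would use invariant atomistic features.
    return np.array([np.sum(cluster), np.sum(cluster**2), float(len(cluster))])

# Toy "molecular clusters": random coordinates of varying size.
rng = np.random.default_rng(0)
clusters = [rng.normal(size=(n, 3)) for n in (3, 4, 5, 6, 8)]

# Synthetic high-level (e.g. wave-function-quality) total energies.
E_high = np.array([baseline_energy(c) + 0.1 * len(c) for c in clusters])

# Delta-learning target: correction from baseline to high level.
X = np.stack([featurize(c) for c in clusters])
dE = E_high - np.array([baseline_energy(c) for c in clusters])
delta_model = Ridge(alpha=1e-3).fit(X, dE)

def corrected_energy(cluster):
    # Prediction = baseline + learned correction (the essence of Delta-learning).
    return baseline_energy(cluster) + delta_model.predict(featurize(cluster)[None, :])[0]

print(corrected_energy(rng.normal(size=(7, 3))))
```

In practice the correction model would be an MLIP trained on per-atom contributions with forces where available, and transfer or multi-fidelity learning would reuse the baseline's parameters rather than a separate additive model, as the abstract discusses.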