Fine-Tuning Bulk-oriented Universal Interatomic Potentials for Surfaces: Accuracy, Efficiency, and Forgetting Control
Jaekyun Hwang, Taehun Lee, Yonghyuk Lee, Su-Hyun Yoo
Published: 2025/9/30
Abstract
Accurate prediction of surface energies and stabilities is essential for materials design, yet first-principles calculations remain computationally expensive and most existing interatomic potentials are trained only on bulk systems. Here, we demonstrate that fine-tuning foundation machine learning potentials (MLPs) significantly improves both computational efficiency and predictive accuracy for surface modeling. While existing universal interatomic potentials (UIPs) have so far been trained and validated solely on bulk datasets, we extend their applicability to complex and scientifically important unary, binary, and ternary surface systems. We systematically compare training from scratch, zero-shot inference, conventional fine-tuning, and a multi-head fine-tuning approach that enhances transferability and mitigates catastrophic forgetting. Fine-tuning consistently reduces prediction errors while requiring orders of magnitude fewer training configurations, and multi-head fine-tuning delivers robust, generalizable predictions even for materials beyond the initial training domain. These findings offer practical guidance for leveraging pre-trained MLPs to accelerate surface modeling and highlight a scalable path toward data-efficient, next-generation atomic-scale simulations in computational materials science.
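To make the multi-head idea concrete, the following is a minimal sketch of the general pattern (a shared backbone with separate readout heads, the bulk head kept fixed during surface fine-tuning); it is an illustrative assumption, not the authors' actual architecture, and the layer sizes, data, and training loop are placeholders.

```python
import torch
import torch.nn as nn

class MultiHeadPotential(nn.Module):
    """Shared feature backbone with separate readout heads for bulk and surface data."""
    def __init__(self, n_features=64):
        super().__init__()
        # Stand-in for a pretrained foundation-model backbone (e.g., message-passing layers).
        self.backbone = nn.Sequential(
            nn.Linear(16, n_features), nn.SiLU(),
            nn.Linear(n_features, n_features), nn.SiLU(),
        )
        # One energy readout per data distribution; the bulk head preserves original behavior.
        self.heads = nn.ModuleDict({
            "bulk": nn.Linear(n_features, 1),
            "surface": nn.Linear(n_features, 1),
        })

    def forward(self, x, head="surface"):
        return self.heads[head](self.backbone(x))

model = MultiHeadPotential()

# Freeze the bulk head so surface fine-tuning cannot overwrite it,
# which is one simple way to limit catastrophic forgetting of bulk properties.
for p in model.heads["bulk"].parameters():
    p.requires_grad = False

opt = torch.optim.Adam((p for p in model.parameters() if p.requires_grad), lr=1e-4)
x, y = torch.randn(8, 16), torch.randn(8, 1)  # placeholder surface features / target energies
loss = nn.functional.mse_loss(model(x, head="surface"), y)
loss.backward()
opt.step()
```

In practice, the shared backbone may also be partially frozen or regularized toward its pretrained weights; the sketch only shows the head-splitting mechanism that separates bulk and surface predictions.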