Gradient-based grand canonical optimization enabled by graph neural networks with fractional atomic existence

Mads-Peter Verner Christiansen, Bjørk Hammer

Published: 2025/7/25

Abstract

Machine learning interatomic potentials have become an indispensable tool for materials science, enabling the study of larger systems and longer timescales. State-of-the-art models are generally graph neural networks that employ message passing to iteratively update atomic embeddings, which are ultimately used to predict properties. In this work we extend the message passing formalism with a continuous variable that accounts for fractional atomic existence. This allows us to calculate the gradient of the Gibbs free energy with respect to both the Cartesian coordinates of atoms and their existence. Using this, we propose a gradient-based grand canonical optimization method and demonstrate its capabilities on a Cu(110) surface oxide.
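The core idea sketched in the abstract can be illustrated with a toy model: assign each atom a continuous existence weight λ_i ∈ [0, 1] that scales its contribution to the energy, then differentiate a grand-canonical (Gibbs-like) objective G = E(x, λ) − μ Σ_i λ_i with respect to both positions and existences. The sketch below is a minimal illustration only, assuming a simple Lennard-Jones pair potential in place of the authors' graph neural network; the function names and the form of the existence weighting are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def pair_energy(r, eps=1.0, sigma=1.0):
    # Toy Lennard-Jones pair potential standing in for a learned model.
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6 ** 2 - sr6)

def gibbs_free_energy(pos, lam, mu):
    # Energy with each pair term weighted by the fractional existences
    # lam_i * lam_j, minus the chemical-potential term mu * sum(lam).
    n = len(pos)
    energy = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(pos[i] - pos[j])
            energy += lam[i] * lam[j] * pair_energy(r)
    return energy - mu * lam.sum()

def grad_existence(pos, lam, mu):
    # Analytic gradient dG/dlam_i = sum_{j != i} lam_j * V(r_ij) - mu.
    # Atoms whose gradient drives lam_i toward 0 are candidates for removal,
    # those driven toward 1 for insertion, in a grand canonical search.
    n = len(pos)
    grad = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if i != j:
                r = np.linalg.norm(pos[i] - pos[j])
                grad[i] += lam[j] * pair_energy(r)
        grad[i] -= mu
    return grad
```

A gradient-based optimizer can then step jointly in the Cartesian coordinates and in the existence variables, which is the capability the fractional-existence formulation enables.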
