Transformer-Based Neural Networks Backflow for Strongly Correlated Electronic Structure
Huan Ma, Bowen Kan, Honghui Shang, Jinlong Yang
Published: 2025/9/30
Abstract
Solving the electronic Schr\"odinger equation for strongly correlated systems remains one of the grand challenges in quantum chemistry. Here we demonstrate that Transformer architectures can be adapted to capture the complex grammar of electronic correlations through neural network backflow. In this approach, electronic configurations are processed as token sequences, where attention layers learn non-local orbital correlations and token-specific neural networks map these contextual representations into backflowed orbitals. Application to strongly correlated iron-sulfur clusters validates our approach: for $\left[\mathrm{Fe}_2 \mathrm{S}_2\left(\mathrm{SCH}_3\right)_4\right]^{2-}$ ([2Fe-2S]) (30e,20o), we obtain a ground-state energy within chemical accuracy of DMRG while predicting magnetic exchange coupling constants closer to experimental values than all compared methods, including DMRG, CCSD(T), and recent neural network approaches. For $\left[\mathrm{Fe}_4 \mathrm{S}_4\left(\mathrm{SCH}_3\right)_4\right]^{2-}$ ([4Fe-4S]) (54e,36o), we match DMRG energies and accurately reproduce detailed spin-spin correlation patterns between all Fe centers. The approach scales favorably to large active spaces inaccessible to exact methods, with distributed VMC optimization enabling stable convergence. These results establish Transformer-based backflow as a powerful variational ansatz for strongly correlated electronic structure, achieving superior magnetic property predictions while maintaining chemical accuracy in total energies.
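The pipeline the abstract describes — occupation strings as token sequences, self-attention producing contextual representations, token-specific heads emitting backflowed orbitals whose Slater determinant gives the amplitude — can be sketched in miniature. The sketch below is a hedged toy illustration, not the authors' implementation: all weight names (`W_embed`, `Wq`, `Wk`, `Wv`, `W_out`), the single-head attention, the toy system sizes, and the random (untrained) parameters are our own assumptions for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (illustrative only): orbitals, electrons, embedding dimension
n_orb, n_elec, d = 6, 3, 8

# Token embedding: occupation bit (0/1) lookup plus a per-orbital position vector
W_embed = rng.normal(size=(2, d))
pos = rng.normal(size=(n_orb, d))

# Single-head self-attention weights (hypothetical parameter names)
Wq, Wk, Wv = (rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(3))

# Token-specific heads: each orbital token gets its own map to backflowed-orbital values
W_out = rng.normal(size=(n_orb, d, n_elec)) / np.sqrt(d)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def backflow_amplitude(occ):
    """Amplitude <occ|psi> for an occupation bit-string `occ` (length n_orb, 0/1)."""
    h = W_embed[occ] + pos                     # (n_orb, d) token representations
    q, k, v = h @ Wq, h @ Wk, h @ Wv
    attn = softmax(q @ k.T / np.sqrt(d))       # configuration-dependent attention weights
    ctx = attn @ v                             # contextualized (non-local) token features
    phi = np.einsum('id,ide->ie', ctx, W_out)  # (n_orb, n_elec) backflowed orbitals
    occ_idx = np.flatnonzero(occ)              # occupied orbitals select determinant rows
    return np.linalg.det(phi[occ_idx])         # Slater determinant of backflowed orbitals

occ = np.array([1, 1, 0, 1, 0, 0])
print(backflow_amplitude(occ))
```

Because the attention pattern depends on the full occupation string, the orbitals entering the determinant change with the configuration — this configuration dependence is what distinguishes a backflow ansatz from a fixed Slater determinant. In a VMC calculation these weights would be trained by energy minimization over sampled configurations.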