Diabatic quantum annealing for training energy-based generative models

Gilhan Kim, Ju-Yeon Ghym, Daniel K. Park

Published: 2025/9/11

Abstract

Energy-based generative models, such as restricted Boltzmann machines (RBMs), require unbiased Boltzmann samples for effective training. Classical Markov chain Monte Carlo methods, however, converge slowly and yield correlated samples, making large-scale training difficult. We address this bottleneck by applying the analytic relation between annealing schedules and effective inverse temperature in diabatic quantum annealing. By implementing this prescription on a quantum annealer, we obtain temperature-controlled Boltzmann samples that enable RBM training with faster convergence and lower validation error than classical sampling. We further identify a systematic temperature misalignment intrinsic to analog quantum computers and propose an analytical rescaling method that mitigates this hardware noise, thereby enhancing the practicality of quantum annealers as Boltzmann samplers. In our method, the model's connectivity is set directly by the qubit connectivity, transforming the computational complexity inherent in classical sampling into a requirement on quantum hardware. This shift allows the approach to extend naturally from RBMs to fully connected Boltzmann machines, opening opportunities inaccessible to classical training methods.
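The rescaling idea described above rests on a simple identity: for an energy function that is linear in its parameters, sampling a model θ at inverse temperature β is equivalent to sampling the rescaled model θ/β at β = 1. A minimal sketch of this on a toy RBM, using exact enumeration (the `beta_eff` value and the parameter shapes here are illustrative assumptions, not values from the paper):

```python
import numpy as np
from itertools import product

def boltzmann_probs(W, b, c, beta):
    """Exact Boltzmann distribution over all (v, h) states of a tiny RBM
    with energy E(v, h) = -(v @ W @ h + b @ v + c @ h)."""
    nv, nh = W.shape
    energies = []
    for s in product([0, 1], repeat=nv + nh):
        v, h = np.array(s[:nv]), np.array(s[nv:])
        energies.append(-(v @ W @ h + b @ v + c @ h))
    weights = np.exp(-beta * np.array(energies))
    return weights / weights.sum()

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 2))
b = rng.normal(size=3)
c = rng.normal(size=2)

# Hypothetical effective inverse temperature of the analog sampler.
beta_eff = 0.37

# Dividing the parameters by beta_eff makes a sampler operating at
# beta_eff reproduce the target beta = 1 distribution exactly.
p_target = boltzmann_probs(W, b, c, beta=1.0)
p_rescaled = boltzmann_probs(W / beta_eff, b / beta_eff, c / beta_eff, beta=beta_eff)
assert np.allclose(p_target, p_rescaled)
```

This is only the classical identity underlying such a correction; the paper's contribution is the analytic schedule-to-temperature relation and the hardware-noise mitigation built on top of it.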