Lie-algebraic classical simulations for quantum computing

Matthew L. Goh, Martin Larocca, Lukasz Cincio, M. Cerezo, Frédéric Sauvage

Published: 2023-08-02

Abstract

The classical simulation of quantum dynamics plays an important role in our understanding of quantum complexity, and in the development of quantum technologies. Efficient techniques such as those based on the Gottesman-Knill theorem for Clifford circuits, tensor networks for low entanglement-generating circuits, or Wick's theorem for fermionic Gaussian states, have become central tools in quantum computing. In this work, we contribute to this body of knowledge by presenting a framework for classical simulations, dubbed "$\mathfrak{g}$-sim", which is based on the underlying Lie algebraic structure of the dynamical process. When the dimension of the algebra grows at most polynomially in the system size, there exist observables for which the simulation is efficient. Indeed, we show that $\mathfrak{g}$-sim enables new regimes for classical simulations, can accommodate certain forms of noise in the evolution, and can be used to tackle several paradigmatic variational and non-variational quantum computing tasks. For the former, we perform Lie-algebraic simulations to train and optimize parametrized quantum circuits (thus effectively showing that some variational models can be dequantized), design enhanced parameter initialization strategies, solve quantum circuit synthesis tasks, and train a quantum-phase classifier. For the latter, we report large-scale noiseless and noisy simulations on benchmark problems. By comparing the limitations of $\mathfrak{g}$-sim and certain Wick's theorem-based simulations, we find that the two methods become inefficient for different types of states or observables, hinting at the existence of distinct, non-equivalent resources for classical simulation.
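To make the Lie-algebraic idea concrete, the following is a minimal sketch (not the authors' implementation) of simulation in the adjoint representation for the smallest non-trivial case, a single qubit with algebra $\mathfrak{su}(2)$ spanned by the Pauli operators. The Hamiltonian coefficients `mu` and the evolution time `t` are arbitrary illustrative values. For a Hamiltonian $H$ in the algebra, Heisenberg evolution acts linearly on the basis elements, $i[H, g_j] = \sum_k M_{kj}\, g_k$, so the vector of expectation values $\langle g_j \rangle$ can be propagated with a matrix exponential whose size is the algebra dimension, not the Hilbert-space dimension.

```python
import numpy as np
from scipy.linalg import expm

# Pauli basis of su(2); for one qubit the algebra is 3-dimensional.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
basis = [X, Y, Z]

# Illustrative Hamiltonian lying in the algebra: H = sum_j mu_j g_j.
mu = [0.7, -0.3, 0.5]
H = sum(m * g for m, g in zip(mu, basis))

# Adjoint representation: i[H, g_j] = sum_k M[k, j] g_k, with coefficients
# extracted via the Hilbert-Schmidt inner product Tr(g_k A) / 2.
dim = len(basis)
M = np.zeros((dim, dim))
for j, gj in enumerate(basis):
    comm = 1j * (H @ gj - gj @ H)  # Heisenberg generator dg_j/dt = i[H, g_j]
    for k, gk in enumerate(basis):
        M[k, j] = np.real(np.trace(gk @ comm)) / 2

# Initial state |0>, expectation-value vector over the algebra basis.
psi0 = np.array([1, 0], dtype=complex)
v0 = np.array([np.real(psi0.conj() @ g @ psi0) for g in basis])

# Lie-algebraic propagation: v(t) = exp(t M^T) v(0).
t = 1.3
v_t = expm(t * M.T) @ v0

# Cross-check against exact state-vector evolution.
psi_t = expm(-1j * t * H) @ psi0
v_exact = np.array([np.real(psi_t.conj() @ g @ psi_t) for g in basis])
print(np.allclose(v_t, v_exact))
```

For one qubit both routes are trivial, but the point of the construction is the scaling: the adjoint matrix `M` has size $\dim \mathfrak{g} \times \dim \mathfrak{g}$, so when the algebra dimension grows only polynomially in the number of qubits, expectation values of algebra elements remain efficiently computable even though the Hilbert space is exponentially large.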
