Conditional Independence of 1D Gibbs States with Applications to Efficient Learning

Álvaro M. Alhambra, Ángela Capel, Paul Gondolf, Alberto Ruiz-de-Alarcón, Samuel O. Scalet

Published: 2024/2/28

Abstract

We show that spin chains in thermal equilibrium have a correlation structure in which individual regions are strongly correlated at most with their near vicinity. We quantify this with alternative notions of the conditional mutual information, defined through the so-called Belavkin-Staszewski relative entropy. We prove that these measures decay superexponentially at every positive temperature, under the assumption that the spin chain Hamiltonian is translation-invariant. Using a recovery map associated with these measures, we sequentially construct tensor network approximations in terms of marginals of small (sublogarithmic) size. As a main application, we show that classical representations of the states can be learned efficiently from local measurements with a polynomial sample complexity. We also prove an approximate factorization condition for the purity of the entire Gibbs state, which implies that it can be efficiently estimated to a small multiplicative error from a small number of local measurements. The results extend from strictly local to exponentially decaying interactions above a threshold temperature, albeit only with exponential decay rates. As a technical step of independent interest, we prove an upper bound on the decay of the Belavkin-Staszewski relative entropy under the application of a conditional expectation.
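For reference, the Belavkin-Staszewski relative entropy invoked in the abstract is the standard quantity defined, for states $\rho$ and $\sigma$ with $\operatorname{supp}(\rho) \subseteq \operatorname{supp}(\sigma)$, as

$$D_{\mathrm{BS}}(\rho \| \sigma) = \operatorname{Tr}\!\left[\rho \log\!\left(\rho^{1/2} \sigma^{-1} \rho^{1/2}\right)\right],$$

which upper-bounds the usual Umegaki relative entropy $D(\rho \| \sigma) = \operatorname{Tr}[\rho(\log \rho - \log \sigma)]$, with equality when $\rho$ and $\sigma$ commute. The "alternative notions of the conditional mutual information" mentioned above are then built from this quantity in place of the Umegaki entropy; the precise combinations used are specified in the paper itself.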
