Smooth Trade-off for Tensor PCA via Sharp Bounds for Kikuchi Matrices
Pravesh K. Kothari, Jeff Xu
Published: October 3, 2025
Abstract
In this work, we revisit algorithms for Tensor PCA: given an order-$r$ tensor of the form $T = G+\lambda \cdot v^{\otimes r}$, where $G$ is a random symmetric Gaussian tensor with unit-variance entries and $v$ is an unknown Boolean vector in $\{\pm 1\}^n$, what is the minimum $\lambda$ at which one can distinguish $T$ from a random Gaussian tensor and, more generally, recover $v$? As a result of a long line of work, we know that for any $\ell \in \mathbb{N}$, there is an $n^{O(\ell)}$-time algorithm that succeeds when the signal strength satisfies $\lambda \gtrsim \sqrt{\log n} \cdot n^{-r/4} \cdot \ell^{1/2-r/4}$. The question of whether the logarithmic factor is necessary turns out to be crucial to understanding whether larger polynomial running time allows recovering the signal at a lower signal strength. Such a smooth trade-off is necessary for tensor PCA to be a candidate problem for quantum speedups [SOKB25]. It was first conjectured by [WAM19] and, more recently, with an eye on smooth trade-offs, reiterated in a blog post of Bandeira. In this work, we resolve these conjectures and show that spectral algorithms based on the Kikuchi hierarchy [WAM19] succeed whenever $\lambda \geq \Theta_r(1) \cdot n^{-r/4} \cdot \ell^{1/2-r/4}$, where $\Theta_r(1)$ hides a constant depending only on $r$, independent of $n$ and $\ell$. A sharp bound such as this was previously known only for $\ell \leq 3r/4$ via non-asymptotic techniques in random matrix theory inspired by free probability.
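As an illustrative aid (not taken from the paper), the sketch below samples the planted model $T = G + \lambda \cdot v^{\otimes r}$ and evaluates the claimed level-$\ell$ threshold $\lambda \geq C \cdot n^{-r/4} \cdot \ell^{1/2-r/4}$. The constant `C`, the symmetrization convention for the noise tensor, and the function names are assumptions made for the example, not the authors' construction.

```python
# Hedged sketch: sample the planted tensor PCA model and compute the
# Kikuchi-level-ell signal-strength threshold. The constant C stands in
# for the unspecified Theta_r(1); all names here are illustrative.
import itertools
import numpy as np


def planted_tensor(n: int, r: int, lam: float, rng: np.random.Generator):
    """Return (T, v) with T = G + lam * v^{otimes r}, where G is a symmetric
    Gaussian tensor with (approximately) unit-variance entries and v is a
    uniformly random sign vector in {+-1}^n."""
    v = rng.choice([-1.0, 1.0], size=n)
    # Symmetrize an i.i.d. N(0,1) tensor; dividing by sqrt(r!) restores unit
    # variance for entries with distinct indices (diagonal entries differ).
    G = rng.standard_normal(size=(n,) * r)
    perms = list(itertools.permutations(range(r)))
    G_sym = sum(np.transpose(G, axes=p) for p in perms) / np.sqrt(len(perms))
    # Rank-one signal v^{otimes r}, built by repeated outer products.
    signal = v
    for _ in range(r - 1):
        signal = np.multiply.outer(signal, v)
    return G_sym + lam * signal, v


def kikuchi_threshold(n: int, r: int, ell: int, C: float = 1.0) -> float:
    """Signal strength C * n^{-r/4} * ell^{1/2 - r/4} at which level-ell
    Kikuchi spectral algorithms are expected to succeed (C is assumed)."""
    return C * n ** (-r / 4) * ell ** (0.5 - r / 4)


if __name__ == "__main__":
    n, r, ell = 30, 4, 4
    lam = kikuchi_threshold(n, r, ell, C=2.0)
    T, v = planted_tensor(n, r, lam, np.random.default_rng(0))
    print(f"n={n}, r={r}, ell={ell}: threshold lambda = {lam:.4f}")
```

Note how the threshold formula decreases as $\ell$ grows (for $r > 2$), which is the smooth time-versus-signal-strength trade-off the abstract refers to: larger $\ell$ means $n^{O(\ell)}$ running time but tolerance of a weaker signal.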