Faster Linear Algebra Algorithms with Structured Random Matrices
Chris Camaño, Ethan N. Epperly, Raphael A. Meyer, Joel A. Tropp
Published: August 28, 2025
Abstract
To achieve the greatest possible speed, practitioners regularly implement randomized algorithms for low-rank approximation and least-squares regression with structured dimension reduction maps. Despite significant research effort, basic questions remain about the design and analysis of randomized linear algebra algorithms that employ structured random matrices. This paper develops a new perspective on structured dimension reduction, based on the oblivious subspace injection (OSI) property. The OSI property is a relatively weak assumption on a random matrix that holds when the matrix preserves the length of vectors on average and, with high probability, does not annihilate any vector in a low-dimensional subspace. With the OSI abstraction, the analysis of a randomized linear algebra algorithm factors into two parts: (i) proving that the algorithm works when implemented with an OSI; and (ii) proving that a given random matrix model has the OSI property. This paper develops both parts of the program. First, it analyzes standard randomized algorithms for low-rank approximation and least-squares regression under the OSI assumption. Second, it identifies many examples of OSIs, including random sparse matrices, randomized trigonometric transforms, and random matrices with tensor product structure. These theoretical results imply faster, near-optimal runtimes for several fundamental linear algebra tasks. The paper also provides guidance on implementation, along with empirical evidence that structured random matrices offer exemplary performance for a range of synthetic problems and contemporary scientific applications.
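To make the ideas concrete, here is a minimal sketch-and-solve illustration of least-squares regression with one of the structured random matrices the abstract mentions: a sparse sign embedding, in which each column carries only a few random ±1 entries. This is an illustrative sketch, not the paper's implementation; the sketch dimension `d` and sparsity `zeta` below are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_sign_sketch(d, n, zeta=8, rng=rng):
    """Sparse sign embedding: each column has zeta random +-1 entries
    in uniformly random rows, scaled by 1/sqrt(zeta)."""
    S = np.zeros((d, n))
    for j in range(n):
        rows = rng.choice(d, size=zeta, replace=False)
        S[rows, j] = rng.choice([-1.0, 1.0], size=zeta) / np.sqrt(zeta)
    return S

# Overdetermined least-squares problem: min_x ||A x - b||
n, p = 2000, 50
A = rng.standard_normal((n, p))
b = A @ rng.standard_normal(p) + 0.01 * rng.standard_normal(n)

# Sketch-and-solve: replace the tall problem with the much smaller
# problem min_x ||S A x - S b||, where S is d x n with d << n.
d = 400  # sketch dimension, a small multiple of p
S = sparse_sign_sketch(d, n)
x_sk = np.linalg.lstsq(S @ A, S @ b, rcond=None)[0]
x_ex = np.linalg.lstsq(A, b, rcond=None)[0]

res_sk = np.linalg.norm(A @ x_sk - b)
res_ex = np.linalg.norm(A @ x_ex - b)
# With high probability the sketched residual is within a small
# constant factor of the optimal residual.
print(res_sk / res_ex)
```

Because each column of `S` is sparse, the product `S @ A` can be formed in time proportional to the number of nonzeros rather than the dense cost, which is the source of the speedups the abstract describes; a production implementation would store `S` in a sparse format rather than as a dense array.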