Some Applications and Limitations of Convex Optimization Hierarchies for Discrete and Continuous Optimization Problems

Mrinalkanti Ghosh

Published: 2025/8/29

Abstract

This thesis explores algorithmic applications and limitations of convex relaxation hierarchies for approximating discrete and continuous optimization problems.

- We show a dichotomy in the approximability of constraint satisfaction problems (CSPs) by linear programming (LP) relaxations: for every CSP, the approximation obtained by the basic LP relaxation is no weaker than the approximation obtained by relaxations at super-constant levels of the Sherali-Adams hierarchy on instances of size $n$ (a sketch of the basic LP appears below).

- For the problem of approximating the absolute maximum of an $n$-variate degree-$d$ homogeneous polynomial $f$ with real coefficients over the unit sphere (formulated below), we analyze the optimum value of the level-$t$ sum-of-squares (SoS) SDP relaxation of the problem. Our results give a trade-off between the approximation ratio and the running time, and can take advantage of additional structure in the polynomial, such as non-negativity or sparsity of its coefficients.

- We study the problem of approximating the $p \to q$ norm of a matrix $A$ (defined below), and prove the first NP-hardness result for approximating norms in the hypercontractive case $1 < p < q < \infty$. We also prove almost tight algorithmic results for the case $p \geq q$ with $2 \in [q, p]$, where constant-factor approximations of the matrix norm are possible.

A common theme of these results is their connection to geometry. For the discrete optimization problem of CSPs, geometry is a crucial tool in our lower bound proof. For the problem of polynomial optimization, we show that SDPs capture and extend earlier algorithms based on diameter estimation for convex bodies. For the matrix (operator) norm problem, the definition itself is geometric in nature, and embedding theorems play a crucial role in our proofs.
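For orientation, the basic LP referenced in the first item can be sketched as follows; the notation here is illustrative rather than the thesis's own. For an instance with weighted constraints, the relaxation maintains a local distribution $\mu_C$ over assignments to the variables of each constraint $C$, together with a distribution $\mu_v$ for each variable $v$, and requires the single-variable marginals to agree:

\[
\text{maximize} \;\; \sum_{C} w_C \Pr_{\alpha \sim \mu_C}\!\big[\alpha \text{ satisfies } C\big]
\qquad \text{subject to} \;\; \mu_C\big|_{\{v\}} = \mu_v \;\; \text{for every } v \text{ in the scope of } C .
\]

The level-$t$ Sherali-Adams hierarchy strengthens this by maintaining mutually consistent local distributions over all sets of at most $t$ variables, at the cost of an LP of size $n^{O(t)}$.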
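For the second item, the quantity being approximated is the maximum absolute value of $f$ over the unit sphere; in a standard (illustrative) notation,

\[
\mathrm{OPT}(f) \;:=\; \max_{x \in \mathbb{R}^n, \, \|x\|_2 = 1} |f(x)| .
\]

The level-$t$ SoS relaxation replaces the maximization over points $x$ with a maximization of $\tilde{\mathbb{E}}[f(x)]$ over degree-$2t$ pseudo-expectation operators $\tilde{\mathbb{E}}$ consistent with the constraint $\|x\|_2^2 = 1$; this is an SDP of size $n^{O(t)}$, which is the source of the trade-off between approximation ratio and running time.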
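Finally, the matrix norm in the third item is the standard operator norm from $\ell_p$ to $\ell_q$:

\[
\|A\|_{p \to q} \;:=\; \max_{x \neq 0} \frac{\|Ax\|_q}{\|x\|_p} .
\]

The case $p = q = 2$ recovers the spectral norm; the regime $p < q$ is called hypercontractive because an upper bound on $\|A\|_{p \to q}$ is precisely a hypercontractive inequality for the operator $A$.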