An SDP Relaxation for the Sparse Integer Least Squares Problem (2203.02607v3)
Abstract: In this paper, we study the \emph{sparse integer least squares problem} (SILS), an NP-hard variant of the least squares problem in which the variables are sparse $\{0, \pm 1\}$-vectors. We propose an $\ell_1$-based SDP relaxation and a randomized algorithm for SILS that computes feasible solutions with high probability, achieving an asymptotic approximation ratio of $1/T^2$ whenever the sparsity parameter $\sigma$ satisfies $\sigma \ll T$. Our algorithm scales to large instances, delivering high-quality approximate solutions for dimensions up to $d = 10{,}000$. The proposed randomized algorithm applies more broadly to binary quadratic programs with a cardinality constraint, even for non-convex objectives. For fixed sparsity, we provide sufficient conditions under which our SDP relaxation solves SILS, meaning that any optimal solution to the SDP relaxation yields an optimal solution to SILS. The class of data inputs for which the SDP relaxation provably solves SILS is broad enough to cover many real-world applications, such as privacy-preserving identification and multiuser detection. We validate these conditions in two application-specific cases: the \emph{feature extraction problem}, where our relaxation solves the problem for sub-Gaussian data under weak covariance conditions, and the \emph{integer sparse recovery problem}, where our relaxation solves the problem in both high- and low-coherence settings under suitable conditions.
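To make the setup concrete, the sketch below illustrates a generic Shor-type SDP relaxation with $\ell_1$-style constraints for the SILS formulation $\min \|y - Bx\|^2$ over $x \in \{0, \pm 1\}^n$ with $\|x\|_0 = \sigma$, followed by a simple randomized rounding heuristic. This is only an illustrative sketch under those assumptions: the function names (`sils_sdp`, `round_solution`), the particular constraints, and the rounding scheme are placeholders and are not the paper's exact relaxation or algorithm.

```python
import numpy as np
import cvxpy as cp

def sils_sdp(B, y, sigma):
    """Illustrative Shor-type SDP relaxation (not the paper's exact program) of
       min ||y - B x||^2  s.t.  x in {0, +1, -1}^n  and  ||x||_0 = sigma.
    Lift z = [x; 1] and relax Z = z z^T to a PSD matrix, adding l1-type
    constraints implied by x_i in {0, +/-1} and the cardinality requirement."""
    n = B.shape[1]
    Z = cp.Variable((n + 1, n + 1), PSD=True)   # Z ~ [x; 1][x; 1]^T
    X, x = Z[:n, :n], Z[:n, n]
    # ||y - Bx||^2 = y'y - 2 y'Bx + x'B'Bx, with x x^T replaced by X after lifting
    objective = y @ y - 2 * (B.T @ y) @ x + cp.trace(B.T @ B @ X)
    constraints = [
        Z[n, n] == 1,
        cp.diag(X) <= 1,                  # x_i^2 <= 1
        cp.sum(cp.diag(X)) == sigma,      # sum_i x_i^2 = ||x||_0 = sigma
        cp.sum(cp.abs(X)) <= sigma ** 2,  # sum_{ij} |x_i x_j| = ||x||_1^2 = sigma^2
        cp.norm(x, 1) <= sigma,           # ||x||_1 = sigma for any feasible x
    ]
    cp.Problem(cp.Minimize(objective), constraints).solve(solver=cp.SCS)
    return Z.value

def round_solution(Z, B, y, sigma, trials=100, rng=np.random.default_rng(0)):
    """Simple randomized rounding heuristic (again illustrative): sample Gaussian
    vectors with covariance taken from the relaxation, keep the sigma entries of
    largest magnitude, take their signs, and return the best candidate found."""
    n = B.shape[1]
    X = Z[:n, :n]
    best_x, best_val = None, np.inf
    for _ in range(trials):
        g = rng.multivariate_normal(np.zeros(n), X + 1e-8 * np.eye(n))
        support = np.argsort(-np.abs(g))[:sigma]
        x = np.zeros(n)
        x[support] = np.sign(g[support])
        val = np.sum((y - B @ x) ** 2)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val
```

A small usage example under the same assumptions: generate a random $B \in \mathbb{R}^{m \times n}$ and a planted $\sigma$-sparse $\pm 1$ vector, set $y = Bx + \text{noise}$, call `sils_sdp(B, y, sigma)` and then `round_solution(...)` to obtain a feasible sparse $\{0, \pm 1\}$-vector whose objective value can be compared against the SDP lower bound.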