
An SDP Relaxation for the Sparse Integer Least Squares Problem (2203.02607v3)

Published 4 Mar 2022 in math.OC and cs.DM

Abstract: In this paper, we study the \emph{sparse integer least squares problem} (SILS), an NP-hard variant of least squares in which the solution is a sparse $\{0, \pm 1\}$-vector. We propose an $\ell_1$-based SDP relaxation and a randomized algorithm for SILS, which computes feasible solutions with high probability and achieves an asymptotic approximation ratio of $1/T^2$, provided the sparsity constant satisfies $\sigma \ll T$. Our algorithm handles large-scale problems, delivering high-quality approximate solutions for dimensions up to $d = 10{,}000$. The proposed randomized algorithm applies broadly to binary quadratic programs with a cardinality constraint, even for non-convex objectives. For fixed sparsity, we provide sufficient conditions under which our SDP relaxation solves SILS, meaning that any optimal solution to the SDP relaxation yields an optimal solution to SILS. The class of data inputs for which the SDP relaxation solves SILS is broad enough to cover many real-world applications, such as privacy-preserving identification and multiuser detection. We validate these conditions in two application-specific cases: the \emph{feature extraction problem}, where our relaxation solves the problem for sub-Gaussian data under weak covariance conditions, and the \emph{integer sparse recovery problem}, where our relaxation solves the problem in both high- and low-coherence settings under certain conditions.
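To make the problem concrete, the sketch below brute-forces a tiny SILS instance. It assumes the standard SILS formulation suggested by the abstract: minimize $\|Ax - b\|_2^2$ over $x \in \{0, \pm 1\}^d$ with at most $\sigma$ nonzero entries. This exhaustive search is exponential in $d$ and is only an illustration of the NP-hard problem the paper's SDP relaxation approximates; it is not the paper's algorithm.

```python
from itertools import combinations, product

def sils_bruteforce(A, b, sigma):
    """Exhaustively solve min ||A x - b||^2 over x in {0, +1, -1}^d
    with at most `sigma` nonzero entries. Tiny dimensions only:
    the search space has sum_k C(d, k) * 2^k points."""
    m, d = len(A), len(A[0])

    def residual_sq(x):
        # Squared Euclidean norm of A x - b.
        return sum((sum(A[i][j] * x[j] for j in range(d)) - b[i]) ** 2
                   for i in range(m))

    zero = [0] * d
    best_x, best_val = zero, residual_sq(zero)
    for k in range(1, sigma + 1):            # number of nonzeros
        for support in combinations(range(d), k):
            for signs in product((1, -1), repeat=k):
                x = [0] * d
                for j, s in zip(support, signs):
                    x[j] = s
                val = residual_sq(x)
                if val < best_val:
                    best_x, best_val = x, val
    return best_x, best_val
```

For example, with $A = I_3$, $b = (1, -1, 0)$, and $\sigma = 2$, the optimum is $x = (1, -1, 0)$ with zero residual. The SDP relaxation in the paper replaces this combinatorial search with a convex program whose optimal solution, under the stated data conditions, recovers the same optimizer.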
