
$\ell_p$-Spread and Restricted Isometry Properties of Sparse Random Matrices

(2108.13578)
Published Aug 31, 2021 in cs.CC, math.FA, and math.PR

Abstract

Random subspaces $X$ of $\mathbb{R}^n$ of dimension proportional to $n$ are, with high probability, well-spread with respect to the $\ell_2$-norm. Namely, every nonzero $x \in X$ is "robustly non-sparse" in the following sense: $x$ is $\varepsilon \|x\|_2$-far in $\ell_2$-distance from all $\delta n$-sparse vectors, for positive constants $\varepsilon, \delta$ bounded away from $0$. This "$\ell_2$-spread" property is the natural counterpart, for subspaces over the reals, of the minimum distance of linear codes over finite fields, and corresponds to $X$ being a Euclidean section of the $\ell_1$ unit ball. Explicit $\ell_2$-spread subspaces of dimension $\Omega(n)$, however, are unknown, and the best known constructions (which achieve weaker spread properties) are analogs of low density parity check (LDPC) codes over the reals, i.e., they are kernels of sparse matrices. We study the spread properties of the kernels of sparse random matrices. Rather surprisingly, we prove that with high probability such subspaces contain vectors $x$ that are $o(1)\cdot \|x\|_2$-close to $o(n)$-sparse with respect to the $\ell_2$-norm, and in particular are not $\ell_2$-spread. On the other hand, for $p < 2$ we prove that such subspaces are $\ell_p$-spread with high probability. Moreover, we show that a random sparse matrix has the stronger restricted isometry property (RIP) with respect to the $\ell_p$ norm, and this follows solely from the unique expansion of a random biregular graph, yielding a somewhat unexpected generalization of a similar result for the $\ell_1$ norm [BGI+08]. Instantiating this with explicit expanders, we obtain the first explicit constructions of $\ell_p$-RIP matrices for $1 \leq p < p_0$, where $1 < p_0 < 2$ is an absolute constant.
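For readers unfamiliar with the terminology, the two central properties in the abstract can be sketched as follows. This is a hedged sketch using the formulations most common in the literature; the paper's exact parameters and normalizations may differ.

% Sketch of the two properties referenced in the abstract (standard
% formulations; the paper's exact constants and normalizations may differ).

% A subspace X of R^n is (delta, epsilon) l_p-spread if every nonzero
% vector in X stays far from all (delta n)-sparse vectors, relative to
% its own l_p norm:
\[
  \min_{\|y\|_0 \le \delta n} \|x - y\|_p \;\ge\; \varepsilon\, \|x\|_p
  \qquad \text{for every nonzero } x \in X .
\]

% A matrix A satisfies the (k, epsilon) restricted isometry property in
% l_p if it nearly preserves the l_p norm of every k-sparse vector:
\[
  (1 - \epsilon)\, \|x\|_p \;\le\; \|A x\|_p \;\le\; (1 + \epsilon)\, \|x\|_p
  \qquad \text{whenever } \|x\|_0 \le k .
\]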
