Emergent Mind

Preserving Injectivity under Subgaussian Mappings and Its Application to Compressed Sensing

(arXiv:1710.09972)

Published Oct 27, 2017 in cs.IT, math.FA, and math.IT

Abstract

Compressed sensing has become a major tool in high-dimensional analysis, built on the realization that vectors can be recovered from very few linear measurements provided they lie in a low-dimensional structure, typically the set of vectors that are zero in most coordinates with respect to a basis. In many applications, however, one instead wants to recover vectors that are sparse with respect to a dictionary rather than a basis. That is, we assume the vectors are linear combinations of at most $s$ columns of a $d \times n$ matrix $\mathbf{D}$, where $s$ is very small relative to $n$ and the columns of $\mathbf{D}$ form a (typically overcomplete) spanning set. In this direction, we show that if a matrix $\mathbf{D}$ stays bounded away from zero in norm on a set $S$, and a map ${\boldsymbol \Phi}$ with i.i.d. subgaussian rows has a number of measurements at least proportional to the square of $w(\mathbf{D}S)$, the Gaussian width of the set $\mathbf{D}S$, then with high probability the composition ${\boldsymbol \Phi} \mathbf{D}$ also stays bounded away from zero on $S$. As a specific application, we show that the null space property of order $s$ is preserved under such subgaussian maps with high probability. Consequently, we obtain stable recovery guarantees for dictionary-sparse signals via the $\ell_1$-synthesis method with only $O(s\log(n/s))$ random measurements and a minimal condition on $\mathbf{D}$, complementing the compressed sensing literature.
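A minimal numerical sketch of the $\ell_1$-synthesis method the abstract refers to: recover a dictionary-sparse signal $x = \mathbf{D}z$ from measurements $y = {\boldsymbol \Phi}x$ by solving $\min \|z\|_1$ subject to ${\boldsymbol \Phi}\mathbf{D}z = y$. The dimensions, the Gaussian choice of ${\boldsymbol \Phi}$, and the random unit-norm dictionary are illustrative assumptions, not the paper's construction; the $\ell_1$ problem is solved via the standard linear-programming split $z = u - v$ with $u, v \ge 0$.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions): d x n overcomplete dictionary,
# s-sparse coefficients, m random measurements with m on the order
# of s * log(n / s).
d, n, s, m = 30, 60, 3, 20

# Random overcomplete dictionary with unit-norm columns (hypothetical example).
D = rng.standard_normal((d, n))
D /= np.linalg.norm(D, axis=0)

# s-sparse coefficient vector z and the dictionary-sparse signal x = D z.
z = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
z[support] = rng.standard_normal(s)
x = D @ z

# Subgaussian (here: Gaussian) measurement map Phi and measurements y = Phi x.
Phi = rng.standard_normal((m, d)) / np.sqrt(m)
y = Phi @ x

# ell_1-synthesis: minimize ||z||_1 subject to (Phi D) z = y.
# LP reformulation: z = u - v with u, v >= 0, objective sum(u) + sum(v).
A = Phi @ D
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
z_hat = res.x[:n] - res.x[n:]
x_hat = D @ z_hat  # synthesized estimate of the signal

print("LP solved:", res.status == 0)
print("measurement residual:", np.linalg.norm(A @ z_hat - y))
print("signal error:", np.linalg.norm(x_hat - x))
```

With a subgaussian ${\boldsymbol \Phi}$ of this size, the minimizer typically recovers the dictionary-sparse signal, matching the flavor of the $O(s\log(n/s))$ guarantee; the equality constraints ensure the reconstruction is always measurement-consistent.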
