
On the Scaling Law for Compressive Sensing and its Applications

(1010.2236)
Published Oct 11, 2010 in cs.IT and math.IT

Abstract

$\ell_1$ minimization can be used to recover sufficiently sparse unknown signals from compressed linear measurements. In fact, exact thresholds on the sparsity (the size of the support set) below which a sparse signal can be recovered from i.i.d. Gaussian measurements with high probability have been computed and are referred to as "weak thresholds" \cite{D}. It is also known that there is a tradeoff between the sparsity and the stability of $\ell_1$ minimization recovery. In this paper, we give a \emph{closed-form} characterization of this tradeoff, which we call the scaling law for compressive sensing recovery stability. In a nutshell, we show that as the sparsity backs off by a factor $\varpi$ ($0<\varpi<1$) from the weak threshold of $\ell_1$ recovery, the parameter for the recovery stability scales as $\frac{1}{\sqrt{1-\varpi}}$. Our result is based on a careful analysis through the Grassmann angle framework for the Gaussian measurement matrix. We further discuss how this scaling law helps in analyzing iterative reweighted $\ell_1$ minimization algorithms. If the nonzero elements over the signal support follow an amplitude probability density function (pdf) $f(\cdot)$ whose $t$-th derivative $f^{(t)}(0) \neq 0$ for some integer $t \geq 0$, then a certain iterative reweighted $\ell_1$ minimization algorithm can be analytically shown to lift the phase transition thresholds (weak thresholds) of the plain $\ell_1$ minimization algorithm.
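The following is a minimal sketch, not the authors' code, of the two procedures the abstract refers to: plain $\ell_1$ minimization (basis pursuit) from i.i.d. Gaussian measurements, and an iterative reweighted $\ell_1$ loop. The problem sizes, the regularization constant `eps`, the number of reweighting iterations, and the specific weight rule $w_i = 1/(|x_i| + \epsilon)$ (in the style of Candès–Wakin–Boyd) are illustrative assumptions and are not taken from the paper, which analyzes a particular reweighted scheme analytically.

```python
# Sketch of l1 minimization and iterative reweighted l1 recovery.
# Assumes numpy and cvxpy are available; all dimensions and constants
# below are illustrative choices, not values from the paper.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m, k = 200, 80, 15          # ambient dimension, measurements, sparsity

# Sparse ground truth and i.i.d. Gaussian measurement matrix
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

def l1_recover(A, y, weights=None):
    """Solve min ||w .* x||_1 subject to A x = y (plain l1 if weights is None)."""
    x = cp.Variable(A.shape[1])
    obj = cp.norm1(x) if weights is None else cp.norm1(cp.multiply(weights, x))
    cp.Problem(cp.Minimize(obj), [A @ x == y]).solve()
    return x.value

# Plain l1 minimization (basis pursuit)
x_l1 = l1_recover(A, y)

# Iterative reweighted l1: weights ~ 1 / (|x| + eps).  The paper's analysis
# concerns how such reweighting can lift the weak threshold when the
# amplitude pdf satisfies f^{(t)}(0) != 0 for some integer t >= 0.
x_rw, eps = x_l1, 1e-2
for _ in range(4):
    x_rw = l1_recover(A, y, weights=1.0 / (np.abs(x_rw) + eps))

print("l1 recovery error:        ", np.linalg.norm(x_l1 - x_true))
print("reweighted recovery error:", np.linalg.norm(x_rw - x_true))
```

Sweeping the sparsity k toward the weak threshold in such an experiment is one way to observe empirically the stability degradation that the paper characterizes in closed form.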
