On the Theorem of Uniform Recovery of Random Sampling Matrices

(arXiv:1206.5986)

Published Jun 26, 2012 in cs.IT, cs.NA, and math.IT

Abstract

We consider two theorems from the theory of compressive sensing. The first concerns uniform recovery of random sampling matrices, where the number of samples needed in order to recover an $s$-sparse signal from linear measurements (with high probability) is known to be $m\gtrsim s(\ln s)^3\ln N$. We present new and improved constants together with what we consider to be a more explicit proof, one that also allows for a slightly larger class of $m\times N$ matrices by considering what we call \emph{low entropy}. We also present an improved condition on the so-called restricted isometry constants, $\delta_s$, ensuring sparse recovery via $\ell_1$-minimization. We show that $\delta_{2s}<4/\sqrt{41}$ is sufficient, and that this can be improved further to almost allow for a sufficient condition of the type $\delta_{2s}<2/3$.
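The recovery setting the abstract describes can be illustrated with a minimal sketch: draw a Gaussian sampling matrix (a standard example of a matrix satisfying the restricted isometry property with high probability when $m$ is large enough), measure an $s$-sparse signal, and recover it by $\ell_1$-minimization (basis pursuit), here reformulated as a linear program via the split $x = u - v$ with $u, v \ge 0$. The dimensions, the Gaussian ensemble, and the LP solver are illustrative assumptions, not the paper's construction.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
N, m, s = 64, 32, 4  # ambient dimension, number of measurements, sparsity (illustrative)

# s-sparse ground-truth signal
x_true = np.zeros(N)
support = rng.choice(N, size=s, replace=False)
x_true[support] = rng.standard_normal(s)

# Gaussian sampling matrix: a common RIP example (not the paper's specific ensemble)
A = rng.standard_normal((m, N)) / np.sqrt(m)
y = A @ x_true

# Basis pursuit: minimize ||x||_1 subject to Ax = y,
# as an LP over (u, v) with x = u - v and u, v >= 0
c = np.ones(2 * N)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y,
              bounds=[(0, None)] * (2 * N), method="highs")
x_hat = res.x[:N] - res.x[N:]

# With m comfortably above the sampling threshold, recovery is exact
print(np.max(np.abs(x_hat - x_true)))
```

With $m = 32$ measurements of a $4$-sparse signal in dimension $64$, the LP typically recovers `x_true` to solver precision; shrinking $m$ toward $s$ makes recovery fail, which is the regime the sampling bound $m \gtrsim s(\ln s)^3 \ln N$ quantifies for structured random matrices.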
