Stable Recovery of Sparse Signals via $l_p$-Minimization

(1406.4328)
Published Jun 17, 2014 in cs.IT and math.IT

Abstract

In this paper, we show that, under the assumption that $\|\mathbf{e}\|_2\leq \epsilon$, every $k$-sparse signal $\mathbf{x}\in \mathbb{R}^n$ can be stably ($\epsilon\neq 0$) or exactly ($\epsilon=0$) recovered from $\mathbf{y}=\mathbf{A}\mathbf{x}+\mathbf{e}$ via $l_p$-minimization with $p\in(0, \bar{p}]$, where
$$\bar{p}=
\begin{cases}
\frac{50}{31}(1-\delta_{2k}), & \delta_{2k}\in[\frac{\sqrt{2}}{2}, 0.7183)\\
0.4541, & \delta_{2k}\in[0.7183, 0.7729)\\
2(1-\delta_{2k}), & \delta_{2k}\in[0.7729, 1)
\end{cases}$$
even if the restricted isometry constant of $\mathbf{A}$ satisfies $\delta_{2k}\in[\frac{\sqrt{2}}{2}, 1)$. Furthermore, under the assumption that $n\leq 4k$, we show that the range of $p$ can be further improved to $p\in(0,\frac{3+2\sqrt{2}}{2}(1-\delta_{2k})]$. This not only extends some existing discussions of noiseless recovery (Lai et al. and Wu et al.) to noisy recovery, but also greatly improves the best existing results, where $p\in(0,\min\{1, 1.0873(1-\delta_{2k})\})$ (Wu et al.).
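To make the piecewise bound above concrete, here is a minimal Python sketch that evaluates $\bar{p}$ for a given $\delta_{2k}$, together with the improved bound stated for the case $n\leq 4k$. The function names (`p_bar`, `p_bar_n_le_4k`) are illustrative choices, not from the paper.

```python
import math

def p_bar(delta_2k: float) -> float:
    """Upper end of the admissible range (0, p_bar] for l_p-minimization,
    following the piecewise bound quoted in the abstract.
    Stated for delta_2k in [sqrt(2)/2, 1)."""
    if not (math.sqrt(2) / 2 <= delta_2k < 1):
        raise ValueError("bound is stated for delta_2k in [sqrt(2)/2, 1)")
    if delta_2k < 0.7183:
        return (50 / 31) * (1 - delta_2k)
    if delta_2k < 0.7729:
        return 0.4541
    return 2 * (1 - delta_2k)

def p_bar_n_le_4k(delta_2k: float) -> float:
    """Improved bound (3 + 2*sqrt(2))/2 * (1 - delta_2k) quoted in the
    abstract under the extra assumption n <= 4k."""
    return (3 + 2 * math.sqrt(2)) / 2 * (1 - delta_2k)

if __name__ == "__main__":
    for d in (0.71, 0.75, 0.90):
        print(f"delta_2k = {d:.2f}:  p_bar = {p_bar(d):.4f}, "
              f"p_bar (n <= 4k) = {p_bar_n_le_4k(d):.4f}")
```

For example, at $\delta_{2k}=0.71$ the first bound gives $\bar{p}=\frac{50}{31}\cdot 0.29 \approx 0.468$, while the $n\leq 4k$ refinement allows $p$ up to roughly $0.845$.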
