
Optimality of $\ell_2/\ell_1$-optimization block-length dependent thresholds (1304.0001v1)

Published 29 Mar 2013 in cs.IT, math.IT, and math.OC

Abstract: The recent work of \cite{CRT,DonohoPol} rigorously proved (in a large dimensional and statistical context) that if the number of equations (measurements in the compressed sensing terminology) in the system is proportional to the length of the unknown vector, then there is a sparsity (number of non-zero elements of the unknown vector), also proportional to the length of the unknown vector, such that the $\ell_1$-optimization algorithm succeeds in solving the system. In our recent papers \cite{StojnicCSetamBlock09,StojnicICASSP09block,StojnicJSTSP09} we considered under-determined systems with the so-called \textbf{block}-sparse solutions. In a large dimensional and statistical context, in \cite{StojnicCSetamBlock09} we determined lower bounds on the values of allowable sparsity, for any given number (proportional to the length of the unknown vector) of equations, such that an $\ell_2/\ell_1$-optimization algorithm succeeds in solving the system. These lower bounds turned out to be in solid numerical agreement with what one observes in numerical experiments. Here we derive the corresponding upper bounds. Moreover, the upper bounds that we obtain in this paper match the lower bounds from \cite{StojnicCSetamBlock09} and thereby establish that those bounds are optimal.
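
For reference, the $\ell_2/\ell_1$-optimization algorithm mentioned in the abstract is the standard convex relaxation for block-sparse recovery; the notation below is not taken from the abstract itself and is only a sketch of the usual formulation. Writing the unknown vector $\mathbf{x} \in \mathbb{R}^{nd}$ as a concatenation of $n$ blocks $\mathbf{x}_1,\dots,\mathbf{x}_n$, each of length $d$, and letting $A\mathbf{x}=\mathbf{y}$ denote the under-determined system of measurements, the program minimizes the sum of the block $\ell_2$ norms:

$$\min_{\mathbf{x}\in\mathbb{R}^{nd}} \ \sum_{i=1}^{n} \|\mathbf{x}_i\|_2 \quad \text{subject to} \quad A\mathbf{x} = \mathbf{y}.$$

For block length $d=1$ the objective reduces to the $\ell_1$ norm, recovering the plain $\ell_1$-optimization studied in \cite{CRT,DonohoPol}.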

Citations (3)
