Abstract
We establish the fundamental limits of lossless linear analog compression by considering the recovery of random vectors ${\boldsymbol{\mathsf{x}}}\in{\mathbb R}^m$ from the noiseless linear measurements ${\boldsymbol{\mathsf{y}}}=\boldsymbol{A}{\boldsymbol{\mathsf{x}}}$ with measurement matrix $\boldsymbol{A}\in{\mathbb R}^{n\times m}$. Specifically, for a random vector ${\boldsymbol{\mathsf{x}}}\in{\mathbb R}^m$ of arbitrary distribution we show that ${\boldsymbol{\mathsf{x}}}$ can be recovered with zero error probability from $n>\inf_{U}\underline{\operatorname{dim}}_{\mathrm{MB}}(U)$ linear measurements, where $\underline{\operatorname{dim}}_{\mathrm{MB}}(\cdot)$ denotes the lower modified Minkowski dimension and the infimum is over all sets $U\subseteq{\mathbb R}^{m}$ with $\mathbb{P}[{\boldsymbol{\mathsf{x}}}\in U]=1$. This achievability statement holds for Lebesgue almost all measurement matrices $\boldsymbol{A}$. We then show that $s$-rectifiable random vectors, a stochastic generalization of $s$-sparse vectors, can be recovered with zero error probability from $n>s$ linear measurements. From classical compressed sensing theory we would expect $n\geq s$ to be necessary for successful recovery of ${\boldsymbol{\mathsf{x}}}$. Surprisingly, certain classes of $s$-rectifiable random vectors can be recovered from fewer than $s$ measurements. Imposing an additional regularity condition on the distribution of $s$-rectifiable random vectors ${\boldsymbol{\mathsf{x}}}$, we do obtain the expected converse result that $s$ measurements are necessary. The resulting class of random vectors appears to be new and will be referred to as $s$-analytic random vectors.
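As a toy numerical illustration of the achievability claim for sparse vectors (not taken from the paper; the dimensions, the Gaussian measurement matrix, and the brute-force decoder below are illustrative assumptions), one can draw a fixed $s$-sparse vector, measure it with $n=s+1>s$ generic linear measurements, and recover it by searching over all size-$s$ supports for the one whose columns explain the measurements:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
m, s, n = 6, 2, 3  # ambient dimension, sparsity, number of measurements (n = s + 1 > s)

# Draw an s-sparse signal with a random support and Gaussian nonzero entries.
x = np.zeros(m)
support = rng.choice(m, size=s, replace=False)
x[support] = rng.standard_normal(s)

# A generic (here: i.i.d. Gaussian) measurement matrix plays the role of
# "Lebesgue almost all" matrices A; y = A x are the noiseless measurements.
A = rng.standard_normal((n, m))
y = A @ x

# Brute-force decoder: for each candidate support of size s, solve the
# overdetermined least-squares problem and keep the support with the
# smallest residual. With n > s, the wrong supports generically cannot
# explain y exactly, so the true support wins.
best = None
for cand in itertools.combinations(range(m), s):
    cols = list(cand)
    z, *_ = np.linalg.lstsq(A[:, cols], y, rcond=None)
    resid = np.linalg.norm(A[:, cols] @ z - y)
    if best is None or resid < best[0]:
        best = (resid, cols, z)

x_hat = np.zeros(m)
x_hat[best[1]] = best[2]
print(np.allclose(x_hat, x, atol=1e-8))  # exact recovery from n = s + 1 measurements
```

This brute-force search is exponential in $m$ and serves only to illustrate that, information-theoretically, $n>s$ noiseless measurements suffice for a fixed sparse realization; it is not a practical recovery algorithm.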