Lossless Linear Analog Compression (1605.00912v2)
Abstract: We establish the fundamental limits of lossless linear analog compression by considering the recovery of random vectors ${\boldsymbol{\mathsf{x}}}\in{\mathbb R}^m$ from the noiseless linear measurements ${\boldsymbol{\mathsf{y}}}=\boldsymbol{A}{\boldsymbol{\mathsf{x}}}$ with measurement matrix $\boldsymbol{A}\in{\mathbb R}^{n\times m}$. Specifically, for a random vector ${\boldsymbol{\mathsf{x}}}\in{\mathbb R}^m$ of arbitrary distribution we show that ${\boldsymbol{\mathsf{x}}}$ can be recovered with zero error probability from $n>\inf\underline{\operatorname{dim}}_{\mathrm{MB}}(U)$ linear measurements, where $\underline{\operatorname{dim}}_{\mathrm{MB}}(\cdot)$ denotes the lower modified Minkowski dimension and the infimum is over all sets $U\subseteq{\mathbb R}^{m}$ with $\mathbb{P}[{\boldsymbol{\mathsf{x}}}\in U]=1$. This achievability statement holds for Lebesgue almost all measurement matrices $\boldsymbol{A}$. We then show that $s$-rectifiable random vectors---a stochastic generalization of $s$-sparse vectors---can be recovered with zero error probability from $n>s$ linear measurements. From classical compressed sensing theory we would expect $n\geq s$ to be necessary for successful recovery of ${\boldsymbol{\mathsf{x}}}$. Surprisingly, certain classes of $s$-rectifiable random vectors can be recovered from fewer than $s$ measurements. Imposing an additional regularity condition on the distribution of an $s$-rectifiable random vector ${\boldsymbol{\mathsf{x}}}$, we do obtain the expected converse: $s$ measurements are necessary. The resulting class of random vectors appears to be new and will be referred to as $s$-analytic random vectors.
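For reference, the dimension notion in the achievability bound can be made concrete. Under the standard convention (supplied here for context, not quoted from the abstract), the lower Minkowski (box-counting) dimension of a nonempty bounded set $U\subseteq{\mathbb R}^m$ and its modified variant are

$$\underline{\operatorname{dim}}_{\mathrm B}(U)=\liminf_{\epsilon\to 0}\frac{\log N_U(\epsilon)}{\log(1/\epsilon)},\qquad \underline{\operatorname{dim}}_{\mathrm{MB}}(U)=\inf\Bigl\{\sup_{i\in\mathbb N}\underline{\operatorname{dim}}_{\mathrm B}(U_i)\;:\;U\subseteq\bigcup_{i\in\mathbb N}U_i\Bigr\},$$

where $N_U(\epsilon)$ is the smallest number of $\epsilon$-balls needed to cover $U$ and the infimum is over countable covers of $U$ by nonempty bounded sets $U_i$. Passing to countable covers makes the modified dimension insensitive to, e.g., countable dense sets, so the bound reflects the intrinsic complexity of the support of ${\boldsymbol{\mathsf{x}}}$.

To see the flavor of the $n>s$ achievability result in its simplest special case, the following minimal Python sketch recovers an $s$-sparse vector from $n=s+1$ noiseless Gaussian measurements by exhaustive search over supports. This is an illustration under these assumptions only, not the paper's recovery scheme or proof technique, and all names in the snippet are invented for the example.

```python
import itertools
import numpy as np

# Illustrative sketch (assumptions: s-sparse x, Gaussian A): with n = s + 1
# noiseless measurements y = A x, a generic A is injective on the vectors
# supported on any fixed size-s index set, and for a fixed random x a wrong
# support yields an exact fit only with probability zero.

rng = np.random.default_rng(0)
m, s = 8, 2
n = s + 1                          # strictly more than s measurements

# Draw an s-sparse random vector: random support, Gaussian nonzero entries.
x = np.zeros(m)
support = rng.choice(m, size=s, replace=False)
x[support] = rng.standard_normal(s)

A = rng.standard_normal((n, m))    # generic measurement matrix
y = A @ x                          # noiseless linear measurements

# Exhaustive-search decoder: for each candidate support S, solve the
# overdetermined system A[:, S] z = y by least squares and accept S
# if (and only if) the fit is exact.
x_hat = None
for S in itertools.combinations(range(m), s):
    cols = list(S)
    z, *_ = np.linalg.lstsq(A[:, cols], y, rcond=None)
    if np.allclose(A[:, cols] @ z, y, atol=1e-10):
        x_hat = np.zeros(m)
        x_hat[cols] = z
        break

assert x_hat is not None and np.allclose(x_hat, x, atol=1e-8)
print("recovered support:", np.flatnonzero(x_hat))
```

The exhaustive decoder is exponential in $m$ and serves only to make the zero-error claim tangible; the abstract's statements concern general $s$-rectifiable distributions, for which no such finite search over supports is available.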