
Lossless Linear Analog Compression

Abstract

We establish the fundamental limits of lossless linear analog compression by considering the recovery of random vectors $\boldsymbol{\mathsf{x}}\in\mathbb{R}^m$ from the noiseless linear measurements $\boldsymbol{\mathsf{y}}=\boldsymbol{A}\boldsymbol{\mathsf{x}}$ with measurement matrix $\boldsymbol{A}\in\mathbb{R}^{n\times m}$. Specifically, for a random vector $\boldsymbol{\mathsf{x}}\in\mathbb{R}^m$ of arbitrary distribution we show that $\boldsymbol{\mathsf{x}}$ can be recovered with zero error probability from $n>\inf\,\underline{\operatorname{dim}}_{\mathrm{MB}}(U)$ linear measurements, where $\underline{\operatorname{dim}}_{\mathrm{MB}}(\cdot)$ denotes the lower modified Minkowski dimension and the infimum is over all sets $U\subseteq\mathbb{R}^{m}$ with $\mathbb{P}[\boldsymbol{\mathsf{x}}\in U]=1$. This achievability statement holds for Lebesgue almost all measurement matrices $\boldsymbol{A}$. We then show that $s$-rectifiable random vectors (a stochastic generalization of $s$-sparse vectors) can be recovered with zero error probability from $n>s$ linear measurements. From classical compressed sensing theory we would expect $n\geq s$ to be necessary for successful recovery of $\boldsymbol{\mathsf{x}}$. Surprisingly, certain classes of $s$-rectifiable random vectors can be recovered from fewer than $s$ measurements. Imposing an additional regularity condition on the distribution of $s$-rectifiable random vectors $\boldsymbol{\mathsf{x}}$, we do get the expected converse result of $s$ measurements being necessary. The resulting class of random vectors appears to be new and will be referred to as $s$-analytic random vectors.
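
To make the $n>s$ achievability statement concrete, here is a minimal numerical sketch (an illustration, not the decoder constructed in the paper) that recovers an $s$-sparse random vector from $n=s+1$ noiseless measurements by exhaustive search over supports. The dimensions `m` and `s`, the Gaussian choice of $\boldsymbol{A}$, and the residual tolerance are illustrative assumptions.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

m, s = 8, 3      # ambient dimension and sparsity level (illustrative choices)
n = s + 1        # n > s measurements, fewer than the worst-case bound of 2s

# s-sparse random vector: random support, continuously distributed coefficients
x = np.zeros(m)
support = rng.choice(m, size=s, replace=False)
x[support] = rng.standard_normal(s)

# Generic (Lebesgue-almost-all) measurement matrix and noiseless measurements
A = rng.standard_normal((n, m))
y = A @ x

# Brute-force decoder: find a size-s support whose columns explain y exactly
x_hat = None
for S in itertools.combinations(range(m), s):
    cols = list(S)
    z, *_ = np.linalg.lstsq(A[:, cols], y, rcond=None)
    if np.linalg.norm(A[:, cols] @ z - y) < 1e-9:  # exact fit up to numerics
        x_hat = np.zeros(m)
        x_hat[cols] = z
        break

print("recovered exactly:", x_hat is not None and np.allclose(x_hat, x, atol=1e-8))
```

Because the coefficients are drawn from a continuous distribution and $\boldsymbol{A}$ is generic, a false support fitting $\boldsymbol{\mathsf{y}}$ exactly occurs with probability zero, so the search returns the true $\boldsymbol{\mathsf{x}}$ almost surely; the classical bound $n\geq 2s$ is needed only for worst-case recovery of every $s$-sparse vector simultaneously.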
