Information theoretic bounds for Compressed Sensing
Scientific paper – Computer Science, Information Theory
Submitted: 2008-04-22
Journal reference: IEEE Transactions on Information Theory, vol. 56, no. 10, pp. 5111-5130, Oct. 2010
Comments: 30 pages, 2 figures, submitted to IEEE Trans. on IT
DOI: 10.1109/TIT.2010.2059891
In this paper we derive information-theoretic performance bounds on the sensing and reconstruction of sparse phenomena from noisy projections. We consider two settings: output noise models, where the noise enters after the projection, and input noise models, where the noise enters before the projection. We consider two types of reconstruction distortion: support errors and mean-squared errors. Our goal is to relate the number of measurements, $m$, and the $\mathrm{SNR}$ to the signal sparsity, $k$, the distortion level, $d$, and the signal dimension, $n$. We consider support errors in a worst-case setting. We employ different variations of Fano's inequality to derive necessary conditions on the number of measurements and the $\mathrm{SNR}$ required for exact reconstruction. To derive sufficient conditions, we develop new insights on maximum-likelihood (ML) analysis based on a novel superposition property. In particular, this property implies that small support errors are the dominant error events. Consequently, our ML analysis does not suffer the conservatism of the union bound and leads to a tighter characterization of ML decoding. These results provide order-wise tight bounds. For output noise models we show that, asymptotically, an $\mathrm{SNR}$ of $\Theta(\log(n))$ together with $\Theta(k \log(n/k))$ measurements is necessary and sufficient for exact support recovery. Furthermore, if a small fraction of support errors can be tolerated, a constant $\mathrm{SNR}$ turns out to be sufficient in the linear sparsity regime. In contrast, for input noise models we show that support recovery fails if the number of measurements scales as $o(n\log(n)/\mathrm{SNR})$, implying poor compression performance in such cases. We also consider a Bayesian setup and characterize the tradeoff between mean-squared distortion and the number of measurements using rate-distortion theory.
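As a rough illustration of the output-noise setting described above, the following Python sketch generates a $k$-sparse signal, takes $m \approx C\,k\log(n/k)$ noisy Gaussian projections at an $\mathrm{SNR}$ on the order of $\log n$, and estimates the support. The constant $C$, the noise calibration, and the simple greedy decoder are assumptions made for illustration; the paper's results concern the fundamental limits of maximum-likelihood decoding, not any particular algorithm.

```python
# Illustrative sketch (not the paper's code): output-noise model y = A x + w
# with m = ceil(C * k * log(n/k)) measurements and per-measurement SNR ~ log(n).
# The greedy (OMP-style) support estimate below is a stand-in for the
# maximum-likelihood decoder analyzed in the paper.
import numpy as np

rng = np.random.default_rng(0)

n, k = 512, 8                                  # signal dimension and sparsity
C = 4.0                                        # illustrative constant in m = Theta(k log(n/k))
m = int(np.ceil(C * k * np.log(n / k)))

# k-sparse signal with unit-magnitude nonzeros on a random support
support = rng.choice(n, size=k, replace=False)
x = np.zeros(n)
x[support] = rng.choice([-1.0, 1.0], size=k)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian measurement matrix

snr_target = np.log(n)                         # SNR = Theta(log n), as in the output-noise result
signal_power = np.linalg.norm(A @ x) ** 2 / m
w = rng.standard_normal(m) * np.sqrt(signal_power / snr_target)
y = A @ x + w                                  # output-noise model: noise added after projection

# Greedy support estimate: repeatedly pick the column most correlated
# with the current residual, then re-fit on the selected columns.
est = []
r = y.copy()
for _ in range(k):
    scores = np.abs(A.T @ r)
    scores[est] = -np.inf
    est.append(int(np.argmax(scores)))
    coeffs, *_ = np.linalg.lstsq(A[:, est], y, rcond=None)
    r = y - A[:, est] @ coeffs

print("m =", m, "support errors:", len(set(support) - set(est)))
```

Under these order-wise scalings the estimate typically recovers the support for moderate $n$ and $k$; by contrast, in the input-noise model the abstract's $o(n\log(n)/\mathrm{SNR})$ condition indicates that far more measurements would be needed.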
Authors: Shuchin Aeron, Venkatesh Saligrama, Manqi Zhao