Computer Science – Information Theory
Scientific paper
2010-02-24
Recovery of the sparsity pattern (or support) of a sparse vector from a small number of noisy linear projections (or samples) is a common problem that arises in signal processing and statistics. In this paper, the high-dimensional setting is considered. It is shown that if the sampling rate and per-sample signal-to-noise ratio (SNR) are finite constants independent of the length of the vector, then the optimal sparsity pattern estimate will have a constant fraction of errors. Lower bounds on the sampling rate needed to attain a desired fraction of errors are given in terms of the SNR and various key parameters of the unknown vector. The tightness of the bounds in a scaling sense, as a function of the SNR and the fraction of errors, is established by comparison with existing achievable bounds. Near optimality is shown for a wide variety of practically motivated signal models.
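The setting described above can be illustrated with a small simulation. The sketch below is not the paper's estimator; it uses a simple correlation (matched-filter) rule as a stand-in, and the parameter names (`p`, `k`, `rho`, `snr`) are assumptions chosen for illustration. It generates a k-sparse vector, takes n = rho·p noisy linear samples at a fixed per-sample SNR, and reports the fraction of support errors.

```python
import numpy as np

rng = np.random.default_rng(0)

p, k = 200, 10     # vector length and sparsity (illustrative values)
rho = 0.25         # sampling rate n/p -- a finite constant, as in the paper
n = int(rho * p)

# k-sparse signal with unit-magnitude nonzero entries
support = rng.choice(p, size=k, replace=False)
x = np.zeros(p)
x[support] = rng.choice([-1.0, 1.0], size=k)

# i.i.d. Gaussian sampling matrix, columns scaled by 1/sqrt(n)
A = rng.standard_normal((n, p)) / np.sqrt(n)

# Per-sample SNR held to a finite constant; noise variance set accordingly
snr = 10.0
noise_var = (np.linalg.norm(A @ x) ** 2 / n) / snr
y = A @ x + np.sqrt(noise_var) * rng.standard_normal(n)

# Correlation (matched-filter) estimate: keep the k largest |A^T y|
scores = np.abs(A.T @ y)
est_support = set(np.argsort(scores)[-k:])

# Fraction of support errors: missed entries plus false alarms, over 2k
missed = len(set(support) - est_support)
false_alarms = len(est_support - set(support))
error_fraction = (missed + false_alarms) / (2 * k)
print(f"support error fraction: {error_fraction:.2f}")
```

With the sampling rate and SNR fixed as constants independent of p, the error fraction here typically stays bounded away from zero as p grows, which is the regime the paper's lower bounds address.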
Galen Reeves
Michael Gastpar
Approximate Sparsity Pattern Recovery: Information-Theoretic Lower Bounds