Approximate Sparsity Pattern Recovery: Information-Theoretic Lower Bounds

Computer Science – Information Theory

Scientific paper

Recovery of the sparsity pattern (or support) of a sparse vector from a small number of noisy linear projections (or samples) is a common problem that arises in signal processing and statistics. In this paper, the high-dimensional setting is considered. It is shown that if the sampling rate and per-sample signal-to-noise ratio (SNR) are finite constants independent of the length of the vector, then the optimal sparsity pattern estimate will have a constant fraction of errors. Lower bounds on the sampling rate needed to attain a desired fraction of errors are given in terms of the SNR and various key parameters of the unknown vector. The tightness of the bounds in a scaling sense, as a function of the SNR and the fraction of errors, is established by comparison with existing achievable bounds. Near optimality is shown for a wide variety of practically motivated signal models.
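To make the setting concrete, the sketch below simulates the standard noisy linear measurement model y = Ax + w that sparsity-pattern-recovery results of this kind address, and measures the fraction of support errors made by a naive estimator. This is purely illustrative: the matrix ensemble, the interpretation of "sampling rate" as samples per dimension (m/n), the SNR convention, the error-fraction normalization, and all numerical values are assumptions for demonstration, not the paper's construction or bounds.

```python
# Minimal sketch of noisy linear sampling of a sparse vector and support
# estimation. All parameters (n, k, m, SNR) are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)

n, k, m = 1000, 20, 200          # vector length, sparsity, number of samples
snr = 10.0                        # assumed per-sample signal-to-noise ratio

# Sparse vector: support drawn uniformly at random, nonzero entries +/- 1.
support = rng.choice(n, size=k, replace=False)
x = np.zeros(n)
x[support] = rng.choice([-1.0, 1.0], size=k)

# Gaussian sampling matrix; noise variance set to hit the target per-sample SNR.
A = rng.standard_normal((m, n)) / np.sqrt(n)
signal = A @ x
noise_var = np.mean(signal**2) / snr
y = signal + np.sqrt(noise_var) * rng.standard_normal(m)

# Naive support estimate: keep the k columns most correlated with y
# (a simple baseline, not an optimal estimator).
scores = np.abs(A.T @ y)
est_support = np.argsort(scores)[-k:]

# Fraction of support errors: missed detections plus false alarms,
# normalized by 2k (one of several common normalizations).
missed = len(set(support) - set(est_support))
false_alarms = len(set(est_support) - set(support))
print("fraction of support errors:", (missed + false_alarms) / (2 * k))
```

Under the paper's scaling regime, with m/n and the SNR held fixed as n grows, the abstract's lower bounds say that no estimator, however sophisticated, can drive this error fraction to zero.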
