High-dimensional subset recovery in noise: Sparsified measurements without loss of statistical efficiency
Statistics – Machine Learning
Scientific paper
2008-05-20
We consider the problem of estimating the support of a vector $\beta^* \in \mathbb{R}^{p}$ based on observations contaminated by noise. A significant body of work has studied the behavior of $\ell_1$-relaxations when applied to measurement matrices drawn from standard dense ensembles (e.g., Gaussian, Bernoulli). In this paper, we analyze \emph{sparsified} measurement ensembles, and consider the trade-off between measurement sparsity, as measured by the fraction $\gamma$ of non-zero entries, and statistical efficiency, as measured by the minimal number of observations $n$ required for exact support recovery with probability converging to one. Our main result proves that it is possible to let $\gamma \to 0$ at some rate, yielding measurement matrices with a vanishing fraction of non-zeros per row, while retaining the same statistical efficiency as dense ensembles. A variety of simulation results confirm the sharpness of our theoretical predictions.
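As an illustration of the setup described in the abstract, the following is a minimal simulation sketch, not taken from the paper: it assumes the standard linear observation model $y = X\beta^* + w$ with Gaussian noise, generates a sparsified Gaussian measurement matrix with a fraction $\gamma$ of non-zero entries, and uses the Lasso as the $\ell_1$-relaxation. The matrix scaling, regularization weight, and support-recovery threshold are illustrative choices, not those analyzed in the paper.

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

p, k, n = 512, 8, 200   # ambient dimension, support size, number of observations
gamma = 0.1             # fraction of non-zero entries per row of the measurement matrix
sigma = 0.25            # noise standard deviation

# Sparse vector beta* with k non-zero entries of magnitude 1 and random signs.
beta_star = np.zeros(p)
support = rng.choice(p, size=k, replace=False)
beta_star[support] = rng.choice([-1.0, 1.0], size=k)

# Sparsified Gaussian ensemble: each entry is non-zero with probability gamma,
# rescaled so that the entries have (roughly) unit variance overall.
mask = rng.random((n, p)) < gamma
X = mask * rng.standard_normal((n, p)) / np.sqrt(gamma)

# Noisy observations y = X beta* + w.
y = X @ beta_star + sigma * rng.standard_normal(n)

# l1-relaxation (Lasso); the regularization weight below is a common heuristic,
# not the specific choice from the paper's analysis.
lam = 2 * sigma * np.sqrt(np.log(p) / n)
beta_hat = Lasso(alpha=lam, fit_intercept=False, max_iter=10000).fit(X, y).coef_

# Declare success if the thresholded estimate has exactly the true support.
recovered = np.flatnonzero(np.abs(beta_hat) > 1e-3)
print("exact support recovery:", set(recovered) == set(support))

Sweeping $n$ and $\gamma$ over a grid and recording the empirical probability of exact recovery produces the kind of sparsity/sample-size trade-off curves that the abstract refers to.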
Dapo Omidiran
Martin J. Wainwright