Model-Consistent Sparse Estimation through the Bootstrap

Computer Science – Learning

Scientific paper

We consider the least-squares linear regression problem with regularization by the $\ell^1$-norm, a problem usually referred to as the Lasso. In this paper, we first present a detailed asymptotic analysis of model consistency of the Lasso in low-dimensional settings. For various decays of the regularization parameter, we compute asymptotic equivalents of the probability of correct model selection. For a specific rate of decay, we show that the Lasso selects all the variables that should enter the model with probability tending to one exponentially fast, while it selects each other variable with strictly positive probability. We show that this property implies that if we run the Lasso on several bootstrapped replications of a given sample, then intersecting the supports of the Lasso bootstrap estimates leads to consistent model selection. This novel variable selection procedure, referred to as the Bolasso, is extended to high-dimensional settings by a provably consistent two-step procedure.
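The intersection step described in the abstract can be sketched in a few lines: run the Lasso on bootstrap resamples of the data and keep only the variables selected in every run. The sketch below uses scikit-learn's `Lasso`; the number of replications, the regularization level `alpha`, and the toy data are illustrative assumptions, not the paper's experimental setup (which ties the regularization decay to the sample size).

```python
import numpy as np
from sklearn.linear_model import Lasso


def bolasso_support(X, y, n_boot=32, alpha=0.1, tol=1e-6, seed=0):
    """Return the indices selected by every bootstrapped Lasso run
    (the intersection of the supports)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    support = np.ones(p, dtype=bool)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)          # resample rows with replacement
        coef = Lasso(alpha=alpha, max_iter=10_000).fit(X[idx], y[idx]).coef_
        support &= np.abs(coef) > tol             # intersect the supports
    return np.flatnonzero(support)


# Toy data: 10 features, only the first 3 enter the true model.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 10))
beta = np.zeros(10)
beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + 0.1 * rng.standard_normal(200)

print(bolasso_support(X, y))
```

Because each irrelevant variable is selected only with some probability strictly less than one on any given replication, intersecting across many replications removes it with high probability, while the truly relevant variables survive every run.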
