Sparse Group Selection Through Co-Adaptive Penalties
Statistics – Methodology
Scientific paper
2011-11-18
Recent work has focused on the problem of conducting linear regression when the number of covariates is very large, potentially exceeding the sample size. One useful tool in this setting is to assume that the model can be well approximated by a fit involving only a small number of covariates -- a so-called sparsity assumption, which leads to the Lasso and related methods. In many situations, however, the covariates are structured, in that the selection of some variables favours the selection of others, with variables organised into groups that enter or leave the model simultaneously as a special case. This structure creates a different form of sparsity. In this paper, we propose the Co-adaptive Lasso to fit models accommodating this form of 'group sparsity'. The Co-adaptive Lasso is fast and simple to compute, and we show that it holds theoretical advantages over the Lasso, performs well under a broad set of conditions, and is very competitive in empirical simulations with previously proposed methods such as the Group Lasso and the Adaptive Lasso.
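The abstract does not give the Co-adaptive Lasso algorithm itself, but the notion of group sparsity it targets can be illustrated with the Group Lasso baseline it compares against. The sketch below, under the assumption of a standard (1/2n) least-squares loss with a groupwise L2 penalty, fits the Group Lasso by proximal gradient descent on toy data; the group soft-thresholding step is what makes whole groups of coefficients enter or leave the model together.

```python
import numpy as np

def group_soft_threshold(v, t):
    """Proximal operator of t * ||v||_2: shrink the whole block toward zero."""
    norm = np.linalg.norm(v)
    if norm <= t:
        return np.zeros_like(v)  # the entire group is dropped at once
    return (1.0 - t / norm) * v

def group_lasso(X, y, groups, lam, n_iter=500):
    """Minimise (1/2n)||y - Xb||^2 + lam * sum_g ||b_g||_2 by proximal gradient."""
    n, p = X.shape
    beta = np.zeros(p)
    # Step size from the Lipschitz constant of the smooth part.
    step = 1.0 / np.linalg.eigvalsh(X.T @ X / n)[-1]
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n
        b = beta - step * grad
        for g in groups:
            b[g] = group_soft_threshold(b[g], step * lam)
        beta = b
    return beta

# Toy data (hypothetical): only the first group of covariates matters.
rng = np.random.default_rng(0)
n, p = 100, 6
X = rng.standard_normal((n, p))
y = X[:, 0] + 2.0 * X[:, 1]          # signal lives in group 0 only
groups = [[0, 1], [2, 3], [4, 5]]
beta = group_lasso(X, y, groups, lam=0.2)
```

On this data the penalty zeroes out groups 1 and 2 as whole blocks while keeping both coefficients of group 0 active, which is exactly the "variables entering or leaving the model simultaneously" behaviour the abstract describes.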