New Error Analysis for Lasso
Scientific paper
2011-08-18
Mathematics – Statistics Theory
Comments: Some technical errors are spotted
The Lasso is one of the most important approaches to parameter estimation and variable selection in high-dimensional linear regression. At the heart of its success is its attractive rate of convergence, which holds even when $p$, the dimension of the problem, is much larger than the sample size $n$. In particular, Bickel et al. (2009) showed that this rate, in terms of the $\ell_1$ norm, is of the order $s\sqrt{(\log p)/n}$ for a sparsity index $s$. In this paper, we obtain a new bound on the convergence rate by taking advantage of the distributional information of the model. Under a normality or sub-Gaussian assumption, the rate can be improved to nearly $s/\sqrt{n}$ for certain design matrices. We further outline a general partitioning technique that helps derive sharper convergence rates for the Lasso. The result is applicable to many covariance matrices suitable for high-dimensional data analysis.
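For context, the standard Lasso formulation behind these rates (a sketch of the usual setup; the paper's exact notation and conditions may differ) is $\hat{\beta} = \arg\min_{\beta \in \mathbb{R}^p} \{ \frac{1}{2n}\|y - X\beta\|_2^2 + \lambda\|\beta\|_1 \}$, where, with the common choice $\lambda \asymp \sigma\sqrt{(\log p)/n}$ and a restricted eigenvalue condition on $X$, the Bickel et al. (2009) result gives $\|\hat{\beta} - \beta^*\|_1 \le C\, s\sqrt{(\log p)/n}$, compared with the nearly $s/\sqrt{n}$ rate discussed above.

The $\ell_1$-error scaling can also be checked empirically. Below is a minimal simulation sketch; the Gaussian design, unit noise level, and the tuning choice $\lambda = \sqrt{(\log p)/n}$ are illustrative assumptions, not specifications taken from the paper.

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
p, s = 200, 5
beta_star = np.zeros(p)
beta_star[:s] = 1.0                              # s-sparse true coefficient vector

for n in (100, 400, 1600):
    X = rng.standard_normal((n, p))              # Gaussian design matrix
    y = X @ beta_star + rng.standard_normal(n)   # unit-variance noise
    lam = np.sqrt(np.log(p) / n)                 # penalty of the usual order
    # sklearn's Lasso minimizes (1/(2n))*||y - X b||_2^2 + alpha*||b||_1
    fit = Lasso(alpha=lam, fit_intercept=False, max_iter=10000).fit(X, y)
    l1_err = np.abs(fit.coef_ - beta_star).sum()
    print(f"n={n:5d}  l1 error={l1_err:.3f}  s*sqrt(log p / n)={s * lam:.3f}")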
Leng Chenlei
Zhao Junlong