Statistics – Machine Learning
Scientific paper
2011-04-24
Scaled sparse linear regression jointly estimates the regression coefficients and noise level in a linear model. It chooses an equilibrium with a sparse regression method by iteratively estimating the noise level via the mean residual square and scaling the penalty in proportion to the estimated noise level. The iterative algorithm costs little beyond the computation of a path of the sparse regression estimator for penalty levels above a threshold. For the scaled Lasso, the algorithm is a gradient descent in a convex minimization of a penalized joint loss function for the regression coefficients and noise level. Under mild regularity conditions, we prove that the method simultaneously yields an estimator for the noise level and an estimated coefficient vector in the Lasso path satisfying certain oracle inequalities for the estimation of the noise level, prediction, and the estimation of regression coefficients. These oracle inequalities provide sufficient conditions for the consistency and asymptotic normality of the noise-level estimator, including cases where the number of variables is of greater order than the sample size. Numerical results demonstrate the superior performance of the proposed method over an earlier proposal of joint convex minimization.
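The iteration the abstract describes alternates between estimating the noise level from the mean residual square and refitting the sparse regression with a penalty scaled by that estimate. The following is a minimal sketch of that loop for the scaled Lasso, not the authors' implementation: it assumes scikit-learn's Lasso solver, and the function name scaled_lasso, the default base penalty lam0 = sqrt(2 log(p)/n), and the initialization sigma = std(y) are illustrative choices, not prescribed by the paper.

```python
import numpy as np
from sklearn.linear_model import Lasso

def scaled_lasso(X, y, lam0=None, tol=1e-6, max_iter=100):
    """Sketch of the scaled Lasso iteration: jointly estimate the
    coefficient vector and the noise level by alternating between
    (i) a Lasso fit with penalty proportional to the current noise
    estimate and (ii) updating the noise estimate from the mean
    residual square, until an equilibrium is reached."""
    n, p = X.shape
    if lam0 is None:
        # A common universal penalty level; an assumption here.
        lam0 = np.sqrt(2.0 * np.log(p) / n)
    sigma = np.std(y)  # crude initial noise-level guess (assumption)
    for _ in range(max_iter):
        # sklearn's Lasso minimizes ||y - Xb||^2 / (2n) + alpha * ||b||_1,
        # so alpha = sigma * lam0 scales the penalty with the noise level.
        fit = Lasso(alpha=sigma * lam0, fit_intercept=False).fit(X, y)
        resid = y - X @ fit.coef_
        sigma_new = np.sqrt(np.mean(resid ** 2))  # root mean residual square
        if abs(sigma_new - sigma) < tol * max(sigma, 1e-12):
            sigma = sigma_new
            break
        sigma = sigma_new
    return fit.coef_, sigma
```

Because each pass only changes the penalty level, the loop can reuse a precomputed path of Lasso solutions, which is why the abstract notes the extra cost over a single path computation is small.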
Tingni Sun
Cun-Hui Zhang