Mathematics – Statistics Theory
Scientific paper
2010-08-16
39 pages, 5 figures
We consider the problem of learning a coefficient vector $x_0\in\mathbb{R}^N$ from noisy linear observations $y=Ax_0+w\in\mathbb{R}^n$. In many contexts (ranging from model selection to image processing) it is desirable to construct a sparse estimator $\hat{x}$. In this case, a popular approach consists in solving an $\ell_1$-penalized least squares problem known as the LASSO or Basis Pursuit DeNoising (BPDN). For sequences of matrices $A$ of increasing dimensions, with independent Gaussian entries, we prove that the normalized risk of the LASSO converges to a limit, and we obtain an explicit expression for this limit. Our result is the first rigorous derivation of an explicit formula for the asymptotic mean square error of the LASSO for random instances. The proof technique is based on the analysis of AMP, a recently developed efficient algorithm inspired by ideas from graphical models.
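As a concrete illustration of the setting described in the abstract, the following is a minimal sketch of solving the LASSO on a random Gaussian instance and measuring the normalized risk $\|\hat{x}-x_0\|^2/N$. It uses ISTA (a simple proximal-gradient method) rather than the AMP algorithm analyzed in the paper; all dimensions, the regularization level, and the noise scale are illustrative choices, not values taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: componentwise soft thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(A, y, lam, n_iter=500):
    """Minimize 0.5 * ||y - A x||_2^2 + lam * ||x||_1 via ISTA."""
    n, N = A.shape
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth part
    x = np.zeros(N)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Random Gaussian design, mimicking the asymptotic setting of the paper
# (illustrative finite dimensions; the paper studies N, n -> infinity).
rng = np.random.default_rng(0)
n, N, k = 100, 200, 10
A = rng.normal(size=(n, N)) / np.sqrt(n)      # i.i.d. Gaussian entries
x0 = np.zeros(N)
x0[:k] = rng.normal(size=k)                   # k-sparse coefficient vector
w = 0.01 * rng.normal(size=n)                 # small observation noise
y = A @ x0 + w

xhat = lasso_ista(A, y, lam=0.02)
mse = np.mean((xhat - x0) ** 2)               # normalized risk ||xhat - x0||^2 / N
```

The soft-thresholding step is what produces exact zeros in $\hat{x}$, i.e. the sparsity that motivates the $\ell_1$ penalty; the paper's contribution is the exact asymptotic value of the quantity computed here as `mse`.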
Mohsen Bayati
Andrea Montanari
The LASSO risk for gaussian matrices