A Dirty Model for Multiple Sparse Regression
Computer Science – Learning
Scientific paper
2011-06-29
The primary result was accepted to NIPS 2010 as an oral presentation.
Sparse linear regression -- recovering an unknown sparse vector from linear measurements -- is now known to be possible with fewer samples than variables, via methods such as the LASSO. We consider the multiple sparse linear regression problem, in which several related vectors -- with partially shared support sets -- have to be recovered. A natural question in this setting is whether one can exploit the sharing to further decrease the overall number of samples required. A line of recent research has studied the use of \ell_1/\ell_q norm block-regularizations with q > 1 for such problems; however, depending on the level of sharing, these can actually have worse sample complexity than solving each problem separately and ignoring the sharing. We present a new method for multiple sparse linear regression that leverages support and parameter overlap when it exists, but pays no penalty when it does not. The idea is very simple: we decompose the parameters into two components and regularize these components differently. We show, both theoretically and empirically, that our method strictly and noticeably outperforms both the \ell_1 and the \ell_1/\ell_q methods over the entire range of possible overlaps (except at boundary cases, where we match the best method). We also provide theoretical guarantees that the method performs well under high-dimensional scaling.
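To make the decomposition concrete, the sketch below frames the estimator as a convex program: the parameter matrix is split as Theta = B + S, with a block norm on B to capture the support shared across tasks and an elementwise \ell_1 norm on S for task-specific entries. This is a minimal illustration, not the paper's implementation: the shared design matrix X, the choice of cvxpy as solver, the \ell_1/\ell_\infty block norm (i.e., q = infinity), and the penalty weights lam_b and lam_s are all assumptions made here for concreteness.

# Minimal sketch of a dirty-model estimator (assumptions: one shared
# design matrix X for all tasks; cvxpy as solver; lam_b, lam_s are
# illustrative penalty weights, not values from the paper).
import cvxpy as cp

def dirty_model(X, Y, lam_b, lam_s):
    # X: (n, p) design matrix; Y: (n, r) responses for r related tasks.
    n, p = X.shape
    r = Y.shape[1]
    B = cp.Variable((p, r))  # block component: support shared across tasks
    S = cp.Variable((p, r))  # sparse component: task-specific entries
    loss = cp.sum_squares(Y - X @ (B + S)) / (2 * n)
    block_norm = cp.sum(cp.max(cp.abs(B), axis=1))  # l1/l_inf norm over rows
    elem_norm = cp.sum(cp.abs(S))                   # elementwise l1 norm
    cp.Problem(cp.Minimize(loss + lam_b * block_norm + lam_s * elem_norm)).solve()
    return B.value + S.value  # combined parameter estimate Theta = B + S

One way to read the boundary-case behavior claimed in the abstract: taking lam_b very large forces B toward zero and leaves the pure \ell_1 (per-task LASSO) problem, while taking lam_s very large leaves the pure \ell_1/\ell_\infty block-regularized problem, so the combined program can always do at least as well as whichever of the two extremes fits the actual level of overlap.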
Ali Jalali
Pradeep Ravikumar
Sujay Sanghavi