A General Framework for Structured Sparsity via Proximal Optimization
Computer Science – Learning
Scientific paper
2011-06-26
We study a generalized framework for structured sparsity. It extends the well-known methods of Lasso and Group Lasso by incorporating additional constraints on the variables as part of a convex optimization problem. This framework provides a straightforward way of favouring prescribed sparsity patterns, such as orderings, contiguous regions and overlapping groups, among others. Existing optimization methods are limited to specific constraint sets and tend not to scale well with sample size and dimensionality. We propose a novel first-order proximal method, which builds upon results on fixed points and successive approximations. The algorithm can be applied to a general class of conic and norm constraint sets and relies on a proximity operator subproblem which can be computed explicitly. Experiments on different regression problems demonstrate the efficiency of the optimization algorithm and its scalability with the size of the problem. They also demonstrate state-of-the-art statistical performance, improving over Lasso and StructOMP.
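For orientation, the sketch below shows a generic first-order proximal (ISTA-style) iteration for the plain Lasso, where the proximity operator subproblem reduces to componentwise soft thresholding. It is only a minimal illustration of the proximal template the abstract refers to, not the authors' algorithm (which handles more general conic and norm constraint sets); the function names, step-size choice and problem sizes are assumptions made for the example.

```python
# Minimal ISTA-style proximal gradient sketch for the plain Lasso
#   min_x 0.5*||Ax - b||^2 + lam*||x||_1
# This is a generic illustration of a proximity-operator-based method,
# NOT the structured-sparsity algorithm proposed in the paper.
import numpy as np


def soft_threshold(v, t):
    # Proximity operator of t*||.||_1: componentwise shrinkage toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)


def proximal_gradient_lasso(A, b, lam, n_iter=500):
    # Step size 1/L, with L the Lipschitz constant of the smooth part's gradient.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                    # gradient of 0.5*||Ax - b||^2
        x = soft_threshold(x - grad / L, lam / L)   # proximal (shrinkage) step
    return x


if __name__ == "__main__":
    # Tiny synthetic regression problem with a 5-sparse ground truth.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    x_true = np.zeros(100)
    x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(50)
    x_hat = proximal_gradient_lasso(A, b, lam=0.1)
    print("nonzeros recovered:", int(np.sum(np.abs(x_hat) > 1e-3)))
```

In the structured setting described by the abstract, the soft-thresholding step would be replaced by the proximity operator associated with the chosen constraint set (e.g. group or ordering constraints), which the paper shows can still be computed explicitly.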
Argyriou Andreas
Baldassarre Luca
Morales Jean
Pontil Massimiliano