Computer Science – Learning
Scientific paper
2009-03-27
26 Pages, 5 Figures
For a variety of regularized optimization problems in machine learning, algorithms computing the entire solution path have been developed recently. Most of these problems are quadratic programs parameterized by a single regularization parameter, the Support Vector Machine (SVM) being a prominent example. Solution path algorithms compute not only the solution for one particular value of the regularization parameter but the entire path of solutions, making the selection of an optimal parameter much easier. It has been assumed that these piecewise linear solution paths have only linear complexity, i.e. linearly many bends. We prove that for the support vector machine this complexity can be exponential in the number of training points in the worst case. More strongly, we construct a single instance of n = 2d input points in d dimensions for an SVM such that at least Θ(2^{n/2}) = Θ(2^d) distinct subsets of support vectors occur as the regularization parameter changes.
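The phenomenon the abstract describes — the set of support vectors changing as the regularization parameter varies — can be observed numerically. The sketch below is our own illustration, not the paper's construction and not an exact path-tracking algorithm: it trains a bias-free linear SVM by dual coordinate descent (in the style of LIBLINEAR's L1-loss dual solver), sweeps the regularization parameter C over a grid, and counts the distinct support-vector sets encountered. The toy data set is an assumption for demonstration purposes.

```python
import numpy as np

def svm_dual_cd(X, y, C, iters=200):
    """Linear SVM without bias term, trained by dual coordinate descent.

    Maximizes  sum(alpha) - 0.5 * ||sum_i alpha_i y_i x_i||^2
    subject to 0 <= alpha_i <= C  (L1-loss SVM dual).
    """
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    sq_norms = (X * X).sum(axis=1)  # x_i . x_i for each point
    for _ in range(iters):
        for i in range(n):
            g = y[i] * X[i].dot(w) - 1.0           # coordinate-wise gradient
            a_new = min(max(alpha[i] - g / sq_norms[i], 0.0), C)
            w += (a_new - alpha[i]) * y[i] * X[i]  # maintain w = sum alpha_i y_i x_i
            alpha[i] = a_new
    return alpha, w

# Toy, linearly separable data -- illustrative only, not the paper's instance.
X = np.array([[1.0, 1.0], [2.0, 0.5], [0.5, 2.0],
              [-1.0, -1.0], [-2.0, -0.5], [-0.5, -2.0]])
y = np.array([1, 1, 1, -1, -1, -1])

# Sweep the regularization parameter on a grid and record each distinct
# support-vector set (indices with alpha_i > 0) that appears.
seen = set()
for C in np.logspace(-3, 2, 60):
    alpha, _ = svm_dual_cd(X, y, C)
    seen.add(frozenset(np.nonzero(alpha > 1e-8)[0]))

print(f"{len(seen)} distinct support-vector sets along the grid")
```

A grid sweep only samples the path; an exact path algorithm instead tracks the breakpoints where the support-vector set changes, and the paper's lower bound says there can be exponentially many of them.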
Bernd Gärtner
Martin Jaggi
Clément Maria
An Exponential Lower Bound on the Complexity of Regularization Paths