A Primal-Dual Convergence Analysis of Boosting
Computer Science – Learning
Scientific paper
2011-01-25
Journal of Machine Learning Research, 13:561-606, 2012
40 pages, 8 figures; the NIPS 2011 submission "The Fast Convergence of Boosting" is a brief presentation of the primary result
Boosting combines weak learners into a predictor with low empirical risk. Its dual constructs a high-entropy distribution upon which weak learners and training labels are uncorrelated. This manuscript studies this primal-dual relationship under a broad family of losses, including the exponential loss of AdaBoost and the logistic loss, revealing:

- Weak learnability aids the whole loss family: for any ε > 0, O(ln(1/ε)) iterations suffice to produce a predictor with empirical risk ε-close to the infimum;
- The circumstances granting the existence of an empirical risk minimizer may be characterized in terms of the primal and dual problems, yielding a new proof of the known rate O(ln(1/ε));
- Arbitrary instances may be decomposed into the above two, granting rate O(1/ε), with a matching lower bound provided for the logistic loss.
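To make the primal-dual picture concrete, the following is a minimal illustrative sketch (not the paper's code) of boosting as coordinate descent on the empirical exponential loss, with axis-aligned threshold stumps as weak learners on synthetic data. The weight vector w plays the role of the dual distribution over examples, and each iteration selects the weak learner most correlated with the labels under w; the helper name stump_predictions and all data are assumptions made for illustration.

```python
import numpy as np

# Illustrative sketch only: boosting viewed as coordinate descent on the
# empirical exponential loss, with threshold stumps as weak learners.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sign(X[:, 0] + 0.5 * X[:, 1])           # labels in {-1, +1}

def stump_predictions(X):
    """Enumerate a small pool of weak learners: sign(x_j - t) over a few thresholds."""
    hs = []
    for j in range(X.shape[1]):
        for t in np.quantile(X[:, j], [0.25, 0.5, 0.75]):
            hs.append(np.where(X[:, j] > t, 1.0, -1.0))
    return np.array(hs)                         # shape: (num_weak, num_examples)

H = stump_predictions(X)
f = np.zeros(len(y))                            # margins of the combined predictor

for it in range(50):
    w = np.exp(-y * f)                          # dual weights: derivative of exp-loss
    w /= w.sum()                                # distribution over the training examples
    edges = H @ (w * y)                         # correlation of each weak learner with y under w
    k = np.argmax(np.abs(edges))                # coordinate descent: pick the largest edge
    err = np.clip((1 - edges[k]) / 2, 1e-12, 1 - 1e-12)   # weighted error of stump k
    alpha = 0.5 * np.log((1 - err) / err)       # exact line search for the exponential loss
    f += alpha * H[k]

print("empirical exponential risk:", np.mean(np.exp(-y * f)))
```

Under weak learnability (the chosen edge bounded away from zero at every iteration), the empirical risk in this sketch decays geometrically, matching the O(ln(1/ε)) rate stated above.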