Analysis of boosting algorithms using the smooth margin function
Scientific paper – Statistics (Machine Learning)
2008-03-28
Annals of Statistics 2007, Vol. 35, No. 6, 2723-2768
Published in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics, at http://dx.doi.org/10.1214/009053607000000785
DOI: 10.1214/009053607000000785
We introduce a useful tool for analyzing boosting algorithms called the "smooth margin function," a differentiable approximation of the usual margin. We present two boosting algorithms based on this smooth margin, "coordinate ascent boosting" and "approximate coordinate ascent boosting," which are similar to Freund and Schapire's AdaBoost algorithm and Breiman's arc-gv algorithm. We give convergence rates to the maximum margin solution for both of our algorithms and for arc-gv. We then study AdaBoost's convergence properties using the smooth margin function. We precisely bound the margin attained by AdaBoost when the edges of the weak classifiers fall within a specified range. This shows that a previous bound proved by Rätsch and Warmuth is exactly tight. Furthermore, we use the smooth margin to capture explicit properties of AdaBoost in cases where cyclic behavior occurs.
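For readers who want to experiment with the idea: given a margin matrix M with M[i, j] = y_i h_j(x_i) and a coefficient vector lambda over the weak classifiers, the smooth margin is G(lambda) = -ln(sum_i exp(-(M lambda)_i)) / ||lambda||_1, which always lies strictly below the usual normalized margin min_i (M lambda)_i / ||lambda||_1 and approaches it as ||lambda||_1 grows. The NumPy sketch below illustrates this relationship; the function names, variable names, and toy data are illustrative choices, not the paper's notation or experiments.

```python
import numpy as np

def margin(M, lam):
    """Usual normalized margin: min_i (M @ lam)_i / ||lam||_1."""
    return np.min(M @ lam) / np.sum(lam)

def smooth_margin(M, lam):
    """Smooth margin G(lam) = -ln(sum_i exp(-(M lam)_i)) / ||lam||_1.

    G(lam) is strictly below the usual margin and converges to it as
    ||lam||_1 grows, which is what makes it a differentiable surrogate
    for margin analysis.
    """
    scores = M @ lam                 # unnormalized margins y_i * f(x_i)
    s_min = np.min(scores)
    # log-sum-exp trick: compute ln(sum_i exp(-scores_i)) stably
    log_F = -s_min + np.log(np.sum(np.exp(-(scores - s_min))))
    return -log_F / np.sum(lam)

# Toy check: 3 training examples, 2 weak classifiers (hypothetical data)
M = np.array([[ 1.0,  1.0],
              [ 1.0, -1.0],
              [-1.0,  1.0]])
lam = np.array([2.0, 1.5])
print(margin(M, lam))         # -0.1428...
print(smooth_margin(M, lam))  # about -0.236, strictly smaller
```

On this toy problem the usual margin is -1/7 ≈ -0.143 while the smooth margin evaluates to about -0.236, consistent with the general sandwich margin(lam) - ln(m)/||lam||_1 <= G(lam) < margin(lam) for m training examples.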
Ingrid Daubechies
Cynthia Rudin
Robert E. Schapire