AOSO-LogitBoost: Adaptive One-Vs-One LogitBoost for Multi-Class Problem

Statistics – Machine Learning

Scientific paper


LogitBoost and its later improvement, ABC-LogitBoost, are both successful multi-class boosting algorithms for classification. In this paper, we explicitly formulate the tree building at each LogitBoost iteration as a constrained quadratic optimization problem. Both LogitBoost and ABC-LogitBoost adopt approximate solvers for this quadratic subproblem. We then propose an intuitively more natural solver, namely a block coordinate descent algorithm, and demonstrate that it yields higher classification accuracy and a faster convergence rate on a number of public datasets. The new LogitBoost behaves as if it adaptively combines many one-vs-one binary classifiers, hence the name AOSO-LogitBoost (Adaptive One-vs-One LogitBoost).
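The core idea the abstract describes — solving each iteration's quadratic subproblem exactly over a small block of coordinates (here a pair, mirroring the one-vs-one class pairs) instead of approximately over all of them — can be illustrated on a generic strictly convex quadratic. This is a minimal sketch under that assumption, not the paper's actual tree-fitting solver; the function name and the greedy pair-selection rule are illustrative choices.

```python
import numpy as np

def block_coordinate_descent(A, b, n_iters=100):
    """Minimize f(x) = 0.5 * x^T A x - b^T x (A symmetric positive
    definite) by repeatedly picking a block of two coordinates and
    minimizing over that block exactly.

    Illustrative sketch only: the pair is chosen greedily by gradient
    magnitude, loosely analogous to AOSO-LogitBoost adaptively picking
    a class pair at each boosting iteration.
    """
    n = len(b)
    x = np.zeros(n)
    for _ in range(n_iters):
        g = A @ x - b                      # gradient of the quadratic
        i, j = np.argsort(-np.abs(g))[:2]  # two coordinates with largest |g|
        # Exact minimization over the (i, j) block: solve the 2x2 system
        # A[block, block] * delta = g[block], then take a full step.
        A_block = A[np.ix_([i, j], [i, j])]
        x[[i, j]] -= np.linalg.solve(A_block, g[[i, j]])
    return x
```

For a well-conditioned positive-definite `A`, a few dozen such block updates already recover the exact minimizer `A⁻¹b`, which is the intuition behind the faster convergence the abstract reports.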


Profile ID: LFWR-SCP-O-290521
