Statistics – Machine Learning
Scientific paper
2011-10-18
LogitBoost and its later improvement, ABC-LogitBoost, are both successful multi-class boosting algorithms for classification. In this paper, we explicitly formulate the tree building at each LogitBoost iteration as a constrained quadratic optimization problem. Both LogitBoost and ABC-LogitBoost adopt an approximate solver for this quadratic subproblem. We then propose an intuitively more natural solver, namely a block coordinate descent algorithm, and demonstrate that it leads to higher classification accuracy and faster convergence on a number of public datasets. This new LogitBoost behaves as if it adaptively combines many one-vs-one binary classifiers, hence the name AOSO-LogitBoost (Adaptive One-vs-One LogitBoost).
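The abstract's core idea can be illustrated with a toy sketch of pairwise block coordinate descent on a sum-to-zero-constrained quadratic. This is not the paper's exact node-split procedure; the quadratic `q(f) = -g.f + 0.5 * sum(h * f**2)` with a diagonal Hessian, and the greedy choice of coordinate pair, are illustrative assumptions standing in for the per-node subproblem. Updating one pair of class scores in opposite directions preserves the constraint, which is what gives the method its one-vs-one flavor:

```python
import numpy as np

def pairwise_cd(g, h, iters=50):
    """Toy block coordinate descent for
        min_f  -g.f + 0.5 * sum(h * f**2)   s.t.  sum(f) = 0,
    where g, h > 0 play the role of per-class gradient and
    (diagonal) Hessian statistics. Each block is a pair of
    coordinates moved in opposite directions, so the
    sum-to-zero constraint is preserved automatically."""
    f = np.zeros(len(g))
    for _ in range(iters):
        grad = h * f - g                     # gradient of the quadratic at f
        r, s = np.argmin(grad), np.argmax(grad)
        if r == s:                           # gradient is flat: constrained optimum
            break
        # exact line minimizer along the direction e_r - e_s
        delta = (grad[s] - grad[r]) / (h[r] + h[s])
        f[r] += delta
        f[s] -= delta
    return f
```

At the constrained optimum all components of `h * f - g` are equal (the Lagrange multiplier of the sum constraint), so repeatedly equalizing the two most extreme gradient entries drives the iterate to the solution; the greedy pair choice mirrors the adaptive class-pair selection the abstract alludes to.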
Sun Peng
Zhou Jie
AOSO-LogitBoost: Adaptive One-Vs-One LogitBoost for Multi-Class Problem