Mathematics – Statistics Theory
Scientific paper
2005-07-08
36 pages
It has recently been shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, i.e., rates faster than $n^{-1/2}$. Previous work on this subject suggested the following two conjectures: (i) the best achievable fast rate is of order $n^{-1}$, and (ii) plug-in classifiers generally converge more slowly than classifiers based on empirical risk minimization. We show that both conjectures are false. In particular, we construct plug-in classifiers that can achieve not only fast but also {\it super-fast} rates, i.e., rates faster than $n^{-1}$. We establish minimax lower bounds showing that the obtained rates cannot be improved.
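
For context on the terminology: a plug-in classifier substitutes an estimate $\hat\eta$ of the regression function $\eta(x) = P(Y=1 \mid X=x)$ into the Bayes rule, predicting label $1$ whenever $\hat\eta(x) \ge 1/2$, and the margin assumption, in its standard form, bounds the mass near the decision boundary: $P(0 < |\eta(X) - 1/2| \le t) \le C t^\alpha$ for all $t > 0$. The sketch below illustrates the general plug-in construction with a Nadaraya-Watson kernel estimate of $\eta$; the kernel, bandwidth, and function names are illustrative choices, not the specific estimator analyzed in the paper.

    # A minimal sketch of a plug-in classifier (illustrative, not the
    # paper's construction): estimate eta(x) = P(Y=1 | X=x) with a
    # Nadaraya-Watson kernel smoother, then threshold at 1/2.
    import numpy as np

    def nadaraya_watson_eta(X_train, y_train, x, h=0.2):
        # Gaussian-kernel estimate of eta at a single point x;
        # bandwidth h is an assumed, untuned choice.
        d2 = np.sum((X_train - x) ** 2, axis=1)
        w = np.exp(-d2 / (2 * h ** 2))
        s = w.sum()
        # If no training point carries weight, fall back to 1/2.
        return y_train @ w / s if s > 0 else 0.5

    def plug_in_classify(X_train, y_train, X_new, h=0.2):
        # Plug-in rule: predict 1 iff the estimated eta(x) >= 1/2.
        eta_hat = np.array([nadaraya_watson_eta(X_train, y_train, x, h)
                            for x in X_new])
        return (eta_hat >= 0.5).astype(int)

    # Usage on synthetic data: labels y in {0,1}, features X in R^2.
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(500, 2))
    eta = 1 / (1 + np.exp(-4 * X[:, 0]))   # true regression function
    y = (rng.uniform(size=500) < eta).astype(int)
    y_pred = plug_in_classify(X, y, X[:10])

The point of the plug-in approach is that the classification error of such a rule is controlled by how well $\hat\eta$ estimates $\eta$ near the level $1/2$, which is exactly where the margin assumption restricts the distribution.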
Audibert, Jean-Yves
Tsybakov, Alexandre B.