Fast learning rates for plug-in classifiers

Mathematics – Statistics Theory

Scientific paper

Details

Published in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics.

DOI: 10.1214/009053606000001217 (http://dx.doi.org/10.1214/009053606000001217)

It has recently been shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, that is, rates faster than $n^{-1/2}$. Work on this subject has suggested two conjectures: (i) the best achievable fast rate is of order $n^{-1}$, and (ii) plug-in classifiers generally converge more slowly than classifiers based on empirical risk minimization. We show that neither conjecture is correct. In particular, we construct plug-in classifiers that achieve not only fast but also super-fast rates, that is, rates faster than $n^{-1}$. We establish minimax lower bounds showing that the obtained rates cannot be improved.
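For context: a plug-in classifier first estimates the regression function $\eta(x) = P(Y=1 \mid X=x)$ nonparametrically and then plugs the estimate $\hat\eta$ into the form of the Bayes classifier $f^*(x) = \mathbf{1}\{\eta(x) \ge 1/2\}$, predicting label $1$ whenever $\hat\eta(x) \ge 1/2$. The margin assumption referred to above controls how much mass $\eta(X)$ places near the critical level $1/2$, in the standard form $P(0 < |\eta(X) - 1/2| \le t) \le C t^\alpha$ for all $t > 0$. The sketch below is a minimal illustration of the plug-in principle under assumed synthetic data, not the paper's construction (the paper analyzes local polynomial estimators of $\eta$); the $k$-nearest-neighbor estimator and all parameter choices here are illustrative assumptions.

```python
# Minimal plug-in classifier sketch: estimate eta(x) = P(Y=1 | X=x),
# then predict 1 whenever the estimate is >= 1/2.
# A k-NN regression estimate stands in for the local polynomial
# estimators analyzed in the paper (illustrative assumption).
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)

# Synthetic data (assumption for the demo): X uniform on [0,1]^2,
# with regression function eta(x) = x_1.
n = 2000
X = rng.random((n, 2))
eta = X[:, 0]
y = (rng.random(n) < eta).astype(int)

# Plug-in step: nonparametric estimate of eta, thresholded at 1/2.
eta_hat = KNeighborsRegressor(n_neighbors=50).fit(X, y)

def plug_in_predict(x):
    return (eta_hat.predict(x) >= 0.5).astype(int)

# Excess Bayes risk relative to f*(x) = 1{eta(x) >= 1/2}, using the
# standard identity R(f) - R(f*) = E[|2*eta(X) - 1| * 1{f(X) != f*(X)}].
X_test = rng.random((10000, 2))
eta_test = X_test[:, 0]
bayes = (eta_test >= 0.5).astype(int)
pred = plug_in_predict(X_test)
excess_risk = np.mean(np.abs(2 * eta_test - 1) * (pred != bayes))
print(f"estimated excess Bayes risk: {excess_risk:.4f}")
```

Note how the margin assumption interacts with this rule: misclassifications relative to the Bayes classifier can only occur where $\hat\eta$ and $\eta$ fall on opposite sides of $1/2$, and the identity above shows such mistakes are cheap exactly where $\eta(X)$ is close to $1/2$, which is why low noise near the boundary permits the fast rates discussed in the abstract.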
