Risk bounds for statistical learning
Mathematics – Statistics Theory
Scientific paper
2007-02-23
Annals of Statistics 2006, Vol. 34, No. 5, 2326-2366
Published at http://dx.doi.org/10.1214/009053606000000786 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics
10.1214/009053606000000786
We propose a general theorem providing upper bounds for the risk of an empirical risk minimizer (ERM). We essentially focus on the binary classification framework. We extend Tsybakov's analysis of the risk of an ERM under margin type conditions by using concentration inequalities for conveniently weighted empirical processes. This allows us to deal with ways of measuring the ``size'' of a class of classifiers other than entropy with bracketing as in Tsybakov's work. In particular, we derive new risk bounds for the ERM when the classification rules belong to some VC-class under margin conditions and discuss the optimality of these bounds in a minimax sense.
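For readers less familiar with the setup, the LaTeX sketch below spells out the standard empirical-risk-minimization framework and one margin condition of the kind invoked in the abstract. The notation ($\eta$, $f^*$, $h$, $V$) is the usual textbook one, assumed here for illustration; the display is background only, not a restatement of the paper's theorems.

% Background notation only: the standard ERM setup in binary
% classification and a hard-margin condition; illustrative sketch,
% not a statement of the paper's results.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

Let $(X_1,Y_1),\dots,(X_n,Y_n)$ be i.i.d.\ copies of $(X,Y)$ with
$Y\in\{0,1\}$, let $\eta(x)=\mathbb{P}(Y=1\mid X=x)$, and let
$\mathcal{F}$ be a class of classifiers $f\colon\mathcal{X}\to\{0,1\}$.
The empirical risk minimizer is
\[
  \hat f_n \in \operatorname*{arg\,min}_{f\in\mathcal{F}}
  \frac1n\sum_{i=1}^{n}\mathbf{1}\{f(X_i)\neq Y_i\},
\]
and its excess risk relative to the Bayes classifier
$f^*(x)=\mathbf{1}\{\eta(x)\ge 1/2\}$ is
\[
  \ell\bigl(f^*,\hat f_n\bigr)
  = \mathbb{P}\bigl(\hat f_n(X)\neq Y\bigr)
  - \mathbb{P}\bigl(f^*(X)\neq Y\bigr).
\]
One margin condition of this type (a hard-margin variant; Tsybakov's
version instead bounds $\mathbb{P}(|2\eta(X)-1|\le t)$ polynomially in
$t$) asks that, for some $h\in(0,1]$,
\[
  |2\eta(X)-1| \ge h \quad \text{almost surely}.
\]
Such a condition rules out problems where $\eta$ hovers near $1/2$ and
can yield excess-risk rates for the ERM over a VC-class of dimension
$V$ that are faster than the distribution-free order $\sqrt{V/n}$.

\end{document}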
Pascal Massart
Élodie Nédélec