Margin-adaptive model selection in statistical learning

Scientific paper – Mathematics, Statistics Theory
Submitted: 2008-04-18
Journal reference: Bernoulli 17, 2 (2011) 687-713
Published at http://dx.doi.org/10.3150/10-BEJ288 in Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society
DOI: 10.3150/10-BEJ288
A classical condition for fast learning rates is the margin condition, first introduced by Mammen and Tsybakov. In this paper, we tackle the problem of adaptivity to this condition in the context of model selection, within a general learning framework. In fact, we consider a weaker version of this condition, which takes into account that learning within a small model can be much easier than learning within a large one. Requiring this "strong margin adaptivity" makes the model selection problem more challenging. We first prove, in a general framework, that some penalization procedures (including those based on local Rademacher complexities) achieve this adaptivity when the models are nested. In contrast to previous results, this holds with penalties that depend only on the data. Our second main result is that strong margin adaptivity is not always achievable when the models are not nested: for every model selection procedure (even a randomized one), there exists a problem for which it fails to be strongly margin adaptive.
Authors: Sylvain Arlot and Peter L. Bartlett
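For readers unfamiliar with the margin condition mentioned in the abstract, the sketch below states its standard Mammen–Tsybakov form for binary classification, together with the generic shape of a penalized model selection rule. This is background illustration, not a quotation from the paper: the notation (regression function η, Bayes classifier f*, risk R, noise exponent α, models S_m, empirical risk minimizers, penalty pen) is the usual one and is assumed here for concreteness.

% Setting (standard, assumed for illustration): binary classification with
% (X, Y) in \mathcal{X} \times \{0, 1\}, regression function
% \eta(x) = \mathbb{P}(Y = 1 \mid X = x), Bayes classifier f^*,
% and risk R(f) = \mathbb{P}(f(X) \neq Y).

% Mammen--Tsybakov margin (noise) condition with exponent \alpha > 0:
% there exists C > 0 such that, for every t > 0,
\[
  \mathbb{P}\bigl( 0 < \lvert 2\eta(X) - 1 \rvert \le t \bigr) \le C\, t^{\alpha} .
\]

% Its key consequence is a variance--excess-risk link: for every classifier f,
\[
  \operatorname{Var}\bigl( \mathbf{1}_{\{f(X) \neq Y\}} - \mathbf{1}_{\{f^{*}(X) \neq Y\}} \bigr)
  \le c \, \bigl( R(f) - R(f^{*}) \bigr)^{\alpha/(1+\alpha)} ,
\]
% which is what makes learning rates faster than n^{-1/2} possible.

% Generic penalized model selection over a collection of models (S_m)_{m \in \mathcal{M}},
% with empirical risk \widehat{R}_n and empirical risk minimizers \hat{f}_m:
\[
  \hat{m} \in \operatorname*{arg\,min}_{m \in \mathcal{M}}
    \Bigl\{ \widehat{R}_n\bigl(\hat{f}_m\bigr) + \operatorname{pen}(m) \Bigr\} ,
\]
% where, in the data-dependent procedures the abstract refers to, pen(m) is
% computed from the sample alone, e.g., via a local Rademacher complexity of S_m.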