Mathematics – Statistics Theory
Scientific paper
2007-08-14
Annals of Statistics 2007, Vol. 35, No. 2, 575-607
Published at http://dx.doi.org/10.1214/009053606000001226 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics
10.1214/009053606000001226
For binary classification we establish learning rates up to the order of $n^{-1}$ for support vector machines (SVMs) with hinge loss and Gaussian RBF kernels. These rates are established under two assumptions on the underlying distribution: Tsybakov's noise assumption, which yields a small estimation error, and a new geometric noise condition, which is used to bound the approximation error. Unlike previously proposed concepts for bounding the approximation error, the geometric noise assumption does not rely on any smoothness assumption.
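For orientation, the standard forms of the main quantities referenced in the abstract are sketched below; the exact parametrizations used in the paper (for instance, of the Gaussian kernel width) may differ.

Hinge loss for a classifier $f$ and labels $y \in \{-1, +1\}$:
\[ L(y, f(x)) = \max\{0,\, 1 - y f(x)\}. \]
Gaussian RBF kernel with width parameter $\sigma > 0$ (one common parametrization):
\[ k_\sigma(x, x') = \exp\!\bigl(-\|x - x'\|_2^2 / \sigma^2\bigr). \]
Tsybakov's noise assumption with exponent $q \in [0, \infty]$: writing $\eta(x) = P(Y = 1 \mid X = x)$, there exists a constant $C > 0$ such that
\[ P_X\bigl(\{x : |2\eta(x) - 1| \le t\}\bigr) \le C\, t^q \quad \text{for all } t > 0. \]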
Ingo Steinwart
Clint Scovel