Fast rates for support vector machines using Gaussian kernels

Mathematics – Statistics Theory

Scientific paper

Details

Published at http://dx.doi.org/10.1214/009053606000001226 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics.

10.1214/009053606000001226

For binary classification we establish learning rates up to the order of $n^{-1}$ for support vector machines (SVMs) with hinge loss and Gaussian RBF kernels. These rates are in terms of two assumptions on the considered distributions: Tsybakov's noise assumption to establish a small estimation error, and a new geometric noise condition which is used to bound the approximation error. Unlike previously proposed concepts for bounding the approximation error, the geometric noise assumption does not employ any smoothness assumption.
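To make the setting of the abstract concrete, the following minimal sketch (not taken from the paper) fits the kind of classifier it studies, an SVM with hinge loss and a Gaussian RBF kernel, on synthetic binary data using scikit-learn. The dataset and the values of the kernel width gamma and the regularization parameter C are arbitrary choices for illustration only.

# Illustrative sketch: an SVM with hinge loss and Gaussian RBF kernel,
# the estimator whose learning rates the paper analyzes.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic binary classification data (arbitrary sizes for demonstration).
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# SVC solves the soft-margin (hinge-loss) SVM; kernel="rbf" is the Gaussian
# kernel exp(-gamma * ||x - x'||^2). gamma and C are hypothetical choices.
clf = SVC(kernel="rbf", gamma=0.1, C=1.0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))

The paper's results concern how quickly the excess classification error of such estimators can decay as the sample size n grows, reaching rates up to the order of n^{-1} under the stated noise conditions.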
