Characterizing the Sample Complexity of Large-Margin Learning With Second-Order Statistics

Statistics – Machine Learning

Scientific paper


Details

An extended and revised version of "Tight Sample Complexity of Large-Margin Learning", NIPS 2010.

We obtain a tight distribution-specific characterization of the sample complexity of large-margin classification with L_2 regularization: we introduce the margin-adapted dimension, a simple function of the second-order statistics of the data distribution, and show distribution-specific upper and lower bounds on the sample complexity, both governed by the margin-adapted dimension of the data distribution. The upper bounds are universal, and the lower bounds hold for a rich family of sub-Gaussian distributions. We conclude that this new quantity tightly characterizes the true sample complexity of large-margin classification.
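Since the margin-adapted dimension depends on the data only through its second-order statistics, it can be estimated from a sample's spectrum. The sketch below (Python) is an illustration, not the authors' code: it assumes the definition used in the NIPS 2010 version, namely the smallest k such that k * gamma^2 is at least the sum of the eigenvalues of the uncentered second-moment matrix beyond the k largest ones; the name margin_adapted_dimension and its arguments are introduced here purely for illustration.

import numpy as np

def margin_adapted_dimension(X, gamma):
    # X: (n, d) data matrix; gamma: margin parameter (assumed > 0).
    # Assumed definition: the smallest k with k * gamma**2 >= sum_{i>k} lambda_i,
    # where lambda_1 >= ... >= lambda_d are the eigenvalues of the
    # uncentered second-moment matrix (1/n) X^T X.
    n, d = X.shape
    second_moment = (X.T @ X) / n
    eigvals = np.sort(np.linalg.eigvalsh(second_moment))[::-1]  # decreasing order
    for k in range(1, d + 1):
        if k * gamma ** 2 >= eigvals[k:].sum():
            return k
    return d

# Example: a distribution whose spectrum decays quickly has a small
# margin-adapted dimension even when the ambient dimension is large.
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 50)) * np.linspace(1.0, 0.01, 50)
print(margin_adapted_dimension(X, gamma=0.5))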


Profile ID: LFWR-SCP-O-213148
