Computer Science – Learning
Scientific paper
2010-11-23
S. Sabato, N. Srebro and N. Tishby, Tight Sample Complexity of Large-Margin Learning, Advances in Neural Information Processing Systems (NIPS), 2010.
Appearing in Neural Information Processing Systems (NIPS) 2010. This is the full version, including the appendix with proofs.
We obtain a tight distribution-specific characterization of the sample complexity of large-margin classification with L_2 regularization. We introduce the \gamma-adapted-dimension, a simple function of the spectrum of a distribution's covariance matrix, and show distribution-specific upper and lower bounds on the sample complexity, both governed by the \gamma-adapted-dimension of the source distribution. We conclude that this new quantity tightly characterizes the true sample complexity of large-margin classification. The bounds hold for a rich family of sub-Gaussian distributions.
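The abstract states only that the \gamma-adapted-dimension is a simple function of the spectrum of the covariance matrix; the precise definition is given in the paper's body. As a hedged sketch, assuming the definition used there (the smallest k such that k\gamma^2 is at least the sum of the covariance eigenvalues beyond the k largest), the quantity could be computed as follows; the function name `gamma_adapted_dimension` is illustrative, not from the paper:

```python
import numpy as np

def gamma_adapted_dimension(cov, gamma):
    """Illustrative sketch of the gamma-adapted-dimension.

    Assumes the definition: the smallest k such that
    k * gamma^2 >= sum of eigenvalues of `cov` beyond the top k.
    `cov` is a symmetric positive semi-definite covariance matrix.
    """
    # Eigenvalues of a symmetric matrix, sorted in descending order.
    lam = np.sort(np.linalg.eigvalsh(cov))[::-1]
    d = len(lam)
    for k in range(d + 1):
        # Tail sum of eigenvalues below the top k.
        if k * gamma ** 2 >= lam[k:].sum():
            return k
    return d

# Example: for an isotropic distribution in R^4 with unit margin,
# each direction contributes equally, so roughly half the dimensions
# fall below the margin threshold.
print(gamma_adapted_dimension(np.eye(4), 1.0))
```

For an isotropic covariance the quantity scales with the ambient dimension, while for a rapidly decaying spectrum it stays small, which matches the intuition that large-margin learning is easy when most variance is concentrated in a few directions.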
Sivan Sabato
Nathan Srebro
Naftali Tishby