Asymptotic Normality of Support Vector Machine Variants and Other Regularized Kernel Methods
Statistics – Machine Learning
Scientific paper
2010-10-04
In nonparametric classification and regression problems, regularized kernel methods, and in particular support vector machines, attract considerable attention in both theoretical and applied statistics. In an abstract sense, regularized kernel methods (simply called SVMs here) can be seen as regularized M-estimators for a parameter in a (typically infinite-dimensional) reproducing kernel Hilbert space. For smooth loss functions, it is shown that the difference between the estimator, i.e.\ the empirical SVM, and the theoretical SVM is asymptotically normal with rate $\sqrt{n}$: the standardized difference converges weakly to a Gaussian process in the reproducing kernel Hilbert space. As is common in applications, the choice of the regularization parameter may depend on the data. The proof proceeds by applying the functional delta-method and showing that the SVM functional is suitably Hadamard-differentiable.
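The result can be illustrated numerically. The following is a minimal simulation sketch, not taken from the paper: it uses kernel ridge regression (a regularized kernel method with the smooth squared loss) as the SVM, approximates the theoretical SVM $f_{P,\lambda}$ by fitting the same estimator once on a much larger sample, and then checks that $\sqrt{n}\,(f_n(x_0) - f_{P,\lambda}(x_0))$ looks roughly Gaussian across Monte Carlo replications. All function names, the choice of Gaussian kernel, and every parameter value (gamma, lam, n, x0) are illustrative assumptions, not quantities from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss_kernel(X, Z, gamma=10.0):
    # Gaussian RBF kernel matrix between the rows of X and the rows of Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def fit_krr(X, y, lam, gamma=10.0):
    # Empirical regularized kernel estimator with squared loss:
    #   argmin_f (1/n) * sum_i (y_i - f(x_i))^2 + lam * ||f||_H^2,
    # whose representer-theorem solution solves (K + n*lam*I) alpha = y.
    n = len(X)
    K = gauss_kernel(X, X, gamma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def predict(alpha, Xtr, Xte, gamma=10.0):
    return gauss_kernel(Xte, Xtr, gamma) @ alpha

n, lam = 500, 0.05
x0 = np.array([[0.5]])                       # evaluation point in the input space
f_true = lambda X: np.sin(2 * np.pi * X[:, 0])

# Stand-in for the theoretical SVM f_{P,lambda}(x0): the same estimator
# fitted once on a much larger sample from P (an approximation, not exact).
Xb = rng.uniform(0.0, 1.0, (4000, 1))
yb = f_true(Xb) + rng.normal(0.0, 0.3, 4000)
f_inf = predict(fit_krr(Xb, yb, lam), Xb, x0)[0]

# Monte Carlo replications of sqrt(n) * (f_n(x0) - f_inf(x0)); by the
# asymptotic normality result, this should look approximately Gaussian.
stats = []
for _ in range(300):
    X = rng.uniform(0.0, 1.0, (n, 1))
    y = f_true(X) + rng.normal(0.0, 0.3, n)
    f_n = predict(fit_krr(X, y, lam), X, x0)[0]
    stats.append(np.sqrt(n) * (f_n - f_inf))

print("mean:", np.mean(stats), "sd:", np.std(stats))
```

A histogram of stats should look approximately bell-shaped. Note that the sketch evaluates the process at a single point $x_0$ (a one-dimensional marginal of the Gaussian process limit), keeps $\lambda$ fixed rather than data-dependent, and inherits some bias from approximating $f_{P,\lambda}$ by a large-sample fit.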