Asymptotic Confidence Sets for General Nonparametric Regression and Classification by Regularized Kernel Methods
Statistics – Machine Learning
Scientific paper
2012-03-20
Regularized kernel methods such as support vector machines and least-squares support vector regression constitute an important class of standard learning algorithms in machine learning. Theoretical investigations of their asymptotic properties have mainly focused on rates of convergence in recent years, whereas only a few limited (asymptotic) results on statistical inference are available so far. As this is a serious limitation for their use in mathematical statistics, the goal of this article is to fill that gap. Based on the asymptotic normality of many of these methods, the article derives a strongly consistent estimator for the unknown covariance matrix of the limiting normal distribution. In this way, we obtain asymptotically correct confidence sets for $\psi(f_{P,\lambda_0})$, where $f_{P,\lambda_0}$ denotes the minimizer of the regularized risk in the reproducing kernel Hilbert space $H$ and $\psi:H\rightarrow\mathds{R}^m$ is any Hadamard-differentiable functional. Applications include (multivariate) pointwise confidence sets for values of $f_{P,\lambda_0}$ as well as confidence sets for gradients, integrals, and norms.
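To make the setting concrete, the following is a minimal illustrative sketch, not the paper's procedure: it fits a least-squares regularized kernel estimator (kernel ridge regression via the representer theorem) and approximates a pointwise confidence interval for the value at a test point with a naive nonparametric bootstrap. The bootstrap here is a stand-in for the strongly consistent plug-in covariance estimator derived in the article, and all parameter values (kernel width, regularization, sample size) are assumptions for the example.

import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between row sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_krr(X, y, lam=0.1, gamma=1.0):
    # Empirical minimizer of the regularized least-squares risk in the RKHS:
    # f = sum_i alpha_i k(., x_i) with alpha = (K + n*lam*I)^{-1} y.
    n = len(y)
    K = gaussian_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return lambda Z: gaussian_kernel(Z, X, gamma) @ alpha

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(200, 1))
y = np.sin(2 * X[:, 0]) + 0.3 * rng.standard_normal(200)
x0 = np.array([[0.5]])

# Point estimate f_{D,lambda}(x0) from the full sample.
f_hat = fit_krr(X, y)
point = f_hat(x0)[0]

# Naive bootstrap approximation of the sampling variability of f_{D,lambda}(x0);
# the article instead uses a consistent estimator of the asymptotic covariance,
# which yields asymptotically correct confidence sets.
boots = []
for _ in range(200):
    idx = rng.integers(0, len(y), len(y))
    boots.append(fit_krr(X[idx], y[idx])(x0)[0])
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"f(x0) ~= {point:.3f}, 95% bootstrap interval [{lo:.3f}, {hi:.3f}]")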