An invariance property of kernel based predictors
Scientific paper
Physics – Condensed Matter – Disordered Systems and Neural Networks
2005-02-21
Revised version in press in Neural Computation
We consider kernel-based learning methods for regression and analyze what happens to the risk minimizer when new variables, statistically independent of the input and target variables, are added to the set of input variables; this problem arises, for example, in the detection of causality relations between two time series. We find that the risk minimizer remains unchanged if we constrain the risk minimization to hypothesis spaces induced by suitable kernel functions. We show that not all kernel-induced hypothesis spaces enjoy this property. We present sufficient conditions ensuring that the risk minimizer does not change, and show that they hold for inhomogeneous polynomial and Gaussian RBF kernels. We also provide examples of kernel-induced hypothesis spaces whose risk minimizer changes if independent variables are added as input.
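As a hedged illustration of the stated property (this sketch is not from the paper; the use of scikit-learn's KernelRidge, the choice of regression function, and all kernel parameters are assumptions made here), the snippet below fits a regularized kernel regressor with a Gaussian RBF kernel on an input x, and again on (x, z) with z drawn independently of both x and the target. Since both fits are finite-sample approximations of the same risk minimizer, their predictions should nearly coincide for large samples.

```python
# Illustrative sketch only (not from the paper): empirically probe the
# invariance of the (approximate) risk minimizer under a Gaussian RBF kernel
# when a variable independent of input and target is appended to the input.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
n = 2000

x = rng.uniform(-1.0, 1.0, size=(n, 1))                # original input
z = rng.normal(size=(n, 1))                            # independent extra variable
y = np.sin(3.0 * x[:, 0]) + 0.1 * rng.normal(size=n)   # target depends on x only

x_test = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
z_test = rng.normal(size=(200, 1))

# Regularized empirical approximation of the risk minimizer on x alone.
model_x = KernelRidge(kernel="rbf", gamma=1.0, alpha=1e-2).fit(x, y)
pred_x = model_x.predict(x_test)

# Same estimator with the independent variable z appended to the input.
model_xz = KernelRidge(kernel="rbf", gamma=1.0, alpha=1e-2).fit(np.hstack([x, z]), y)
pred_xz = model_xz.predict(np.hstack([x_test, z_test]))

# The two predictors should agree up to finite-sample and regularization
# effects; the discrepancy shrinks as n grows.
print("max |difference|:", float(np.max(np.abs(pred_x - pred_xz))))
```

The comparison is only meant to make the abstract's claim concrete: with a Gaussian RBF kernel the regression estimate essentially ignores the independent variable z, whereas the paper shows that this behavior is not guaranteed for every kernel-induced hypothesis space.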
Nicola Ancona
Sebastiano Stramaglia