Kernels for Vector-Valued Functions: a Review

Statistics – Machine Learning

Scientific paper


Details


Kernel methods are among the most popular techniques in machine learning. From a frequentist/discriminative perspective they play a central role in regularization theory, as they provide a natural choice for the hypothesis space and the regularization functional through the notion of reproducing kernel Hilbert spaces. From a Bayesian/generative perspective they are key in the context of Gaussian processes, where the kernel function is also known as the covariance function. Traditionally, kernel methods have been used in supervised learning problems with scalar outputs, and indeed a considerable amount of work has been devoted to designing and learning kernels. More recently there has been increasing interest in methods that deal with multiple outputs, motivated in part by frameworks such as multitask learning. In this paper, we review different methods to design or learn valid kernel functions for multiple outputs, paying particular attention to the connection between probabilistic and functional methods.
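As a concrete illustration of the kind of multi-output kernel the paper surveys, the sketch below builds a separable ("intrinsic coregionalization") kernel, K((x, d), (x', d')) = B[d, d'] k(x, x'), where k is a scalar kernel and B is a positive semi-definite matrix coupling the D outputs. This is a minimal NumPy sketch under those assumptions, not code from the paper; the function names and parameter values are illustrative.

import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    """Scalar RBF (squared-exponential) kernel matrix between two input sets."""
    sq_dists = (
        np.sum(X1 ** 2, axis=1)[:, None]
        + np.sum(X2 ** 2, axis=1)[None, :]
        - 2.0 * X1 @ X2.T
    )
    return np.exp(-0.5 * sq_dists / lengthscale ** 2)

def icm_kernel(X1, X2, B, lengthscale=1.0):
    """Multi-output Gram matrix for the separable kernel B[d, d'] * k(x, x').

    Rows and columns are ordered output-by-output, so block (d, d') equals
    B[d, d'] * k(X1, X2); this is exactly a Kronecker product.
    """
    return np.kron(B, rbf_kernel(X1, X2, lengthscale))

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(5, 1))       # 5 one-dimensional inputs (toy data)
A = rng.normal(size=(2, 2))
B = A @ A.T                                    # PSD coregionalization matrix, 2 outputs
K = icm_kernel(X, X, B)
print(K.shape)                                 # (10, 10): 2 outputs x 5 inputs
print(np.all(np.linalg.eigvalsh(K) > -1e-9))   # full multi-output covariance is PSD

Because the resulting covariance is a Kronecker product, downstream computations can exploit the eigendecompositions of B and of the scalar Gram matrix separately, which is one of the efficiency arguments made for separable multi-output kernels in the literature this paper reviews.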



Profile ID: LFWR-SCP-O-35353
