On-line regression competitive with reproducing kernel Hilbert spaces

Computer Science – Learning

Scientific paper

Details

37 pages, 1 figure

We consider the problem of on-line prediction of real-valued labels, assumed bounded in absolute value by a known constant, of new objects from known labeled objects. The prediction algorithm's performance is measured by the squared deviation of the predictions from the actual labels. No stochastic assumptions are made about the way the labels and objects are generated. Instead, we are given a benchmark class of prediction rules some of which are hoped to produce good predictions. We show that for a wide range of infinite-dimensional benchmark classes one can construct a prediction algorithm whose cumulative loss over the first N examples does not exceed the cumulative loss of any prediction rule in the class plus O(sqrt(N)); the main differences from the known results are that we do not impose any upper bound on the norm of the considered prediction rules and that we achieve an optimal leading term in the excess loss of our algorithm. If the benchmark class is "universal" (dense in the class of continuous functions on each compact set), this provides an on-line non-stochastic analogue of universally consistent prediction in non-parametric statistics. We use two proof techniques: one is based on the Aggregating Algorithm and the other on the recently developed method of defensive forecasting.
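In symbols, with gamma_t denoting the algorithm's prediction at step t, y_t the observed label of the object x_t, and D any prediction rule in the benchmark class, the guarantee described above reads as follows (this is a paraphrase of the abstract, not a quotation of the paper's theorem; the constant hidden in the O(sqrt(N)) term depends on D and on the known bound on the labels):

\[
\sum_{t=1}^{N} (\gamma_t - y_t)^2 \;\le\; \sum_{t=1}^{N} \bigl(D(x_t) - y_t\bigr)^2 + O\!\left(\sqrt{N}\right).
\]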
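To make the prediction protocol concrete, the following is a minimal Python sketch of on-line regression under squared loss. A clipped kernel ridge regression forecaster stands in for the paper's algorithms (which are based on the Aggregating Algorithm and on defensive forecasting); the Gaussian kernel, the ridge parameter a, and the label bound Y are illustrative assumptions, not taken from the paper.

import numpy as np

def gaussian_kernel(u, v, sigma=1.0):
    # K(u, v) = exp(-||u - v||^2 / (2 sigma^2)), a kernel whose RKHS is
    # universal (dense in the continuous functions on each compact set).
    diff = np.asarray(u, dtype=float) - np.asarray(v, dtype=float)
    return np.exp(-np.dot(diff, diff) / (2.0 * sigma ** 2))

def online_regression(stream, kernel=gaussian_kernel, a=1.0, Y=1.0):
    # Run the on-line protocol on an iterable of (object, label) pairs.
    # At each step the forecaster predicts the label of the new object
    # from the previously seen labeled objects only, then observes the
    # true label (assumed to satisfy |y| <= Y) and suffers squared loss.
    xs, ys = [], []
    cumulative_loss = 0.0
    for x_t, y_t in stream:
        if xs:
            K = np.array([[kernel(xi, xj) for xj in xs] for xi in xs])
            k = np.array([kernel(xi, x_t) for xi in xs])
            # Kernel ridge regression on the past examples, clipped to [-Y, Y].
            coef = np.linalg.solve(K + a * np.eye(len(xs)), np.array(ys))
            prediction = float(np.clip(k @ coef, -Y, Y))
        else:
            prediction = 0.0  # nothing seen yet
        cumulative_loss += (prediction - y_t) ** 2
        xs.append(x_t)
        ys.append(y_t)
    return cumulative_loss

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # No stochastic assumptions are needed for the bound; a noisy sine
    # wave with labels clipped to [-1, 1] is used purely for illustration.
    stream = [([t / 10.0],
               float(np.clip(np.sin(t / 10.0) + 0.1 * rng.standard_normal(), -1.0, 1.0)))
              for t in range(200)]
    print("cumulative squared loss:", online_regression(stream))

Recomputing the ridge solution from scratch each round keeps the sketch short but costs O(t^3) per step; an incremental update of the inverse Gram matrix would normally be used instead.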

