Algorithmic Complexity Bounds on Future Prediction Errors

Computer Science – Learning

Scientific paper


Details

21 pages

DOI: 10.1016/j.ic.2006.10.004

We bound the future loss when predicting any (computably) stochastic sequence online. Solomonoff finitely bounded the total deviation of his universal predictor $M$ from the true distribution $\mu$ by the algorithmic complexity of $\mu$. Here we assume we are at a time $t>1$ and have already observed $x = x_1 \ldots x_t$. We bound the future prediction performance on $x_{t+1} x_{t+2} \ldots$ by a new variant of the algorithmic complexity of $\mu$ given $x$, plus the complexity of the randomness deficiency of $x$. The new complexity is monotone in its condition, in the sense that it can only decrease if the condition is prolonged. We also briefly discuss potential generalizations to Bayesian model classes and to classification problems.
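The bounds in the abstract rest on a dominance argument: a mixture predictor assigns every sequence at least a constant fraction of the probability the true model assigns it, so its cumulative log-loss regret is bounded by the "complexity" (negative log prior weight) of the true model. The following minimal Python sketch illustrates this for a finite Bayesian class of Bernoulli models; it is an illustrative toy, not the paper's construction, and the model class, prior, and parameter values (thetas, true_theta, n) are hypothetical choices.

    # Toy illustration (not from the paper) of the dominance argument behind
    # Solomonoff-style bounds: for a Bayesian mixture xi over a finite class,
    # the cumulative log-loss of xi exceeds that of the true model mu by at
    # most ln(1/w_mu), the negative log prior weight of mu.

    import math
    import random

    random.seed(0)

    # Hypothetical finite model class: Bernoulli(theta) for a few biases.
    thetas = [0.1, 0.3, 0.5, 0.7, 0.9]
    prior = [1.0 / len(thetas)] * len(thetas)   # uniform prior weights w_nu
    true_theta = 0.7                            # mu = Bernoulli(0.7)

    def bernoulli_logp(theta, bit):
        """Log-probability of a single bit under Bernoulli(theta)."""
        return math.log(theta if bit == 1 else 1.0 - theta)

    n = 2000
    log_mu = 0.0    # cumulative log-likelihood of the true model
    log_xi = 0.0    # cumulative log-likelihood of the mixture
    posterior = prior[:]

    for _ in range(n):
        bit = 1 if random.random() < true_theta else 0
        # Mixture's predictive probability for this bit.
        p_xi = sum(w * (t if bit == 1 else 1.0 - t)
                   for w, t in zip(posterior, thetas))
        log_xi += math.log(p_xi)
        log_mu += bernoulli_logp(true_theta, bit)
        # Standard Bayesian posterior update.
        posterior = [w * (t if bit == 1 else 1.0 - t) / p_xi
                     for w, t in zip(posterior, thetas)]

    regret = log_mu - log_xi        # cumulative log-loss regret of xi vs. mu
    bound = math.log(len(thetas))   # ln(1/w_mu) for the uniform prior
    print(f"regret = {regret:.4f} <= bound = {bound:.4f}")

Running the sketch prints a regret that stays below the bound ln 5, about 1.61. The same inequality with the universal prior, $M(x) \ge 2^{-K(\mu)} \mu(x)$, gives Solomonoff's total bound with $\ln(1/w_\mu)$ replaced by $K(\mu) \ln 2$; the paper's contribution is the analogous bound for the future loss after conditioning on an observed prefix $x$.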


Profile ID: LFWR-SCP-O-70541
