On the Convergence Speed of MDL Predictions for Bernoulli Sequences

Computer Science – Learning

Scientific paper


Details

17 pages


We consider the Minimum Description Length principle for online sequence prediction. If the underlying model class is discrete, then the total expected square loss is a particularly interesting performance measure: (a) this quantity is bounded, implying convergence with probability one, and (b) it additionally specifies a "rate of convergence". Generally, for MDL only exponential loss bounds hold, as opposed to the linear bounds for a Bayes mixture. We show that this is even the case if the model class contains only Bernoulli distributions. We derive a new upper bound on the prediction error for countable Bernoulli classes. This implies a small bound (comparable to the one for Bayes mixtures) for certain important model classes. The results apply to many Machine Learning tasks including classification and hypothesis testing. We provide arguments that our theorems generalize to countable classes of i.i.d. models.
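To make the setting concrete, the following is a minimal, hypothetical Python sketch (not code from the paper) of two-part MDL prediction over a small countable class of Bernoulli models. The parameter grid, the assumed description lengths K(theta_k), and the true parameter are all illustrative choices; the sketch accumulates the squared distance between the MDL prediction and the true parameter, i.e. the kind of total square loss whose sum the paper bounds.

```python
# Hypothetical sketch: two-part MDL prediction for a Bernoulli sequence.
# All model-class details (grid, description lengths) are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Countable (here: finite) class of Bernoulli parameters theta_k, each with
# an assumed prefix description length K(theta_k) = k + 1 bits.
thetas = np.array([(k + 1) / 12 for k in range(11)])  # 1/12, ..., 11/12
code_len = np.arange(1, len(thetas) + 1)              # assumed K(theta_k)

theta_true = thetas[4]           # true parameter lies in the class
T = 2000
x = rng.random(T) < theta_true   # Bernoulli(theta_true) sequence

ones = 0
total_sq_loss = 0.0
for t in range(T):
    n1, n0 = ones, t - ones
    # Two-part MDL: minimize K(theta) plus the code length of the data
    # under theta (both in bits).
    data_len = -(n1 * np.log2(thetas) + n0 * np.log2(1 - thetas))
    k_star = np.argmin(code_len + data_len)
    pred = thetas[k_star]                      # MDL prediction of P(x_t = 1)
    total_sq_loss += (pred - theta_true) ** 2  # instantaneous square loss
    ones += int(x[t])

print(f"cumulative squared loss after {T} steps: {total_sq_loss:.3f}")
```

In such a simulation the cumulative squared loss typically stays bounded as T grows, which is the qualitative behavior described in the abstract; the paper's contribution is quantifying how large that bound can be for MDL compared to a Bayes mixture.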

