Convergence of Discrete MDL for Sequential Prediction
Scientific paper – Computer Science: Learning
2004-04-28
Proc. 17th Annual Conf. on Learning Theory (COLT-2004), pages 300--314
17 pages
We study the properties of the Minimum Description Length (MDL) principle for sequence prediction, considering a two-part MDL estimator chosen from a countable class of models. This applies in particular to the important case of universal sequence prediction, where the model class corresponds to all algorithms for some fixed universal Turing machine (the correspondence is via enumerable semimeasures, hence the resulting models are stochastic). We prove convergence theorems similar to Solomonoff's theorem of universal induction, which also holds for general Bayes mixtures. However, the bound characterizing the convergence speed of MDL predictions is exponentially larger than the corresponding bound for Bayes mixtures. We observe that there are at least three different ways of using MDL for prediction. One of them has worse prediction properties: its predictions converge only if the MDL estimator stabilizes. We establish sufficient conditions for this stabilization to occur. Finally, some immediate consequences for complexity relations and randomness criteria are proven.
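To make the two-part MDL estimator concrete, here is a minimal sketch in Python. It is not the paper's construction: the paper works with a countable class of enumerable semimeasures, whereas this illustration substitutes a small finite class of Bernoulli models with a uniform prior (THETAS and PRIOR are hypothetical names introduced here). The two parts of the code length are the model's description length, -log w(model), and the data's code length under that model, -log P_model(sequence); the estimator minimizes their sum and then predicts with the selected model.

```python
import math

# Illustrative (not the paper's) model class: Bernoulli(theta) with a
# uniform prior, standing in for a countable class of semimeasures.
THETAS = [0.1, 0.3, 0.5, 0.7, 0.9]
PRIOR = {theta: 1.0 / len(THETAS) for theta in THETAS}

def data_code_length(theta, seq):
    """Code length of seq (a string of '0'/'1') under Bernoulli(theta), in nats."""
    ones = seq.count("1")
    zeros = len(seq) - ones
    return -(ones * math.log(theta) + zeros * math.log(1 - theta))

def mdl_estimate(seq):
    """Two-part MDL: minimize model code length plus data code length."""
    return min(
        THETAS,
        key=lambda t: -math.log(PRIOR[t]) + data_code_length(t, seq),
    )

def predict_next(seq):
    """Predict P(next symbol = '1') using the current MDL estimator."""
    return mdl_estimate(seq)  # for a Bernoulli model, theta is P('1')

if __name__ == "__main__":
    history = "1101110111"
    print(f"MDL estimator: theta = {mdl_estimate(history)}")
    print(f"P(next = 1)  : {predict_next(history):.2f}")
```

Re-running mdl_estimate after each new symbol gives a sequential predictor; the abstract's stabilization question is whether the selected model eventually stops changing as the sequence grows.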
Marcus Hutter
Jan Poland