An Empirical Study of MDL Model Selection with Infinite Parametric Complexity

Computer Science – Learning

Scientific paper


Details

23 pages, 11 graphs


Parametric complexity is a central concept in MDL model selection. In practice it often turns out to be infinite, even for quite simple models such as the Poisson and Geometric families. In such cases, MDL model selection based on NML and Bayesian inference based on Jeffreys' prior cannot be used. Several ways to resolve this problem have been proposed. We conduct experiments to compare and evaluate their behaviour on small sample sizes. We find surprisingly poor behaviour for the plug-in predictive code; a restricted NML model performs quite well, but it is questionable whether the results validate its theoretical motivation. The Bayesian model with the improper Jeffreys' prior is the most dependable.
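The divergence the abstract refers to is easy to see numerically. For a single Poisson observation x, the maximized likelihood is attained at lambda = x, giving the term e^(-x) x^x / x!, which by Stirling's approximation behaves like 1/sqrt(2*pi*x); summing over all x therefore diverges, so the NML normalizer (the parametric complexity) is infinite. The sketch below is illustrative only (not from the paper); the function names are my own:

```python
import math

def ml_term(x):
    # Maximized Poisson likelihood for one observation x: the ML
    # estimate is lambda = x, giving e^{-x} x^x / x! (1.0 at x = 0).
    if x == 0:
        return 1.0
    # Computed in log space for numerical stability; lgamma(x+1) = ln(x!).
    return math.exp(-x + x * math.log(x) - math.lgamma(x + 1))

def partial_complexity(x_max):
    # Partial sum of the NML normalizer over outcomes 0..x_max.
    return sum(ml_term(x) for x in range(x_max + 1))

# Each term decays only like 1/sqrt(2*pi*x), so the partial sums
# grow without bound: the parametric complexity is infinite.
for m in (10, 100, 1000, 10000):
    print(m, partial_complexity(m))
```

Because the terms decay like a p-series with p = 1/2, the partial sums grow roughly like sqrt(x_max), which is why truncating the outcome space (as in restricted NML) is one of the remedies the paper compares.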

