Falsification and future performance

Statistics – Machine Learning

Scientific paper


Details

10 pages, 2 figures


We information-theoretically reformulate two measures of capacity from statistical learning theory: empirical VC-entropy and empirical Rademacher complexity. We show that these capacity measures count the number of hypotheses about a dataset that a learning algorithm falsifies when it finds the classifier in its repertoire that minimizes empirical risk. It then follows that the future performance of predictors on unseen data is controlled in part by how many hypotheses the learner falsifies. As a corollary, we show that empirical VC-entropy quantifies the message length of the true hypothesis in the optimal code of a particular probability distribution, the so-called actual repertoire.
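To make the two capacity measures concrete, the sketch below estimates both on a finite repertoire of classifiers represented by their label vectors on a sample. This is a minimal illustration of the standard definitions, not the paper's construction: the function names and the toy threshold class are hypothetical, and the choice of log base 2 (bits) is a convention.

```python
import numpy as np

def empirical_vc_entropy(predictions):
    """Empirical VC-entropy: log of the number of distinct labelings
    (dichotomies) the repertoire realizes on the sample.
    `predictions` is an (m, n) array of {-1, +1}: m classifiers, n points."""
    dichotomies = {tuple(row) for row in predictions}
    return np.log2(len(dichotomies))  # base 2 (bits) is a convention

def empirical_rademacher(predictions, n_draws=1000, seed=None):
    """Monte Carlo estimate of empirical Rademacher complexity:
    E_sigma[ max_f (1/n) * sum_i sigma_i * f(x_i) ] over the sample."""
    rng = np.random.default_rng(seed)
    m, n = predictions.shape
    sigma = rng.choice([-1.0, 1.0], size=(n_draws, n))  # Rademacher signs
    # correlations[d, f] = (1/n) * sum_i sigma[d, i] * predictions[f, i]
    correlations = sigma @ predictions.T / n
    return correlations.max(axis=1).mean()

# Toy example: four threshold classifiers on five one-dimensional points.
x = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
thresholds = np.array([0.0, 0.4, 0.6, 1.0])
preds = np.where(x[None, :] >= thresholds[:, None], 1, -1)
print(empirical_vc_entropy(preds))        # log2(4) = 2 bits
print(empirical_rademacher(preds, seed=0))
```

On this toy class the empirical VC-entropy is log2(4) = 2 bits: the repertoire realizes four dichotomies on the sample, so in the abstract's reading, selecting the empirical-risk minimizer falsifies the three remaining hypotheses about the data.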



