Relative Entropy and Statistics

Computer Science – Information Theory


Details

31 pages. 2 figures.

Scientific paper

Formalising the confrontation of opinions (models) with observations (data) is the task of Inferential Statistics. Information Theory provides us with a basic functional, the relative entropy (or Kullback-Leibler divergence), an asymmetric measure of dissimilarity between the empirical and the theoretical distributions. The formal properties of the relative entropy turn out to capture every aspect of Inferential Statistics, as illustrated here, for simplicity, on dice (i.e. i.i.d. processes with finitely many outcomes): refutability (strict or probabilistic) and the data/model asymmetry; small deviations and the rejection of a single hypothesis; competition between hypotheses and model selection; maximum likelihood, model inference and its limits; maximum entropy and the reconstruction of partially observed data; the EM algorithm; flow data and gravity modelling; and determining the order of a Markov chain.
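As a minimal illustration of the functional the abstract is built around (a sketch, not code from the paper; the function name relative_entropy and the sample counts are assumptions made for the example), the relative entropy D(p_hat || q) = sum_i p_hat_i * log(p_hat_i / q_i) between the empirical distribution p_hat of observed die rolls and a candidate model q can be computed as follows in Python:

import numpy as np

def relative_entropy(p_hat, q):
    # Kullback-Leibler divergence D(p_hat || q) in nats.
    # Asymmetric: D(p_hat || q) != D(q || p_hat) in general, and the value is
    # infinite if the data put mass on an outcome the model declares impossible.
    p_hat = np.asarray(p_hat, dtype=float)
    q = np.asarray(q, dtype=float)
    support = p_hat > 0
    if np.any(q[support] == 0):
        return float("inf")
    return float(np.sum(p_hat[support] * np.log(p_hat[support] / q[support])))

# Hypothetical example: 60 rolls of a die summarised as counts of outcomes 1..6,
# compared against the fair-die model q = (1/6, ..., 1/6).
counts = np.array([8, 12, 9, 11, 7, 13])
p_hat = counts / counts.sum()
q_fair = np.full(6, 1.0 / 6.0)
print(relative_entropy(p_hat, q_fair))  # small value: data compatible with fairness

The asymmetry of the two arguments is exactly the data/model asymmetry the abstract refers to: the empirical distribution goes in the first slot, the theoretical one in the second.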

