Computer Science – Information Theory
Scientific paper
2011-06-09
Entropy 2011, 13(11), 1945-1957
11 pages LaTeX, minor revision
10.3390/e13111945
There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the "information loss", or change in entropy, associated with a measure-preserving function. Information loss is a special case of conditional entropy: namely, it is the entropy of a random variable conditioned on some function of that variable. We show that Shannon entropy gives the only concept of information loss that is functorial, convex-linear and continuous. This characterization naturally generalizes to Tsallis entropy as well.
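The notion of information loss described in the abstract can be illustrated concretely. The following sketch (not from the paper; function names are my own) computes the Shannon entropy of a finite distribution and the loss H(p) − H(f∗p) incurred by pushing the distribution forward along a map f, which merges outcomes:

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in bits) of a finite probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def pushforward(p, f):
    """Push a distribution p on {0, ..., n-1} forward along the map f."""
    q = {}
    for i, x in enumerate(p):
        q[f(i)] = q.get(f(i), 0.0) + x
    return list(q.values())

def information_loss(p, f):
    """Entropy lost by applying f: H(p) - H(f_* p).

    This is the 'change in entropy' associated with a
    measure-preserving function, as in the abstract.
    """
    return shannon_entropy(p) - shannon_entropy(pushforward(p, f))

# Example: merge the four equally likely outcomes pairwise.
p = [0.25, 0.25, 0.25, 0.25]
f = lambda i: i // 2   # identifies {0,1} and {2,3}
print(information_loss(p, f))   # 2 bits - 1 bit = 1.0 bit lost
```

Note the loss is always nonnegative, since a measure-preserving function can only coarsen the distribution; it is zero exactly when f is injective on the support of p.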
John C. Baez
Tobias Fritz
Tom Leinster