On Measure Theoretic definitions of Generalized Information Measures and Maximum Entropy Prescriptions

Computer Science – Information Theory

Scientific paper


Although the Shannon entropy of a probability measure $P$ on a measure space $(X, \mathfrak{M}, \mu)$, defined as $- \int_{X} \frac{\mathrm{d}P}{\mathrm{d}\mu} \ln \frac{\mathrm{d}P}{\mathrm{d}\mu} \,\mathrm{d}\mu$, does not qualify as an information measure (it is not a natural extension of the discrete case), maximum entropy (ME) prescriptions in the measure-theoretic case are consistent with those of the discrete case. In this paper, we study measure-theoretic definitions of generalized information measures and discuss the corresponding ME prescriptions. We present two results: (i) we prove that, as in the case of classical relative entropy, the measure-theoretic definitions of the generalized relative entropies (R\'{e}nyi and Tsallis) are natural extensions of their respective discrete cases; (ii) we show that the ME prescriptions of measure-theoretic Tsallis entropy are consistent with the discrete case.
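As a minimal illustrative sketch (not taken from the paper), the discrete forms of the generalized relative entropies mentioned in the abstract can be computed directly. The standard definitions are $D_\alpha(P\|Q) = \frac{1}{\alpha-1}\ln\sum_i p_i^\alpha q_i^{1-\alpha}$ (Rényi) and $D_q(P\|Q) = \frac{\sum_i p_i^q q_i^{1-q} - 1}{q-1}$ (Tsallis); both recover the classical Kullback-Leibler relative entropy as the index tends to 1. The function names below are illustrative choices, not from the paper.

```python
import math

def renyi_relative_entropy(p, q, alpha):
    # Discrete Rényi relative entropy: (1/(alpha-1)) ln sum_i p_i^alpha q_i^(1-alpha), alpha != 1
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1)

def tsallis_relative_entropy(p, q, idx):
    # Discrete Tsallis relative entropy: (sum_i p_i^idx q_i^(1-idx) - 1) / (idx - 1), idx != 1
    s = sum(pi ** idx * qi ** (1 - idx) for pi, qi in zip(p, q))
    return (s - 1) / (idx - 1)

def kl_divergence(p, q):
    # Classical (Kullback-Leibler) relative entropy, the common limit of both families
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

For an index close to 1, both generalized divergences agree with the KL divergence to within the discretization of the limit, which is the discrete analogue of the extension property the paper proves in the measure-theoretic setting.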


