Computer Science – Information Theory
Scientific paper
2006-01-18
Though the Shannon entropy of a probability measure $P$ on a measure space $(X, \mathfrak{M}, \mu)$, defined as $- \int_{X} \frac{\mathrm{d}P}{\mathrm{d}\mu} \ln \frac{\mathrm{d}P}{\mathrm{d}\mu} \,\mathrm{d}\mu$, does not qualify as an information measure (it is not a natural extension of the discrete case), maximum entropy (ME) prescriptions in the measure-theoretic case are consistent with those of the discrete case. In this paper, we study measure-theoretic definitions of generalized information measures and discuss the corresponding ME prescriptions. We present two results in this regard: (i) we prove that, as in the case of classical relative entropy, the measure-theoretic definitions of the generalized relative entropies (Rényi and Tsallis) are natural extensions of their respective discrete cases; (ii) we show that the ME prescriptions of measure-theoretic Tsallis entropy are consistent with those of the discrete case.
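For reference, here is a brief sketch of the standard definitions behind the abstract (the notation and these particular forms are assumptions, not quotations from the paper). For discrete distributions $p = (p_1, \dots, p_n)$ and $r = (r_1, \dots, r_n)$, the Rényi relative entropy of order $\alpha$ and the Tsallis relative entropy of index $q$ ($\alpha, q > 0$, $\alpha, q \neq 1$) are commonly written as

$$ I_\alpha(p \| r) = \frac{1}{\alpha - 1} \ln \sum_{k=1}^{n} p_k^{\alpha} r_k^{1-\alpha}, \qquad I_q(p \| r) = \frac{1}{q - 1} \Bigl( \sum_{k=1}^{n} p_k^{q} r_k^{1-q} - 1 \Bigr), $$

both of which recover the Kullback-Leibler relative entropy in the limit $\alpha \to 1$ (respectively $q \to 1$). Their measure-theoretic counterparts, for probability measures $P \ll R$ on $(X, \mathfrak{M})$, replace the sums by integrals of the Radon-Nikodym derivative:

$$ I_\alpha(P \| R) = \frac{1}{\alpha - 1} \ln \int_X \Bigl( \frac{\mathrm{d}P}{\mathrm{d}R} \Bigr)^{\alpha} \mathrm{d}R, \qquad I_q(P \| R) = \frac{1}{q - 1} \Bigl( \int_X \Bigl( \frac{\mathrm{d}P}{\mathrm{d}R} \Bigr)^{q} \mathrm{d}R - 1 \Bigr). $$

Result (i) asserts that these integral forms stand in the same relation to their discrete counterparts as classical relative entropy does to its discrete form, which is why they, unlike measure-theoretic Shannon entropy, qualify as natural extensions.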
Shalabh Bhatnagar
Ambedkar Dukkipati
M. Narasimha Murty