Information Inequalities for Joint Distributions, with Interpretations and Applications
Computer Science – Information Theory
Scientific paper
2008-12-31
IEEE Transactions on Information Theory, Vol. 56(6), pp. 2699-2713, June 2010
15 pages, 1 figure. Originally submitted to the IEEE Transactions on Information Theory in May 2007, the current version incor
Upper and lower bounds are obtained for the joint entropy of a collection of random variables in terms of an arbitrary collection of subset joint entropies. These inequalities generalize Shannon's chain rule for entropy as well as inequalities of Han, Fujishige, and Shearer. A duality between the upper and lower bounds for joint entropy is developed. All of these results are shown to be special cases of general, new results for submodular functions; thus, the inequalities presented constitute a richly structured class of Shannon-type inequalities. The new inequalities are applied to obtain new results in combinatorics, such as bounds on the number of independent sets in an arbitrary graph and the number of zero-error source-channel codes, as well as new determinantal inequalities in matrix theory. A new inequality for relative entropies is also developed, along with interpretations in terms of hypothesis testing. Finally, connections between these results and the literature in economics, computer science, and physics are explored.
Mokshay Madiman
Prasad Tetali
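As a concrete illustration of the subset-entropy bounds the abstract describes, the following minimal Python sketch (not from the paper; the distribution is a hypothetical example) numerically checks one classical special case, Han's inequality H(X_1,...,X_n) <= (1/(n-1)) * sum_i H(X_{[n] without i}), for three correlated bits.

import itertools
import math

def entropy(pmf):
    # Shannon entropy in bits of a pmf given as {outcome: probability}.
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(joint, coords):
    # Marginal pmf of the coordinates in `coords` from a joint pmf over tuples.
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in coords)
        out[key] = out.get(key, 0.0) + p
    return out

# Hypothetical joint distribution: X1, X2 independent uniform bits, X3 = X1 XOR X2.
joint = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in itertools.product([0, 1], repeat=2)}

n = 3
h_full = entropy(joint)  # joint entropy H(X1, X2, X3)

# Han's inequality: H(X1..Xn) <= (1/(n-1)) * sum over i of H(all X_j with j != i).
leave_one_out = [entropy(marginal(joint, tuple(j for j in range(n) if j != i)))
                 for i in range(n)]
upper = sum(leave_one_out) / (n - 1)

print(f"H(X1,X2,X3)     = {h_full:.3f} bits")   # 2.000
print(f"Han upper bound = {upper:.3f} bits")    # 3.000
assert h_full <= upper + 1e-12

The paper's bounds generalize this special case, replacing the leave-one-out collection here with an arbitrary collection of subsets.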