Computer Science – Information Theory
Scientific paper
2010-10-18
8 pages, 4 figures, presented at the Forty-Eighth Annual Allerton Conference on Communication, Control, and Computing, September 2010
This paper generalizes Wyner's definition of the common information of a pair of random variables to that of $N$ random variables. We prove coding theorems showing that the operational meanings of the common information of two random variables carry over to that of $N$ random variables. As a byproduct of our proof, we show that the Gray-Wyner source coding network can be generalized to $N$ source sequences with $N$ decoders. We also establish a monotone property of Wyner's common information, which stands in contrast to other notions of common information, specifically Shannon's mutual information and G\'{a}cs and K\"{o}rner's common randomness. Examples of the computation of Wyner's common information of $N$ random variables are also given.
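For reference, Wyner's original definition for a pair of random variables, and the natural $N$-variable extension the abstract describes, can be sketched as follows; the notation here is a standard reconstruction (auxiliary variable $W$, Markov/conditional-independence constraint), not quoted from the paper itself:

```latex
% Wyner's common information of a pair (X_1, X_2):
% minimize over auxiliary W satisfying the Markov chain X_1 - W - X_2.
\[
C(X_1; X_2) \;=\; \min_{W \,:\, X_1 - W - X_2} I(X_1, X_2; W).
\]

% Extension to N random variables: minimize over auxiliary W that
% renders X_1, ..., X_N conditionally independent given W.
\[
C(X_1; \dots; X_N) \;=\;
  \min_{\substack{W \,:\, X_1,\dots,X_N \\ \text{cond.\ indep.\ given } W}}
  I(X_1, \dots, X_N; W).
\]
```

Under this formulation, the monotonicity mentioned in the abstract refers to how $C(X_1;\dots;X_N)$ behaves as variables are added, a behavior not shared by mutual information or G\'{a}cs-K\"{o}rner common randomness.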
Chen Biao
Liu Wei
Xu Ge