The Common Information of N Dependent Random Variables

Computer Science – Information Theory

Scientific paper


Details

8 pages, 4 figures; presented at the Forty-Eighth Annual Allerton Conference on Communication, Control, and Computing, September


This paper generalizes Wyner's definition of the common information of a pair of random variables to that of $N$ random variables. We prove coding theorems showing that the operational meanings of the common information of two random variables generalize to $N$ random variables. As a byproduct of our proof, we show that the Gray-Wyner source coding network can be generalized to $N$ source sequences with $N$ decoders. We also establish a monotone property of Wyner's common information, which stands in contrast to other notions of common information, specifically Shannon's mutual information and G\'{a}cs and K\"{o}rner's common randomness. Examples of the computation of Wyner's common information of $N$ random variables are also given.
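For context, Wyner's standard two-variable definition and its natural $N$-variable extension (as studied in the paper) can be sketched as follows; the exact notation here is an assumption based on the usual formulation, not quoted from the paper itself:

```latex
% Wyner's common information of a pair (X_1, X_2):
% minimize over auxiliary variables W that make X_1 and X_2
% conditionally independent (the Markov chain X_1 - W - X_2).
C(X_1; X_2) \;=\; \min_{W \,:\, X_1 - W - X_2} I(X_1, X_2; W)

% The N-variable generalization: minimize over W such that
% X_1, \dots, X_N are conditionally independent given W.
C(X_1; \dots; X_N) \;=\; \min_{\substack{W \,:\, X_1,\dots,X_N \\ \text{cond.\ indep.\ given } W}} I(X_1, \dots, X_N; W)
```

The monotonicity noted in the abstract refers to this quantity: adding another dependent variable cannot decrease it, unlike mutual information or the G\'{a}cs-K\"{o}rner common randomness.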
