Computer Science – Information Theory
Scientific paper
2010-01-21
24 pages, 4 figures. Submitted to IEEE Trans. Info. Theory (Jan 2010)
We provide a novel upper bound on Witsenhausen's rate, the rate required in the zero-error analogue of the Slepian-Wolf problem; our bound is given in terms of a new information-theoretic functional defined on a certain graph. We then use the functional to give a single-letter lower bound on the error exponent for the Slepian-Wolf problem under the vanishing-error-probability criterion, where the decoder has full (i.e., unencoded) side information. Our exponent stems from a new encoding scheme that makes use of the source distribution only through the positions of the zeros in the 'channel' matrix connecting the source with the side information, and in this sense is 'semi-universal'. We demonstrate that our error exponent can beat the 'expurgated' source-coding exponent of Csiszár and Körner, the achievability of which requires the use of a non-universal maximum-likelihood decoder. An extension of our scheme to the lossy case (i.e., Wyner-Ziv) is given. For the case when the side information is a deterministic function of the source, the exponent of our improved scheme agrees exactly with the sphere-packing bound (thus determining the reliability function). An application of our functional to zero-error channel capacity is also given.
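For context (this background is not taken from the listing itself), Witsenhausen's rate is conventionally defined through the chromatic numbers of AND powers of the confusability graph on the source alphabet, where two source letters are adjacent if some side-information value makes them confusable; the notation R_W(G), G^{\wedge n}, and \chi below follows that standard convention and is assumed, not quoted from the paper:

% Standard definition of Witsenhausen's rate (background sketch):
% G is the confusability graph on the source alphabet, G^{\wedge n} its
% n-fold AND power, and \chi(\cdot) denotes the chromatic number.
\[
  R_W(G) \;=\; \lim_{n \to \infty} \frac{1}{n} \log_2 \chi\!\left(G^{\wedge n}\right).
\]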
Benjamin G. Kelly
Aaron B. Wagner