Matching Dyadic Distributions to Channels
Computer Science – Information Theory
Scientific paper
2010-09-20
to be presented at DCC 2011
Many communication channels with discrete input have non-uniform capacity-achieving probability mass functions (PMFs). By parsing a stream of independent and equiprobable bits according to a full prefix-free code, a modulator can generate dyadic PMFs at the channel input. In this work, we show that for discrete memoryless channels and for memoryless discrete noiseless channels, searching for good dyadic input PMFs is equivalent to minimizing the Kullback-Leibler distance between a dyadic PMF and a weighted version of the capacity-achieving PMF. We define a new algorithm called Geometric Huffman Coding (GHC) and prove that GHC finds the optimal dyadic PMF in O(m log m) steps, where m is the number of input symbols of the considered channel. Furthermore, we prove that by generating dyadic PMFs of blocks of consecutive input symbols, GHC achieves capacity as the block length goes to infinity.
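The abstract does not spell out the GHC construction, so the Python sketch below only illustrates the stated objective: building a dyadic PMF d (entries of the form 2^-l or 0, realizable by parsing equiprobable bits with a full prefix-free code) that approximates a target PMF p under the Kullback-Leibler distance D(d || p). The Huffman-like merging rule used here (keep the larger of the two smallest masses when it is at least four times the smaller, otherwise replace both by twice their geometric mean) is an assumption of this sketch; the exact GHC rule and its O(m log m) optimality proof are given in the paper.

```python
import heapq
from math import log2, sqrt

def dyadic_pmf_sketch(p):
    """Huffman-like greedy construction of a dyadic PMF d approximating the
    target PMF p under D(d || p). The merging rule is an assumption of this
    sketch, not quoted from the paper."""
    # Heap entries: (mass, tiebreaker, [(symbol, depth), ...]); depth becomes
    # the codeword length, so symbol i receives probability 2**(-depth).
    heap = [(pi, i, [(i, 0)]) for i, pi in enumerate(p) if pi > 0]
    heapq.heapify(heap)
    counter = len(p)
    while len(heap) > 1:
        q2, _, leaves2 = heapq.heappop(heap)   # smallest remaining mass
        q1, _, leaves1 = heapq.heappop(heap)   # second smallest (q1 >= q2)
        if q1 >= 4 * q2:
            # Assumed rule: drop the smaller branch; its symbols get probability 0.
            heapq.heappush(heap, (q1, counter, leaves1))
        else:
            # Assumed rule: merge to twice the geometric mean, one level deeper.
            merged = 2 * sqrt(q1 * q2)
            leaves = [(s, d + 1) for s, d in leaves1 + leaves2]
            heapq.heappush(heap, (merged, counter, leaves))
        counter += 1
    d = [0.0] * len(p)
    for symbol, depth in heap[0][2]:
        d[symbol] = 2.0 ** (-depth)
    return d

def kl_bits(d, p):
    """Kullback-Leibler distance D(d || p) in bits; terms with d_i = 0 vanish."""
    return sum(di * log2(di / pi) for di, pi in zip(d, p) if di > 0)

# Example with a hypothetical target PMF: the result is dyadic and sums to 1.
p = [0.45, 0.30, 0.15, 0.10]
d = dyadic_pmf_sketch(p)
print(d, kl_bits(d, p))
```

The codeword depths produced by the tree directly give the dyadic probabilities 2^-depth, which is how a full prefix-free code driven by equiprobable bits realizes the PMF at the channel input.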
Georg Böcherer
Rudolf Mathar