Computer Science – Information Theory
Scientific paper
2007-02-10
15 pages, 4 figures, incorporates and extends arXiv:cs/0605099 and arXiv:0809.1264v1
This paper presents new lower and upper bounds on the compression rate of binary prefix codes optimized over memoryless sources according to various nonlinear codeword length objectives. Like the best-known redundancy bounds for minimum average redundancy coding (Huffman coding), these are expressed in terms of a form of entropy and/or the probability of an input symbol, often the most probable one. The bounds here, some of which are tight, improve on known bounds of the form L in [H, H+1), where H is some form of entropy in bits (or, for redundancy objectives, 0) and L is the length objective, also in bits. The objectives explored here include exponential-average length, maximum pointwise redundancy, and exponential-average pointwise redundancy (also called dth exponential redundancy). The first of these relates to various problems involving queueing, uncertainty, and lossless communications; the second relates to problems involving Shannon coding and universal modeling. For these two objectives we also explore the related problem of necessary and sufficient conditions for the shortest codeword of a code having a specific length.