Computer Science – Information Theory
Scientific paper
2005-08-11
6 pages, to appear in the proceedings of the 2005 IEEE International Symposium on Information Theory, Adelaide, Australia, Sep
This paper describes a new family of block source codes well suited to data compression. These codes are defined by sets of production rules of the form a.l -> b, where a in A is a value from the source alphabet A and l, b are small sequences of bits. They naturally encompass other Variable Length Codes (VLCs) such as Huffman codes. It is shown that these codes can achieve a mean description length (mdl) similar to, or even shorter than, that of Huffman codes for the same encoding and decoding complexity. A first code design method, which preserves the lexicographic order in the bit domain, is described. The resulting codes have the same mdl as the Huffman codes from which they are constructed; they therefore outperform, in terms of compression, the Hu-Tucker codes designed to offer the lexicographic property in the bit domain. A second construction method yields codes whose marginal bit probability converges to 0.5 as the sequence length increases, even when the probability distribution of the source is not known to the encoder.
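The production-rule mechanism can be illustrated with a minimal sketch, not taken from the paper itself. It assumes that a rule a.l -> b means "to encode symbol a when the bitstream produced so far ends with the bit string l, remove that suffix l and append b instead"; under that reading, a Huffman code is the special case where every l is empty and each symbol simply appends its codeword. The rule table, the example alphabet and the longest-suffix tie-breaking policy below are illustrative assumptions.

# Sketch of encoding with rewriting rules a.l -> b (assumed semantics, see above).
def encode(symbols, rules):
    """Encode a symbol sequence; rules maps a symbol to a list of (l, b) pairs."""
    bits = ""
    for a in symbols:
        # Pick the rule whose left context l is the longest suffix of the
        # current bitstream (an assumed, illustrative tie-breaking policy).
        l, b = max(
            ((l, b) for l, b in rules[a] if bits.endswith(l)),
            key=lambda lb: len(lb[0]),
        )
        bits = bits[:len(bits) - len(l)] + b
    return bits

def decode_prefix_free(bits, rules):
    """Decode the Huffman special case (all l empty, prefix-free codewords)."""
    inverse = {b: a for a, pairs in rules.items() for l, b in pairs if l == ""}
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in inverse:
            out.append(inverse[buf])
            buf = ""
    return out

if __name__ == "__main__":
    # A Huffman code for P(a)=0.5, P(b)=P(c)=0.25, written as rewriting rules
    # with empty left context l (hypothetical example alphabet).
    huffman_as_rules = {"a": [("", "0")], "b": [("", "10")], "c": [("", "11")]}
    stream = encode("abacab", huffman_as_rules)
    print(stream)                                         # 010011010
    print(decode_prefix_free(stream, huffman_as_rules))   # ['a', 'b', 'a', 'c', 'a', 'b']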
Christine Guillemot
Hervé Jégou