Mismatched codebooks and the role of entropy-coding in lossy data compression

Computer Science – Information Theory
Scientific paper, 2005-11-02
35 pages, 37 references, no figures. Submitted to IEEE Transactions on Information Theory
We introduce a universal quantization scheme based on random coding, and we analyze its performance. The scheme consists of a source-independent random codebook (typically _mismatched_ to the source distribution), followed by optimal entropy-coding that is _matched_ to the quantized codeword distribution. A single-letter formula is derived for the rate achieved by this scheme at a given distortion, in the limit of large codebook dimension. The rate reduction due to entropy-coding is quantified, and it is shown that it can be arbitrarily large. In the special case of "almost uniform" codebooks (e.g., an i.i.d. Gaussian codebook with large variance) and difference distortion measures, a novel connection is drawn between the compression achieved by the present scheme and the performance of "universal" entropy-coded dithered lattice quantizers. This connection generalizes the "half-a-bit" bound on the redundancy of dithered lattice quantizers. Moreover, it demonstrates a strong notion of universality, where a single "almost uniform" codebook is near-optimal for _any_ source and _any_ difference distortion measure.
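For intuition, here is a minimal numerical sketch (ours, not from the paper) of the two-stage scheme described above: a fixed i.i.d. Gaussian random codebook quantizes a deliberately mismatched (here Laplacian) source, and the empirical entropy of the resulting codeword indices estimates the rate an ideal entropy coder matched to the quantized codeword distribution would need. The gap between the nominal rate R and the index entropy per sample illustrates the rate reduction due to entropy-coding. All parameter choices (block length, rate, source, codebook variance) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 8                  # codebook dimension (block length); the paper's formula is asymptotic in n
R = 1.0                # nominal codebook rate, bits per source sample (illustrative)
M = 2 ** int(n * R)    # codebook size 2^{nR}
num_blocks = 20000

# Source: i.i.d. Laplacian, deliberately mismatched to the Gaussian codebook.
source = rng.laplace(scale=1.0, size=(num_blocks, n))

# Source-independent random codebook: i.i.d. Gaussian with "large" variance.
codebook = rng.normal(scale=2.0, size=(M, n))

# Quantize each block to its nearest codeword under squared-error distortion
# (brute-force search, using ||x - c||^2 = ||x||^2 - 2 x.c + ||c||^2).
x2 = (source ** 2).sum(axis=1, keepdims=True)
c2 = (codebook ** 2).sum(axis=1)
d2 = x2 - 2.0 * source @ codebook.T + c2
indices = d2.argmin(axis=1)
distortion = d2[np.arange(num_blocks), indices].mean() / n

# Empirical entropy of the quantized codeword indices: the rate an ideal
# entropy coder matched to the quantized codeword distribution would need.
counts = np.bincount(indices, minlength=M)
p = counts[counts > 0] / num_blocks
entropy_rate = -(p * np.log2(p)).sum() / n

print(f"nominal rate      : {R:.3f} bits/sample")
print(f"entropy-coded rate: {entropy_rate:.3f} bits/sample")  # always <= nominal rate
print(f"distortion (MSE)  : {distortion:.3f} per sample")
```

At such a small block length this is only a finite-dimensional caricature of the paper's single-letter asymptotics, but it shows the mechanism: the nearest-neighbor indices of a mismatched codebook are far from uniformly distributed, so entropy-coding them recovers rate.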
Ioannis Kontoyiannis
Ram Zamir