Computer Science – Information Theory
Scientific paper
2008-12-17
A common complaint about adaptive prefix coding is that it is much slower than static prefix coding. Karpinski and Nekrich recently took an important step towards resolving this: they gave an adaptive Shannon coding algorithm that encodes each character in $O(1)$ amortized time and decodes it in $O(\log H)$ amortized time, where $H$ is the empirical entropy of the input string $s$. For comparison, Gagie's adaptive Shannon coder and both Knuth's and Vitter's adaptive Huffman coders all use $\Theta(H)$ amortized time for each character. In this paper we give an adaptive Shannon coder that both encodes and decodes each character in $O(1)$ worst-case time. As with both previous adaptive Shannon coders, we store $s$ in at most $(H + 1)|s| + o(|s|)$ bits. We also show that this encoding length is worst-case optimal up to the lower-order term.
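
To make the setting concrete, the following is a minimal Python sketch of generic adaptive Shannon coding, the scheme this line of work refines: encoder and decoder keep identical symbol counts and give each character a codeword of length $\lceil \log_2(n / \mathrm{count}) \rceil$, which yields roughly $(H + 1)|s|$ bits plus lower-order terms. The function names and the naive rebuild-the-code-every-step strategy are illustrative assumptions, not the authors' construction; in particular, this sketch spends far more than $O(1)$ time per character, which is exactly the gap the paper closes.

    import math

    def shannon_code(counts):
        # Assign canonical codewords of length ceil(log2(n / count)).
        # Kraft's inequality holds: sum of 2^(-ceil(log2(n/c))) <= sum of c/n = 1.
        n = sum(counts.values())
        lengths = {a: max(1, math.ceil(math.log2(n / c))) for a, c in counts.items()}
        code, next_code, prev_len = {}, 0, 0
        for a in sorted(counts, key=lambda x: (lengths[x], x)):
            next_code <<= lengths[a] - prev_len
            code[a] = format(next_code, '0%db' % lengths[a])
            next_code += 1
            prev_len = lengths[a]
        return code

    def encode(s, alphabet):
        # Encoder and decoder both start from count 1 for every symbol and
        # rebuild the code after each character, so no code table is transmitted.
        counts = {a: 1 for a in alphabet}
        out = []
        for ch in s:
            out.append(shannon_code(counts)[ch])
            counts[ch] += 1          # update the model only after coding ch
        return ''.join(out)

    def decode(bits, length, alphabet):
        counts = {a: 1 for a in alphabet}
        out, i = [], 0
        for _ in range(length):
            inv = {cw: a for a, cw in shannon_code(counts).items()}
            j = i + 1
            while bits[i:j] not in inv:   # code is prefix-free, so the first hit is the symbol
                j += 1
            ch = inv[bits[i:j]]
            out.append(ch)
            counts[ch] += 1
            i = j
        return ''.join(out)

    # Round trip on a toy alphabet:
    alphabet = 'abcdefghijklmnopqrstuvwxyz '
    message = 'adaptive prefix coding'
    bits = encode(message, alphabet)
    assert decode(bits, len(message), alphabet) == message

Because the decoder updates its counts only after recovering each character, it always holds the same model as the encoder; the cost of rebuilding and searching the code each step is what the amortized and worst-case bounds quoted above improve upon.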
Travis Gagie
Yakov Nekrich