On the space complexity of one-pass compression
Computer Science – Information Theory
Scientific paper
2006-11-21
We study how much memory one-pass compression algorithms need to compete with the best multi-pass algorithms. We call a one-pass algorithm an $f(n, \ell)$-footprint compressor if, given $n$, $\ell$ and an $n$-ary string $S$, it stores $S$ in $(O(H_\ell(S)) + o(\log n))\,|S| + O(n^{\ell + 1} \log n)$ bits -- where $H_\ell(S)$ is the $\ell$th-order empirical entropy of $S$ -- while using at most $f(n, \ell)$ bits of memory. We prove that, for any $\epsilon > 0$ and some $f(n, \ell) \in O(n^{\ell + \epsilon} \log n)$, there is an $f(n, \ell)$-footprint compressor; on the other hand, there is no $f(n, \ell)$-footprint compressor for $f(n, \ell) \in o(n^\ell \log n)$.
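The abstract does not define $H_\ell(S)$; under the standard definition used in the compression literature (and presumably the one intended here), $H_0(S) = \sum_a \frac{n_a}{|S|} \log_2 \frac{|S|}{n_a}$, where $n_a$ is the number of occurrences of character $a$ in $S$, and, for $\ell \ge 1$, $H_\ell(S) = \frac{1}{|S|} \sum_{|w| = \ell} |S_w|\, H_0(S_w)$, where $S_w$ is the concatenation of the characters that immediately follow occurrences of the length-$\ell$ context $w$ in $S$. The following Python sketch (illustrative only; the function names are ours, not the paper's) computes these quantities in bits per character:

# Sketch of l-th order empirical entropy under the standard definition;
# not code from the paper, just an illustration of the quantity H_l(S).
from collections import Counter, defaultdict
from math import log2

def empirical_entropy_order_0(s: str) -> float:
    """H_0(S) = sum_a (n_a/|S|) * log2(|S|/n_a), with H_0 of the empty string taken as 0."""
    if not s:
        return 0.0
    n = len(s)
    return sum((c / n) * log2(n / c) for c in Counter(s).values())

def empirical_entropy(s: str, l: int) -> float:
    """H_l(S) = (1/|S|) * sum over length-l contexts w of |S_w| * H_0(S_w)."""
    if l == 0:
        return empirical_entropy_order_0(s)
    followers = defaultdict(list)
    for i in range(l, len(s)):
        # S_w collects the characters that follow each occurrence of context w.
        followers[s[i - l:i]].append(s[i])
    return sum(len(f) * empirical_entropy_order_0("".join(f))
               for f in followers.values()) / len(s)

if __name__ == "__main__":
    s = "abababababababab"
    print(empirical_entropy(s, 0))  # about 1 bit per character
    print(empirical_entropy(s, 1))  # 0: each character is determined by its predecessor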