A nearly tight memory-redundancy trade-off for one-pass compression
Computer Science – Information Theory
Scientific paper
2007-08-14
Let $s$ be a string of length $n$ over an alphabet of constant size $\sigma$, and let $c$ and $\epsilon$ be constants with $0 \leq c \leq 1$ and $\epsilon > 0$. Using $O(n)$ time, $O(n^c)$ bits of memory and one pass, we can always encode $s$ in $n H_k(s) + O(\sigma^k n^{1 - c + \epsilon})$ bits for all integers $k \geq 0$ simultaneously. On the other hand, even with unlimited time, using $O(n^c)$ bits of memory and one pass we cannot always encode $s$ in $O(n H_k(s) + \sigma^k n^{1 - c - \epsilon})$ bits for, e.g., $k = \lceil (c + \epsilon / 2) \log_\sigma n \rceil$.
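The bounds above are stated in terms of the $k$-th order empirical entropy $H_k(s)$: the average, over all length-$k$ contexts $w$ occurring in $s$, of the zeroth-order entropy of the characters that follow $w$, weighted by how often $w$ occurs. The following is a minimal Python sketch of that quantity only, not of the paper's one-pass encoder; the function name and structure are illustrative assumptions.

```python
from collections import Counter, defaultdict
from math import log2

def empirical_entropy_k(s: str, k: int) -> float:
    """k-th order empirical entropy H_k(s), in bits per character (illustrative sketch)."""
    n = len(s)
    # For each length-k context, count the characters that follow it in s.
    followers = defaultdict(Counter)
    for i in range(n - k):
        followers[s[i:i + k]][s[i + k]] += 1
    total_bits = 0.0
    for counts in followers.values():
        m = sum(counts.values())
        # Contribution of this context: m * H_0 of its follower distribution.
        total_bits += sum(-c * log2(c / m) for c in counts.values())
    return total_bits / n

# A repetitive string has H_1 close to 0 even though H_0 is 1 bit per character.
s = "abab" * 64
print(empirical_entropy_k(s, 0), empirical_entropy_k(s, 1))  # 1.0  0.0
```

With $k = 0$ the single empty context collects every character of $s$, so the sketch reduces to the usual zeroth-order entropy; larger $k$ lets the measure exploit longer contexts, which is why the redundancy terms above grow with $\sigma^k$.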