Grammar-Based Compression in a Streaming Model

Computer Science – Data Structures and Algorithms

Scientific paper


Details

Section on recent work added, sketching how to improve bounds and support random access


We show that, given a string $s$ of length $n$, with constant memory and logarithmic passes over a constant number of streams we can build a context-free grammar that generates $s$ and only $s$ and whose size is within an $O(\min(g \log g, \sqrt{n \log g}))$-factor of the minimum $g$. This stands in contrast to our previous result that, with polylogarithmic memory and polylogarithmic passes over a single stream, we cannot build such a grammar whose size is within any polynomial of $g$.
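For context, grammar-based compression represents a string $s$ by a context-free grammar (a straight-line program) that derives $s$ and nothing else; the grammar's size, counted in right-hand-side symbols, is the compressed size, and $g$ denotes the size of the smallest such grammar. The sketch below illustrates this setting only, not the paper's streaming construction: it is a simple Re-Pair-style compressor that repeatedly replaces the most frequent adjacent pair with a fresh nonterminal. The function names (repair_compress, expand) and the pairing heuristic are illustrative assumptions, not taken from the paper.

    # Minimal sketch of grammar-based compression (NOT the streaming
    # algorithm of this paper): a naive Re-Pair-style compressor.
    from collections import Counter

    def repair_compress(s):
        """Return (start_sequence, rules), where rules maps nonterminal -> (a, b)."""
        seq = list(s)              # working sequence of terminals/nonterminals
        rules = {}                 # nonterminal -> pair it expands to
        next_id = 0
        while True:
            pairs = Counter(zip(seq, seq[1:]))
            if not pairs:
                break
            pair, freq = pairs.most_common(1)[0]
            if freq < 2:           # no pair repeats: grammar is final
                break
            nt = ("N", next_id)    # fresh nonterminal symbol
            next_id += 1
            rules[nt] = pair
            # Replace non-overlapping occurrences of the pair, left to right.
            new_seq, i = [], 0
            while i < len(seq):
                if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                    new_seq.append(nt)
                    i += 2
                else:
                    new_seq.append(seq[i])
                    i += 1
            seq = new_seq
        return seq, rules

    def expand(symbol, rules):
        """Expand a symbol back into the substring it derives."""
        if symbol in rules:
            a, b = rules[symbol]
            return expand(a, rules) + expand(b, rules)
        return symbol

    if __name__ == "__main__":
        text = "abracadabra abracadabra"
        start, rules = repair_compress(text)
        grammar_size = len(start) + 2 * len(rules)   # RHS symbols, start rule included
        assert "".join(expand(x, rules) for x in start) == text
        print(f"input length {len(text)}, grammar size {grammar_size}, rules {len(rules)}")

On repetitive inputs the grammar size falls well below the input length; the paper's contribution is to achieve a grammar within an $O(\min(g \log g, \sqrt{n \log g}))$-factor of the optimal size $g$ under the stated streaming constraints, where a heuristic like the one above assumes random access to the whole string.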

