Accelerating Lossless Data Compression with GPUs

Computer Science – Information Theory

Scientific paper


Details

Peer reviewed and published in the undergraduate research journal Inquiro in 2009, following summer research that year.


Huffman compression is a statistical, lossless data compression algorithm that compresses data by assigning variable-length codes to symbols: more frequently occurring symbols receive shorter codes than less frequent ones. This work modifies the Huffman algorithm so that uncompressed data can be decomposed into independently compressible and decompressible blocks, allowing concurrent compression and decompression on multiple processors. We implement this modified algorithm on a current NVIDIA GPU using the CUDA API as well as on a current Intel CPU and compare the performance results, which show favorable GPU performance in nearly all tests. Lastly, we discuss the need for high-performance data compression in today's supercomputing ecosystem.
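The block-decomposition idea in the abstract can be sketched as follows. This is a minimal illustration in Python, not the paper's implementation (which targets CUDA on an NVIDIA GPU); the function names, the per-block code tables, and the `block_size` parameter are all assumptions made for the example. Each block is compressed and decompressed in complete isolation, which is what makes the blocks independently schedulable across processors or GPU thread blocks.

```python
import heapq
from collections import Counter
from itertools import count

def huffman_codes(data: bytes) -> dict:
    """Build a map symbol -> bitstring from symbol frequencies."""
    freq = Counter(data)
    if len(freq) == 1:  # degenerate case: only one distinct symbol
        return {next(iter(freq)): "0"}
    tiebreak = count()  # unique tiebreaker so nodes are never compared
    heap = [(f, next(tiebreak), sym) for sym, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:  # merge the two least frequent subtrees
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tiebreak), (left, right)))
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):  # internal node: (left, right)
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                        # leaf: a symbol
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

def compress_block(block: bytes) -> tuple:
    """Compress one block with its own code table (self-contained)."""
    codes = huffman_codes(block)
    bits = "".join(codes[b] for b in block)
    return codes, bits

def decompress_block(codes: dict, bits: str) -> bytes:
    """Decode one block using only its own table -- no other block needed."""
    inverse = {v: k for k, v in codes.items()}
    out, cur = bytearray(), ""
    for bit in bits:
        cur += bit
        if cur in inverse:  # prefix-free codes: first match is the symbol
            out.append(inverse[cur])
            cur = ""
    return bytes(out)

def compress(data: bytes, block_size: int = 4096) -> list:
    """Split the input into fixed-size blocks and compress each one.
    Because every block is independent, this loop could be replaced by
    a pool of workers or, as in the paper, GPU thread blocks."""
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    return [compress_block(b) for b in blocks]

data = b"abracadabra" * 100
compressed = compress(data, block_size=256)
restored = b"".join(decompress_block(c, b) for c, b in compressed)
assert restored == data
```

Paying the cost of one code table per block (rather than one global table) is the trade that buys independence: no block's decoder needs state produced by any other block, so compression and decompression both parallelize trivially.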
