Accelerating Lossless Data Compression with GPUs
Computer Science – Information Theory
Scientific paper
Inquiro, Volume 3, 2009, pp. 26–29
Peer reviewed and published in the undergraduate research journal Inquiro in 2009, following summer research work that same year.
Huffman compression is a statistical, lossless data compression algorithm that compresses data by assigning variable-length codes to symbols, with more frequently occurring symbols given shorter codes than less frequent ones. This work is a modification of the Huffman algorithm that permits uncompressed data to be decomposed into independently compressible and decompressible blocks, allowing for concurrent compression and decompression on multiple processors. We implement this modified algorithm on a current NVIDIA GPU using the CUDA API as well as on a current Intel CPU and compare the performance results, which show favorable GPU performance for nearly all tests. Lastly, we discuss the necessity of high-performance data compression in today's supercomputing ecosystem.
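To make the Huffman coding described in the abstract concrete, here is a minimal C sketch (an illustration written for this listing, not code from the paper): it counts symbol frequencies, builds a Huffman tree by repeatedly merging the two lowest-frequency nodes, and prints each symbol's variable-length code.

    /* Minimal Huffman-code sketch: frequent symbols get shorter codes.
     * Illustration only; not the paper's implementation. */
    #include <stdio.h>
    #include <stdlib.h>

    #define ALPHABET 256

    typedef struct Node {
        unsigned long freq;
        int symbol;                    /* -1 marks an internal node */
        struct Node *left, *right;
    } Node;

    static Node *new_node(unsigned long freq, int symbol, Node *l, Node *r) {
        Node *n = malloc(sizeof *n);
        n->freq = freq; n->symbol = symbol; n->left = l; n->right = r;
        return n;
    }

    /* Repeatedly merge the two lowest-frequency nodes (an O(n^2) scan
     * for brevity; a real implementation would use a min-heap). */
    static Node *build_tree(const unsigned long freq[ALPHABET]) {
        Node *pool[ALPHABET];
        int n = 0;
        for (int s = 0; s < ALPHABET; s++)
            if (freq[s]) pool[n++] = new_node(freq[s], s, NULL, NULL);
        while (n > 1) {
            int a = 0, b = 1;              /* indices of the two minima */
            if (pool[b]->freq < pool[a]->freq) { a = 1; b = 0; }
            for (int i = 2; i < n; i++) {
                if (pool[i]->freq < pool[a]->freq) { b = a; a = i; }
                else if (pool[i]->freq < pool[b]->freq) b = i;
            }
            Node *merged = new_node(pool[a]->freq + pool[b]->freq, -1,
                                    pool[a], pool[b]);
            if (a > b) { int t = a; a = b; b = t; }
            pool[a] = merged;              /* replace one child with parent */
            pool[b] = pool[--n];           /* drop the other child */
        }
        return n ? pool[0] : NULL;
    }

    /* Walk the tree, emitting '0' for left edges, '1' for right edges. */
    static void print_codes(const Node *t, char *buf, int depth) {
        if (!t) return;
        if (t->symbol >= 0) {
            buf[depth] = '\0';
            printf("'%c' (freq %lu): %s\n", t->symbol, t->freq, buf);
            return;
        }
        buf[depth] = '0'; print_codes(t->left, buf, depth + 1);
        buf[depth] = '1'; print_codes(t->right, buf, depth + 1);
    }

    int main(void) {
        const char *sample = "abracadabra";
        unsigned long freq[ALPHABET] = {0};
        for (const char *p = sample; *p; p++) freq[(unsigned char)*p]++;
        char buf[ALPHABET];
        print_codes(build_tree(freq), buf, 0);  /* tree leaked: sketch only */
        return 0;
    }

On "abracadabra", the frequent symbol 'a' receives a 1-bit code while the rare 'c' and 'd' receive 4-bit codes. The paper's contribution is orthogonal to this core coder: by decomposing the input into independently compressible and decompressible blocks, a coder like this can run concurrently across many GPU threads or CPU cores, one block per worker.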
P. Bangalore, Robert Louis Cloud, M. L. Curry, A. Skjellum, Henry L. Ward