A Machine Learning Perspective on Predictive Coding with PAQ

Computer Science – Learning

Scientific paper


Details

PAQ8 is an open source lossless data compression algorithm that currently achieves the best compression rates on many benchmarks. This report presents a detailed description of PAQ8 from a statistical machine learning perspective. It shows that it is possible to understand some of the modules of PAQ8 and to use this understanding to improve the method. However, intuitive statistical explanations of the behavior of other modules remain elusive. We hope the description in this report will be a starting point for discussions that will increase our understanding, lead to improvements to PAQ8, and facilitate a transfer of knowledge from PAQ8 to other machine learning methods, such as recurrent neural networks and stochastic memoizers. Finally, the report presents a broad range of new applications of PAQ to machine learning tasks, including language modeling and adaptive text prediction, adaptive game playing, classification, and compression using features from the field of deep learning.
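The machine learning view described above centers on PAQ8's mixing stage: many simple context models each predict the next bit, and their predictions are combined through weights that are updated online, which can be read as a form of online logistic regression. The sketch below is a minimal, toy-scale illustration of that idea in Python rather than PAQ8's C++; the names (`CountPredictor`, `code_length`) and the choice of simple order-n count models are hypothetical simplifications for illustration, not PAQ8's actual code.

```python
# Minimal sketch of logistic context mixing (not PAQ8 itself):
# several bit predictors are combined by an online logistic-regression mixer.
import math

def stretch(p):          # logit
    return math.log(p / (1.0 - p))

def squash(x):           # logistic sigmoid
    return 1.0 / (1.0 + math.exp(-x))

class CountPredictor:
    """Toy order-n bit predictor: P(next bit = 1 | last n bits) from running counts."""
    def __init__(self, order):
        self.mask = (1 << order) - 1
        self.counts = {}          # context -> [count of 0s, count of 1s]
        self.history = 0

    def predict(self):
        c0, c1 = self.counts.get(self.history & self.mask, [1, 1])
        return c1 / (c0 + c1)     # Laplace-smoothed estimate

    def update(self, bit):
        c = self.counts.setdefault(self.history & self.mask, [1, 1])
        c[bit] += 1
        self.history = ((self.history << 1) | bit) & 0xFFFF

def code_length(data, predictors, lr=0.02):
    """Code length in bits if `data` were arithmetic-coded with the mixed model."""
    weights = [0.0] * len(predictors)
    bits = 0.0
    for byte in data:
        for k in range(7, -1, -1):
            bit = (byte >> k) & 1
            ps = [max(1e-6, min(1 - 1e-6, m.predict())) for m in predictors]
            xs = [stretch(p) for p in ps]
            p = max(1e-6, min(1 - 1e-6, squash(sum(w * x for w, x in zip(weights, xs)))))
            bits += -math.log2(p if bit else 1.0 - p)
            # Online logistic-regression update: gradient of the coding cost
            # with respect to the mixing weights.
            err = bit - p
            weights = [w + lr * err * x for w, x in zip(weights, xs)]
            for m in predictors:
                m.update(bit)
    return bits

if __name__ == "__main__":
    text = b"the quick brown fox jumps over the lazy dog " * 50
    models = [CountPredictor(0), CountPredictor(8), CountPredictor(16)]
    print(f"{code_length(text, models) / 8:.0f} bytes (mixed model) vs {len(text)} bytes raw")
```

The full compressor adds many more specialized context models, further adaptive stages after the mixer, and an arithmetic coder driven by the final probability; the sketch only captures the mixing step that the report interprets in statistical terms.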

Profile ID: LFWR-SCP-O-196405
