Classes for Fast Maximum Entropy Training
Computer Science – Computation and Language
Scientific paper
2001-08-09
Proceedings of ICASSP-2001, Utah, May 2001
4 pages
Maximum entropy models are considered by many to be one of the most promising avenues of language modeling research. Unfortunately, long training times make maximum entropy research difficult. We present a novel speedup technique: we change the form of the model to use classes. Our speedup works by creating two maximum entropy models, the first of which predicts the class of each word, and the second of which predicts the word itself given its class. This factoring of the model leads to fewer non-zero indicator functions and faster normalization, achieving speedups of up to a factor of 35 over one of the best previous techniques. It also typically results in slightly lower perplexities. The same trick can be used to speed the training of other machine learning techniques, e.g. neural networks, on any problem with a large number of outputs, such as language modeling.
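To make the factoring concrete, the sketch below computes P(w | h) = P(class(w) | h) x P(w | class(w), h) for a toy log-linear model. This is a minimal illustration, not the authors' implementation: all names and sizes (V, C, D, word_class, W_class, W_word) are illustrative assumptions, and dense feature vectors stand in for the sparse indicator functions a maximum entropy model would actually use.

```python
import numpy as np

# Hypothetical toy setup: a vocabulary of V words partitioned into C classes.
# All parameters here are random stand-ins; in practice both models would be
# trained, and sparse indicator features would replace the dense vectors.
rng = np.random.default_rng(0)
V, C, D = 10000, 100, 50                 # vocab size, class count, feature dim
word_class = rng.integers(0, C, size=V)  # class assignment for each word

W_class = rng.normal(size=(C, D))        # class-model parameters
W_word = rng.normal(size=(V, D))         # word-model parameters

def p_word_given_history(w, h):
    """P(w | h) = P(class(w) | h) * P(w | class(w), h).

    Normalization touches C class scores plus only the words sharing
    w's class, rather than all V words as in an unfactored model.
    """
    c = word_class[w]
    # Class distribution: softmax over C class scores.
    class_scores = W_class @ h
    p_class = np.exp(class_scores - class_scores.max())
    p_class /= p_class.sum()
    # Word distribution within class c: softmax over roughly V/C scores.
    members = np.flatnonzero(word_class == c)
    word_scores = W_word[members] @ h
    p_word = np.exp(word_scores - word_scores.max())
    p_word /= p_word.sum()
    return p_class[c] * p_word[members == w][0]

h = rng.normal(size=D)  # stand-in history feature vector
print(p_word_given_history(42, h))
```

The point of the factoring is visible in the normalization: each prediction sums over roughly C + V/C scores instead of V, which is where the reported speedup comes from.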