Computer Science – Computation and Language
Scientific paper
2001-07-17
Proceedings of the Second Conference of the North American Chapter of the Association for Computational Linguistics, pages 40-
8 pages, 2 figures, presented at NAACL 2001
Transformation-based learning has been successfully employed to solve many natural language processing problems. It achieves state-of-the-art performance on many NLP tasks and does not overtrain easily. However, it has a serious drawback: the training time is often intolerably long, especially on the large corpora commonly used in NLP. In this paper, we present a novel and realistic method for speeding up the training of a transformation-based learner without sacrificing performance. The paper compares and contrasts the training time needed and performance achieved by our modified learner with two other systems: a standard transformation-based learner and the ICA system \cite{hepple00:tbl}. The results of these experiments show that our system achieves a significant improvement in training time while still matching the performance of a standard transformation-based learner. This is a valuable contribution to any system or algorithm that uses transformation-based learning at any stage of its execution.
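For readers unfamiliar with the paradigm, the greedy training loop of a standard transformation-based learner (the baseline whose cost the paper attacks) can be sketched as follows. This is an illustrative toy, not the paper's optimized algorithm: the rule templates, scoring function, and data here are assumptions made for the example.

```python
# Sketch of a standard transformation-based learning (TBL) training loop.
# NOT the paper's speedup; toy rules and data for illustration only.

def score(rule, tags, truth):
    """Net gain of a rule: errors it fixes minus errors it introduces."""
    good = bad = 0
    for i, old in enumerate(tags):
        new = rule(tags, i)
        if new != old:
            if new == truth[i]:
                good += 1          # rule corrected a wrong tag
            elif old == truth[i]:
                bad += 1           # rule broke a correct tag
    return good - bad

def tbl_train(tags, truth, candidates, min_gain=1):
    """Greedily learn an ordered rule list until no rule helps enough."""
    tags = list(tags)
    learned = []
    while True:
        # Rescan every candidate rule each iteration -- this repeated
        # scoring over the whole corpus is what makes naive TBL slow.
        gain, best = max(
            ((score(r, tags, truth), r) for r in candidates),
            key=lambda pair: pair[0],
        )
        if gain < min_gain:
            break
        tags = [best(tags, i) for i in range(len(tags))]  # apply the rule
        learned.append(best)
    return learned, tags

# Two hypothetical context rules over a toy tag alphabet {A, B}.
def a_to_b_before_b(tags, i):
    return "B" if tags[i] == "A" and i + 1 < len(tags) and tags[i + 1] == "B" else tags[i]

def b_to_a_after_a(tags, i):
    return "A" if tags[i] == "B" and i > 0 and tags[i - 1] == "A" else tags[i]

rules, final = tbl_train(["A", "A", "B"], ["A", "B", "B"],
                         [a_to_b_before_b, b_to_a_after_a])
```

The inner `score` pass over the full corpus for every candidate rule, repeated at every iteration, is the source of the long training times the abstract describes; speedup methods reduce how much of this rescoring must actually be redone after each rule application.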
Radu Florian
Grace Ngai