Transformation-Based Learning in the Fast Lane

Computer Science – Computation and Language

Scientific paper


Details

8 pages, 2 figures, presented at NAACL 2001

Transformation-based learning has been successfully employed to solve many natural language processing problems. It achieves state-of-the-art performance on many natural language processing tasks and does not overtrain easily. However, it does have a serious drawback: the training time is often intolerably long, especially on the large corpora which are often used in NLP. In this paper, we present a novel and realistic method for speeding up the training time of a transformation-based learner without sacrificing performance. The paper compares and contrasts the training time needed and performance achieved by our modified learner with two other systems: a standard transformation-based learner, and the ICA system \cite{hepple00:tbl}. The results of these experiments show that our system is able to achieve a significant improvement in training time while still achieving the same performance as a standard transformation-based learner. This is a valuable contribution to systems and algorithms which utilize transformation-based learning at any part of the execution.
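The abstract does not spell out the algorithm, but standard transformation-based learning (in the Brill style the paper builds on) works by greedily selecting, at each iteration, the rewrite rule that most reduces training error, then applying it and repeating. The quadratic cost of rescoring every candidate rule on every pass is exactly the training-time bottleneck the paper targets. Below is a minimal, self-contained sketch of the naive learner for a toy tagging task; the single rule template ("change tag X to Y when the previous tag is Z") and all names are illustrative assumptions, not the paper's implementation:

```python
from collections import Counter

def tbl_train(gold, initial, max_rules=5):
    """Naive greedy transformation-based learning (sketch, not the paper's
    optimized version).

    Rules are triples (from_tag, to_tag, prev_tag): rewrite from_tag to
    to_tag wherever the previous tag is prev_tag. Each iteration scores
    every candidate rule by its net error reduction on the training data
    and keeps the best one -- this full rescan per iteration is the cost
    the paper's method avoids.
    """
    tags = list(initial)
    rules = []
    for _ in range(max_rules):
        scores = Counter()
        for i in range(1, len(tags)):
            if tags[i] != gold[i]:
                # This rule, applied here, would fix the error: +1.
                scores[(tags[i], gold[i], tags[i - 1])] += 1
            else:
                # Any rule rewriting this (correct) tag in this context
                # would introduce an error: -1.
                for to in set(gold):
                    if to != tags[i]:
                        scores[(tags[i], to, tags[i - 1])] -= 1
        if not scores:
            break
        rule, gain = scores.most_common(1)[0]
        if gain <= 0:  # no rule yields a net improvement; stop
            break
        rules.append(rule)
        frm, to, prev = rule
        # Apply the rule simultaneously, reading contexts from the old tags.
        tags = [to if i > 0 and t == frm and tags[i - 1] == prev else t
                for i, t in enumerate(tags)]
    return rules, tags
```

For example, with gold tags `['D','N','V','D','N']` and initial tags `['D','N','N','D','N']`, the learner picks the rule `('N','V','N')` (change N to V after an N), which corrects the one error without touching the correct N after a D.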
