Mistake-Driven Learning in Text Categorization
Computer Science – Computation and Language
Scientific paper
1997-06-09
9 pages, uses aclap.sty
Learning problems in the text processing domain often map the text to a space whose dimensions are the measured features of the text, e.g., its words. Three characteristic properties of this domain are (a) very high dimensionality, (b) both the learned concepts and the instances reside very sparsely in the feature space, and (c) a high variation in the number of active features in an instance. In this work we study three mistake-driven learning algorithms for a typical task of this nature -- text categorization. We argue that these algorithms -- which categorize documents by learning a linear separator in the feature space -- have a few properties that make them ideal for this domain. We then show that a quantum leap in performance is achieved when we further modify the algorithms to better address some of the specific characteristics of the domain. In particular, we demonstrate (1) how variation in document length can be tolerated by either normalizing feature weights or by using negative weights, (2) the positive effect of applying a threshold range in training, (3) alternatives in considering feature frequency, and (4) the benefits of discarding features while training. Overall, we present an algorithm, a variation of Littlestone's Winnow, which performs significantly better than any other algorithm tested on this task using a similar feature set.
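The algorithm family the abstract refers to can be illustrated with a minimal sketch of Littlestone's Winnow, the mistake-driven linear separator the paper builds on. The promotion/demotion factors, threshold, and toy vocabulary below are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of (positive) Winnow for binary text categorization.
# Documents are binary feature vectors (1 = word active in the document).
# alpha/beta, the threshold, and the toy data are assumed for illustration.

class Winnow:
    def __init__(self, n_features, alpha=2.0, beta=0.5, threshold=None):
        self.alpha = alpha            # promotion factor on false negatives
        self.beta = beta              # demotion factor on false positives
        self.theta = threshold if threshold is not None else n_features / 2.0
        self.w = [1.0] * n_features   # weights start uniform and stay positive

    def predict(self, x):
        # linear separator: fire iff the weighted sum of active features
        # reaches the threshold
        return sum(w * xi for w, xi in zip(self.w, x)) >= self.theta

    def update(self, x, y):
        # mistake-driven: weights change only when the prediction is wrong,
        # and only on the features active in this document (multiplicative)
        if self.predict(x) == y:
            return
        factor = self.alpha if y else self.beta
        self.w = [w * factor if xi else w for w, xi in zip(self.w, x)]


if __name__ == "__main__":
    # Toy task: "sports" vs "finance" over a 6-word vocabulary (assumed data)
    X = [
        [1, 1, 1, 0, 0, 0],  # sports document
        [1, 0, 1, 0, 0, 0],  # sports document
        [0, 0, 0, 1, 1, 1],  # finance document
        [0, 0, 0, 1, 0, 1],  # finance document
    ]
    y = [True, True, False, False]

    clf = Winnow(n_features=6)
    for _ in range(5):                       # a few mistake-driven epochs
        for xi, yi in zip(X, y):
            clf.update(xi, yi)
    print([clf.predict(xi) for xi in X])
```

Because updates are multiplicative and touch only active features, Winnow copes well with the three domain properties the abstract lists: very high dimensionality, sparse instances, and varying numbers of active features per document.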
Ido Dagan
Yael Karov
D. Roth