Do not forget: Full memory in memory-based learning of word pronunciation
Scientific paper, Computer Science – Computation and Language
1998-01-26
Proceedings of NeMLaP3/CoNLL98, 195-204
Uses the conll98, epsf, and ipamacs (WSU IPA) style files
Memory-based learning, which keeps full memory of the learning material, appears to be a viable approach to learning NLP tasks, and is often superior in generalisation accuracy to eager learning approaches that abstract from the learning material. Here we investigate three partial memory-based learning approaches, which remove from memory those task instance types estimated to be exceptional. Each approach implements one heuristic function for estimating the exceptionality of instance types: (i) typicality, (ii) class prediction strength, and (iii) friendly-neighbourhood size. Experiments are performed with the memory-based learning algorithm IB1-IG trained on English word pronunciation. We find that removing instance types with low class prediction strength (ii) is the only tested method that does not seriously harm generalisation accuracy. We conclude that keeping full memory of types rather than tokens, and excluding minority ambiguities, appear to be the only performance-preserving optimisations of memory-based learning.
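The heuristics above all operate on the instance base used by IB1-IG, which classifies by nearest-neighbour lookup under a weighted-overlap distance with information-gain feature weights. The following is a minimal sketch of that metric and of the friendly-neighbourhood count (iii): the number of an instance type's k nearest neighbours that share its class. The four-word toy lexicon and all names here are illustrative assumptions, not the paper's actual data or implementation.

```python
import math
from collections import Counter

# Toy instance base: letter windows mapped to the phoneme of the middle
# letter. Invented for illustration; the paper uses a far larger lexicon.
INSTANCES = [
    (("b", "a", "t"), "ae"),
    (("c", "a", "t"), "ae"),
    (("r", "a", "t"), "ae"),
    (("c", "a", "r"), "a:"),
]

def entropy(labels):
    total = len(labels)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(labels).values())

def information_gain_weights(instances, n_features):
    """Per-feature information gain, used by IB1-IG to weight features."""
    classes = [c for _, c in instances]
    base = entropy(classes)
    weights = []
    for f in range(n_features):
        # Partition the class labels by this feature's value.
        partition = {}
        for feats, c in instances:
            partition.setdefault(feats[f], []).append(c)
        remainder = sum(len(ls) / len(instances) * entropy(ls)
                        for ls in partition.values())
        weights.append(base - remainder)
    return weights

def ig_distance(a, b, weights):
    """Weighted overlap metric: sum the weights of mismatching features."""
    return sum(w for w, x, y in zip(weights, a, b) if x != y)

def friendly_neighbourhood_size(idx, instances, weights, k=3):
    """How many of the k nearest neighbours share the instance's class."""
    feats, cls = instances[idx]
    ranked = sorted(
        (ig_distance(feats, other, weights), other_cls)
        for j, (other, other_cls) in enumerate(instances) if j != idx
    )
    return sum(1 for _, c in ranked[:k] if c == cls)

weights = information_gain_weights(INSTANCES, 3)
fns = friendly_neighbourhood_size(0, INSTANCES, weights)
```

Under this sketch, an instance type with a small friendly neighbourhood (few same-class near neighbours) would be flagged as exceptional and become a candidate for removal; the constant middle letter "a" receives zero weight because it carries no class information in this toy set.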
Walter Daelemans
Antal van den Bosch