Empirical Learning Aided by Weak Domain Knowledge in the Form of Feature Importance
Computer Science – Learning
Scientific paper
2010-05-30
9 pages, 1 figure, 3 tables
Standard hybrid learners that use domain knowledge require strong knowledge that is hard and expensive to acquire. Weaker domain knowledge, however, can still confer the benefits of prior knowledge while remaining cost-effective. Weak knowledge in the form of feature relative importance (FRI) is presented and explained: a real-valued approximation of each feature's importance, provided by domain experts. The advantage of using this knowledge is demonstrated by IANN, a modified multilayer neural network algorithm. IANN is a very simple modification of the standard neural network algorithm, yet it attains significant performance gains. Experimental results in the field of molecular biology show higher performance than other empirical learning algorithms, including standard backpropagation and support vector machines. IANN's performance is even comparable to that of KBANN, a theory-refinement system that uses stronger domain knowledge. This shows that feature relative importance can significantly improve the performance of existing empirical learning algorithms with minimal effort.
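The abstract does not specify how IANN injects FRI into the network. One plausible sketch (an assumption for illustration, not necessarily the paper's method) is to rescale each input feature by its expert-provided relative importance before training, so that gradient descent initially sees important features at larger magnitude. The function name `scale_by_importance` and the example values below are hypothetical.

```python
import numpy as np

def scale_by_importance(X, fri):
    """Rescale each feature column of X by its expert-provided
    relative importance (hypothetical illustration of using FRI).

    Importances are normalized so the most important feature keeps
    unit scale; less important features are shrunk proportionally.
    """
    fri = np.asarray(fri, dtype=float)
    return X * (fri / fri.max())

# Toy example: three features with expert importances 0.2, 1.0, 0.5.
X = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
Xs = scale_by_importance(X, [0.2, 1.0, 0.5])
```

The scaled matrix `Xs` would then be fed to an otherwise unmodified backpropagation learner, which is one simple way a "very simple modification" could exploit weak expert knowledge.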