Using Feature Weights to Improve Performance of Neural Networks

Computer Science – Learning

Scientific paper


Details

2 tables, 4 figures


Different features have different degrees of relevance to a particular learning problem: some are barely relevant, while others are very important. Instead of selecting the most relevant features through feature selection, a learning algorithm can be given this knowledge of feature importance directly, based on expert opinion or prior learning. Learning can be faster and more accurate if the learner takes feature importance into account. Correlation Aided Neural Networks (CANN), an algorithm that embodies this idea, is presented. CANN treats feature importance as the correlation coefficient between each feature and the target attribute, and modifies the standard feed-forward neural network to fit both the correlation values and the training data. Empirical evaluation shows that CANN is faster and more accurate than the two-step approach of feature selection followed by a standard learning algorithm.
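The abstract's core idea — derive feature weights from the feature–target correlation and let the network emphasize correlated features — can be illustrated with a minimal sketch. This is not the paper's actual CANN objective (which fits the correlation values as part of training); here the correlations simply scale the inputs of a small feed-forward network. All data and network sizes are made-up assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (assumed, not from the paper): feature 0 is informative,
# feature 1 is pure noise.
X = rng.normal(size=(200, 2))
y = (X[:, 0] > 0).astype(float)

# Feature importance as the absolute Pearson correlation between each
# feature and the target, in the spirit of CANN.
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
weights = corr / corr.sum()  # normalization is an illustrative choice

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One-hidden-layer network whose inputs are scaled by the feature
# weights, so correlated features dominate the fit.
Xw = X * weights
W1 = rng.normal(scale=0.5, size=(2, 8))
W2 = rng.normal(scale=0.5, size=(8, 1))

lr = 0.5
for _ in range(500):
    h = sigmoid(Xw @ W1)           # hidden activations
    p = sigmoid(h @ W2).ravel()    # predicted probabilities
    dp = (p - y) * p * (1 - p)     # grad of MSE through output sigmoid
    dW2 = h.T @ dp[:, None] / len(y)
    dh = dp[:, None] @ W2.T * h * (1 - h)
    dW1 = Xw.T @ dh / len(y)
    W2 -= lr * dW2
    W1 -= lr * dW1

acc = ((p > 0.5) == (y > 0.5)).mean()
```

On this toy task the correlation weight for the informative feature dwarfs the noise feature's weight, so the scaled network effectively ignores the noise — the same effect the paper attributes to giving the learner prior knowledge of feature importance.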
