Sparse neural networks with large learning diversity

Computer Science – Learning

Scientific paper

Coded recurrent neural networks with three levels of sparsity are introduced. The first level is related to the size of messages, which is much smaller than the number of available neurons. The second is provided by a particular coding rule, acting as a local constraint on the neural activity. The third is a characteristic of the low final connection density of the network after the learning phase. Although the proposed network is very simple, being based on binary neurons and binary connections, it is able to learn a large number of messages and recall them, even in the presence of strong erasures. The performance of the network is assessed both as a classifier and as an associative memory.
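The mechanism the abstract describes can be sketched in code. The following is a minimal, illustrative toy, not the authors' implementation: it assumes the network is organized into clusters of binary neurons, that the coding rule activates exactly one neuron per cluster (the local sparsity constraint), that learning stores each message as a clique of binary connections, and that recall from erasures uses a winner-take-all rule per cluster. All names, sizes, and the scoring rule are assumptions made for illustration.

```python
import numpy as np

# Assumed toy dimensions: C clusters of L binary neurons each.
C, L = 4, 16

# Binary connection matrix over all C*L neurons (no weights, only 0/1 links).
W = np.zeros((C * L, C * L), dtype=bool)

def neuron(cluster, symbol):
    """Flat index of the neuron coding 'symbol' in 'cluster'."""
    return cluster * L + symbol

def learn(message):
    """Store a message (one symbol in range(L) per cluster) as a clique
    of binary connections among its C active neurons."""
    idx = [neuron(c, s) for c, s in enumerate(message)]
    for i in idx:
        for j in idx:
            if i != j:
                W[i, j] = True

def recall(partial):
    """Recover a message from a partial one; None marks an erased cluster.
    Each erased cluster picks the neuron most supported by the known
    active neurons (a simple winner-take-all, assumed for illustration)."""
    known = [neuron(c, s) for c, s in enumerate(partial) if s is not None]
    out = list(partial)
    for c, s in enumerate(partial):
        if s is None:
            scores = [W[known, neuron(c, cand)].sum() for cand in range(L)]
            out[c] = int(np.argmax(scores))
    return out

# Store a few hand-picked messages and recall one despite an erasure.
for m in [(0, 1, 2, 3), (4, 5, 6, 7), (8, 9, 10, 11)]:
    learn(m)

print(recall((0, 1, None, 3)))  # → [0, 1, 2, 3]
```

Binary connections make learning a pure OR-accumulation, so the same link can serve many stored cliques; the low density of `W` after learning corresponds to the third level of sparsity mentioned above.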
