Feed-Forward Chains of Recurrent Attractor Neural Networks Near Saturation

Physics – Condensed Matter

Scientific paper


Details

14 pages LaTeX with 4 PostScript figures; submitted to J. Phys. A


10.1088/0305-4470/29/24/011

We perform a stationary-state replica analysis for a layered network of Ising spin neurons, with recurrent Hebbian interactions within each layer, in combination with strictly feed-forward Hebbian interactions between successive layers. This model interpolates between the fully recurrent and symmetric attractor network studied by Amit et al. and the strictly feed-forward attractor network studied by Domany et al. Due to the absence of detailed balance, it is as yet solvable only in the zero-temperature limit. The built-in competition between two qualitatively different modes of operation, feed-forward (ergodic within layers) versus recurrent (non-ergodic within layers), is found to induce interesting phase transitions.
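To make the model structure concrete, a sketch of the two coupling types described above, written in the standard Hebbian form for such attractor networks (the precise normalizations and the relative strength parameter are assumptions here, not taken from the paper):

```latex
% Intra-layer (recurrent, symmetric) Hebbian couplings in layer \ell,
% storing p patterns \xi^{\mu,\ell} of N Ising spins each:
J_{ij}^{\ell} = \frac{1}{N} \sum_{\mu=1}^{p} \xi_i^{\mu,\ell}\,\xi_j^{\mu,\ell}

% Strictly feed-forward Hebbian couplings from layer \ell to layer \ell+1,
% with an assumed relative strength g controlling the competition between
% the two modes of operation:
W_{ij}^{\ell} = \frac{g}{N} \sum_{\mu=1}^{p} \xi_i^{\mu,\ell+1}\,\xi_j^{\mu,\ell}
```

With $g=0$ one recovers a stack of independent symmetric (Hopfield-type) attractor networks; dropping the intra-layer term $J^{\ell}$ instead leaves a purely feed-forward chain, which is the interpolation the abstract describes.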


