Physics – Condensed Matter
Scientific paper
1996-06-27
14 pages, LaTeX, with 4 PostScript figures; submitted to J. Phys. A
10.1088/0305-4470/29/24/011
We perform a stationary state replica analysis for a layered network of Ising spin neurons, with recurrent Hebbian interactions within each layer, in combination with strictly feed-forward Hebbian interactions between successive layers. This model interpolates between the fully recurrent and symmetric attractor network studied by Amit et al, and the strictly feed-forward attractor network studied by Domany et al. Due to the absence of detailed balance, it is as yet solvable only in the zero temperature limit. The built-in competition between two qualitatively different modes of operation, feed-forward (ergodic within layers) versus recurrent (non-ergodic within layers), is found to induce interesting phase transitions.
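The architecture described in the abstract can be sketched numerically. The following is a minimal, hypothetical NumPy illustration (all names and parameter values are our own, not the paper's): each of L layers holds N Ising neurons, Hebbian couplings store p random patterns per layer, interactions within a layer are symmetric and recurrent, and interactions between successive layers are strictly feed-forward. Relaxation is done with parallel zero-temperature updates, matching the zero-temperature limit in which the model is solved.

```python
import numpy as np

# Hypothetical sketch of the layered Ising attractor network: N neurons per
# layer, L layers, p stored patterns per layer (loading p/N well below
# saturation so retrieval is easy to observe).
rng = np.random.default_rng(0)
N, L, p = 200, 3, 5

# Independent random patterns xi[l, mu, i] in {-1, +1} for each layer.
xi = rng.choice([-1, 1], size=(L, p, N))

# Hebbian couplings: J_rec[l] is the symmetric recurrent matrix within layer l;
# J_ff[l] is the strictly feed-forward matrix mapping layer l to layer l+1.
J_rec = [np.einsum('mi,mj->ij', xi[l], xi[l]) / N for l in range(L)]
J_ff = [np.einsum('mi,mj->ij', xi[l + 1], xi[l]) / N for l in range(L - 1)]
for l in range(L):
    np.fill_diagonal(J_rec[l], 0.0)  # no self-interactions

def zero_T_sweep(s):
    """One parallel zero-temperature update: s_i -> sign(local field)."""
    s_new = []
    for l in range(L):
        h = J_rec[l] @ s[l]            # recurrent field from the same layer
        if l > 0:
            h += J_ff[l - 1] @ s[l - 1]  # feed-forward field from layer below
        s_new.append(np.where(h >= 0, 1, -1))
    return s_new

# Start each layer near its own pattern mu=0 and relax to a stationary state.
s = [np.where(xi[l][0] + 0.3 * rng.standard_normal(N) >= 0, 1, -1)
     for l in range(L)]
for _ in range(20):
    s = zero_T_sweep(s)

# Overlap of each layer with its stored pattern mu=0 (the order parameter m).
m = [float(s[l] @ xi[l][0]) / N for l in range(L)]
print(m)  # close to 1 per layer in the retrieval regime at this low loading
```

At this low loading the recurrent (non-ergodic) mode dominates and every layer retrieves its pattern; the phase transitions reported in the paper emerge only near saturation, where the competition between the feed-forward and recurrent fields becomes relevant.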
Coolen Anthony C. C.
Viana L.
Feed-Forward Chains of Recurrent Attractor Neural Networks Near Saturation