Theory of Recurrent Neural Network with Common Synaptic Inputs

Physics – Condensed Matter – Disordered Systems and Neural Networks

Scientific paper

Details

12 pages

DOI: 10.1143/JPSJ.74.2961

We discuss the effects of common synaptic inputs in a recurrent neural network. Because of these common synaptic inputs, the correlations between neural inputs cannot be ignored, and the network therefore exhibits sample dependence. Networks of this type do not have a well-defined thermodynamic limit, and self-averaging breaks down, so a suitable theory must be developed without relying on these properties. While the effects of common synaptic inputs have been analyzed in layered neural networks, analyzing them in recurrent neural networks has been difficult because of the feedback connections. We investigate a sequential associative memory model as an example of a recurrent network and derive a macroscopic dynamical description in the form of a recurrence relation for a probability density function.
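
For readers who want a concrete picture of the model class, the following is a minimal simulation sketch, assuming a standard correlation-type sequential associative memory (asymmetric Hebbian couplings storing a cyclic pattern sequence). The network size, loading rate, and initial noise level are illustrative choices, and the sketch only runs the microscopic dynamics and prints pattern overlaps; it does not implement the paper's density-based macroscopic theory or its treatment of common synaptic inputs.

```python
# Minimal sketch (assumption): a standard correlation-type sequential
# associative memory, simulated to illustrate the kind of recurrent model
# the abstract refers to.
import numpy as np

rng = np.random.default_rng(0)

N = 2000   # number of neurons (illustrative)
p = 100    # number of stored patterns (loading rate alpha = p / N = 0.05)
T = 15     # number of synchronous update steps

# Random binary patterns xi^0 ... xi^{p-1}, stored as a cyclic sequence.
xi = rng.choice([-1.0, 1.0], size=(p, N))

# Asymmetric sequence couplings J_ij = (1/N) * sum_mu xi_i^{mu+1} xi_j^mu,
# which drive the state from pattern mu toward pattern mu+1.
J = (np.roll(xi, -1, axis=0).T @ xi) / N

# Start from a corrupted copy of xi^0 (about 10% of the bits flipped).
s = xi[0] * np.where(rng.random(N) < 0.1, -1.0, 1.0)

for t in range(1, T + 1):
    s = np.sign(J @ s)                # deterministic synchronous update
    m = float(xi[t % p] @ s) / N      # overlap with the target pattern at step t
    print(f"step {t:2d}  overlap with xi^{t % p}: {m:+.3f}")
```

Repeating the run with different random seeds for the pattern draw gives a rough, qualitative sense of the sample-to-sample variation that the paper's theory tracks explicitly through a recurrence relation for a probability density.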
