Nonlinear Sciences – Chaotic Dynamics
Scientific paper
2007-05-25
We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, using a generic Hebbian learning rule that includes passive forgetting and different time scales for the neuronal activity and the learning dynamics. Previous numerical work has reported that Hebbian learning drives the system from chaos to a steady state through a sequence of bifurcations. Here, we interpret these results mathematically and show that these effects, which involve a complex coupling between the neuronal dynamics and the synaptic graph structure, can be analyzed using Jacobian matrices, which provide both a structural and a dynamical point of view on the evolution of the neural network. Furthermore, we show that the sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to zero. We discuss how neural networks may take advantage of this regime of high functional interest.
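The setting described in the abstract can be illustrated with a small numerical sketch: a discrete-time random recurrent network x ← tanh(W x), a generic Hebbian rule with passive forgetting of the form W ← λW + (α/N) x xᵀ, and an estimate of the largest Lyapunov exponent obtained by iterating the Jacobians J_t = diag(1 − tanh²(W x_t)) W along the trajectory. All parameter values (N, the gain g, λ, α, the number of learning epochs) are illustrative choices, not the paper's; this is a minimal toy model of the mechanism, not the authors' implementation.

```python
import math
import random

random.seed(0)
N = 16        # network size (illustrative)
g = 3.0       # gain; a large gain puts the untrained network in the chaotic regime
lam = 0.99    # passive forgetting rate (lambda < 1)
alpha = 0.01  # learning rate; learning is slower than the neuronal dynamics

# Random recurrent weight matrix with entries of standard deviation g / sqrt(N)
W = [[random.gauss(0.0, g / math.sqrt(N)) for _ in range(N)] for _ in range(N)]
x = [random.uniform(-1.0, 1.0) for _ in range(N)]

def step(W, x):
    """One discrete-time update x_{t+1} = tanh(W x_t); also return the field u = W x_t."""
    u = [sum(W[i][j] * x[j] for j in range(N)) for i in range(N)]
    return [math.tanh(ui) for ui in u], u

def largest_lyapunov(W, x, T=500):
    """Estimate the largest Lyapunov exponent by propagating a tangent vector
    through the Jacobians J_t = diag(1 - tanh(u_t)^2) W and averaging log growth."""
    v = [random.gauss(0.0, 1.0) for _ in range(N)]
    acc = 0.0
    for _ in range(T):
        x, u = step(W, x)
        d = [1.0 - math.tanh(ui) ** 2 for ui in u]      # diagonal of the Jacobian
        v = [d[i] * sum(W[i][j] * v[j] for j in range(N)) for i in range(N)]
        norm = math.sqrt(sum(vi * vi for vi in v))
        acc += math.log(norm)
        v = [vi / norm for vi in v]                     # renormalize the tangent vector
    return acc / T

lyap_before = largest_lyapunov(W, list(x))

# Hebbian learning epochs: neuronal dynamics run on a faster time scale than
# the synaptic update W <- lam * W + (alpha / N) * x x^T (passive forgetting).
for epoch in range(200):
    for _ in range(10):
        x, _ = step(W, x)
    for i in range(N):
        for j in range(N):
            W[i][j] = lam * W[i][j] + (alpha / N) * x[i] * x[j]

lyap_after = largest_lyapunov(W, list(x))
print(f"largest Lyapunov exponent before learning: {lyap_before:+.3f}")
print(f"largest Lyapunov exponent after learning:  {lyap_after:+.3f}")
```

With these (illustrative) parameters, passive forgetting progressively contracts the weights, so the exponent typically decreases toward and below zero over learning, in line with the chaos-to-steady-state route the abstract summarizes.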
Hugues Berry
Bruno Cessac
Bruno Delord
Mathias Quoy
Benoit Siri
A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks