Analysis of Natural Gradient Descent for Multilayer Neural Networks
Physics – Condensed Matter – Disordered Systems and Neural Networks
Scientific paper
1999-01-21
14 pages including figures. To appear in Physical Review E
10.1103/PhysRevE.59.4523
Natural gradient descent is a principled method for adapting the parameters of a statistical model on-line, using an underlying Riemannian parameter space to redefine the direction of steepest descent. The algorithm is examined via methods of statistical physics which accurately characterize both transient and asymptotic behavior. A solution of the learning dynamics is obtained for the case of multilayer neural network training in the limit of large input dimension. We find that natural gradient learning leads to optimal asymptotic performance and outperforms gradient descent during the transient, significantly shortening or even removing the plateaus in generalization performance which typically hamper gradient descent training.
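As a loose illustration of the update rule described in the abstract (a step in the direction of the inverse Fisher metric times the error gradient), the following is a minimal numerical sketch, not the paper's statistical-mechanics calculation: a soft committee machine student trained on-line on a fixed teacher, with the Fisher matrix estimated by Monte Carlo over the Gaussian input distribution. The small input dimension, learning rate, ridge term, and all variable names are illustrative assumptions.

import numpy as np
from scipy.special import erf

rng = np.random.default_rng(0)
N, K = 10, 2                 # input dimension and hidden units (kept small; illustrative)
eta = 0.02                   # learning rate (illustrative)

# Teacher: soft committee machine with orthonormal weight vectors (assumed task)
B = np.linalg.qr(rng.standard_normal((N, K)))[0].T
# Student: same architecture, random initialization
W = rng.standard_normal((K, N)) / np.sqrt(N)

def g(a):                    # hidden-unit activation g(a) = erf(a / sqrt(2))
    return erf(a / np.sqrt(2.0))

def dg(a):                   # its derivative
    return np.sqrt(2.0 / np.pi) * np.exp(-0.5 * a ** 2)

def output(V, x):
    return g(V @ x).sum()

def model_grad(V, x):
    # gradient of the network output f(x) with respect to the flattened weights
    return (dg(V @ x)[:, None] * x[None, :]).ravel()

def fisher(V, n_samples=500, ridge=1e-3):
    # Monte-Carlo estimate of the Fisher matrix G = E_x[grad f grad f^T]
    # for Gaussian inputs and a unit-variance Gaussian noise model;
    # the small ridge keeps the estimate invertible
    G = np.zeros((K * N, K * N))
    for _ in range(n_samples):
        J = model_grad(V, rng.standard_normal(N))
        G += np.outer(J, J)
    return G / n_samples + ridge * np.eye(K * N)

def gen_error(V, n=2000):
    # Monte-Carlo estimate of the generalization error 0.5 E_x[(student - teacher)^2]
    xs = rng.standard_normal((n, N))
    return 0.5 * np.mean([(output(V, x) - output(B, x)) ** 2 for x in xs])

for t in range(50001):
    if t % 2000 == 0:
        G_inv = np.linalg.inv(fisher(W))      # periodically refresh the metric
        print(f"step {t:6d}   generalization error ~ {gen_error(W):.5f}")
    x = rng.standard_normal(N)                # one on-line example per step
    delta = output(W, x) - output(B, x)       # student output minus teacher output
    # natural gradient step: eta * G^{-1} * gradient of the squared error 0.5*delta^2
    W -= eta * (G_inv @ (delta * model_grad(W, x))).reshape(K, N)

Replacing G_inv with the identity matrix in the last line recovers plain on-line gradient descent, which makes the transient comparison discussed in the abstract easy to reproduce under these assumptions.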
Magnus Rattray
David Saad