Computer Science – Neural and Evolutionary Computing
Scientific paper
2007-10-23
The Cellular Simultaneous Recurrent Network (SRN) has been shown to be a more powerful function approximator than the MLP: for some problems the complexity of an MLP would be prohibitively large, while an SRN can realize the desired mapping within acceptable computational constraints. The speed of training is crucial to the successful application of complex recurrent networks. The present work improves on previous results by training the network with an extended Kalman filter (EKF). We implemented a generic Cellular SRN and applied it to two challenging problems: 2D maze navigation and a subset of the connectedness problem. Compared with earlier results, the speed of convergence improved by several orders of magnitude for maze navigation, and superior generalization was demonstrated for connectedness. The implications of these improvements are discussed.
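The EKF training idea mentioned in the abstract treats the network weights as the state of a Kalman filter and the network output as the (nonlinear) measurement. A minimal sketch follows, using a toy one-layer tanh unit rather than the paper's Cellular SRN; the function names, noise parameters `R` and `Q`, and the numerical Jacobian are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def net(w, x):
    # Toy network (stand-in for the SRN): scalar output y = tanh(w . x).
    return np.tanh(w @ x)

def jacobian(w, x, eps=1e-6):
    # Numerical Jacobian of the output w.r.t. the weights (1 x n).
    # A real implementation would use backpropagated derivatives instead.
    H = np.zeros((1, w.size))
    for i in range(w.size):
        dw = np.zeros_like(w)
        dw[i] = eps
        H[0, i] = (net(w + dw, x) - net(w - dw, x)) / (2 * eps)
    return H

def ekf_step(w, P, x, y, R=1e-2, Q=1e-6):
    # One EKF update: w is the state (the weights), P its covariance.
    H = jacobian(w, x)                       # measurement Jacobian
    S = H @ P @ H.T + R                      # innovation covariance (1 x 1)
    K = P @ H.T / S                          # Kalman gain (n x 1)
    w = w + (K * (y - net(w, x))).ravel()    # weight update from the residual
    P = P - K @ H @ P + Q * np.eye(w.size)   # covariance update
    return w, P

# Usage: recover a fixed target weight vector from noiseless samples.
rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3])
w = np.zeros(2)
P = np.eye(2)
for _ in range(200):
    x = rng.standard_normal(2)
    w, P = ekf_step(w, P, x, net(w_true, x))
```

Because the EKF carries second-order information in the covariance `P`, each weight update uses all past observations, which is one reason EKF training can converge in far fewer presentations than plain gradient descent.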
Roman Ilin
Robert Kozma
Paul J. Werbos
Beyond Feedforward Models Trained by Backpropagation: a Practical Training Tool for a More Efficient Universal Approximator