Physics – Condensed Matter
Scientific paper
1993-08-04
14 pages, OKHEP 93-004
10.1016/0375-9601(94)90570-3
Neural networks with synaptic weights constructed according to the weighted Hebb rule, a variant of the familiar Hebb rule, are studied in the presence of noise (finite temperature), when the number of stored patterns is finite and in the limit that the number of neurons $N\rightarrow\infty$. Because different patterns enter the synaptic rule with different weights, the configuration of the free-energy surface is altered. For a general choice of weights, not all of the patterns are stored as {\sl global} minima of the free-energy function. However, as for the usual Hebb rule, there exists a temperature range in which only the stored patterns are minima of the free energy. In particular, when a single extra pattern is stored with an appropriate weight in the synaptic rule, the temperature at which the spurious minima of the free energy are eliminated is significantly lower than for a similar network without this extra pattern. The convergence time of the network, together with the overlaps of the network's equilibria with the stored patterns, can thereby be improved considerably.
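A minimal sketch of the weighted Hebb construction described in the abstract, assuming a standard Hopfield-style network of $\pm 1$ neurons with Glauber (finite-temperature) single-spin dynamics. The synaptic matrix is $J_{ij} = \frac{1}{N}\sum_\mu w_\mu \xi_i^\mu \xi_j^\mu$; the pattern count, weight values, and temperature below are illustrative choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 200, 3                                  # neurons, stored patterns
patterns = rng.choice([-1, 1], size=(P, N))    # random binary patterns xi^mu
weights = np.array([1.0, 1.0, 0.5])            # hypothetical per-pattern weights w_mu

# Weighted Hebb rule: J_ij = (1/N) * sum_mu w_mu * xi_i^mu * xi_j^mu
J = (weights[:, None, None] * patterns[:, :, None] * patterns[:, None, :]).sum(0) / N
np.fill_diagonal(J, 0.0)                       # no self-coupling

def glauber_step(s, J, beta):
    """One stochastic update at inverse temperature beta (noise = 1/beta)."""
    i = rng.integers(len(s))
    h = J[i] @ s                               # local field on neuron i
    p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
    s[i] = 1 if rng.random() < p_up else -1
    return s

# Start from pattern 0 corrupted in 10% of its spins, then relax.
s = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
s[flip] *= -1

beta = 4.0                                     # low temperature, illustrative
for _ in range(20 * N):
    s = glauber_step(s, J, beta)

overlap = (s @ patterns[0]) / N                # overlap m with the stored pattern
```

At low temperature the network relaxes to a state with large overlap with the retrieved pattern; raising the temperature (lowering `beta`) adds the noise whose effect on the free-energy minima the paper analyzes.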
Caren Marzban
Raju Viswanathan