Abstract

An adaptive neural network with asymmetric connections is introduced. This network is related to the Hopfield network with graded neurons and uses a recurrent generalization of the δ rule of Rumelhart, Hinton, and Williams to adaptively modify the synaptic weights. The new network resembles the master/slave network of Lapedes and Farber but is architecturally simpler.
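The abstract describes what is now known as recurrent backpropagation: the network is relaxed to a fixed point of its forward dynamics, an adjoint (error-propagation) system is relaxed to its own fixed point, and the two fixed points together give the weight gradient. The sketch below is a minimal NumPy illustration of that scheme, not the paper's exact formulation; the network size, the choice of output unit, the tanh activation, and all hyperparameters are hypothetical choices made for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4                                   # hypothetical tiny network; unit 3 is the output
out = [3]
W = rng.normal(scale=0.2, size=(n, n))  # small weights keep the dynamics contracting
I = rng.normal(scale=0.5, size=n)       # fixed external input
target = np.array([0.6])

def relax(W, I, iters=200):
    """Iterate the forward dynamics x = g(Wx + I) to a fixed point (g = tanh)."""
    x = np.zeros(n)
    for _ in range(iters):
        x = np.tanh(W @ x + I)
    return x

lr = 0.3
errs = []
for step in range(100):
    x = relax(W, I)
    gp = 1.0 - x**2                     # g'(u) for g = tanh, since x = tanh(u)
    e = np.zeros(n)
    e[out] = x[out] - target            # dE/dx on output units, E = (1/2)(x_out - t)^2
    z = np.zeros(n)
    for _ in range(200):                # relax the adjoint system z = e + W^T (g' * z)
        z = e + W.T @ (gp * z)
    W -= lr * np.outer(gp * z, x)       # gradient descent: dE/dW_rs = g'(u_r) z_r x_s
    errs.append(float(0.5 * np.sum((x[out] - target)**2)))

print(errs[0], errs[-1])                # the output error should shrink over training
```

Both relaxations are plain fixed-point iterations, which converge here because the small weight scale keeps the linearized dynamics contracting; the adjoint pass is the "recurrent generalization of the δ rule" in the sense that it propagates output error backward through the same connectivity, transposed.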

Keywords

Generalization, Hopfield network, Artificial neural network, Computer science, Recurrent neural network, Artificial intelligence, Backpropagation, Types of artificial neural networks, Mathematics

Publication Info

Year: 1987
Type: article
Volume: 59
Issue: 19
Pages: 2229-2232
Citations: 949
Access: Closed

Citation Metrics

Citations: 949 (OpenAlex)

Cite This

Fernando J. Pineda (1987). Generalization of back-propagation to recurrent neural networks. Physical Review Letters, 59(19), 2229-2232. https://doi.org/10.1103/physrevlett.59.2229

Identifiers

DOI: 10.1103/physrevlett.59.2229