Abstract
An adaptive neural network with asymmetric connections is introduced. This network is related to the Hopfield network with graded neurons and uses a recurrent generalization of the δ rule of Rumelhart, Hinton, and Williams to adaptively modify the synaptic weights. The new network bears a resemblance to the master/slave network of Lapedes and Farber, but it is architecturally simpler.
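As a rough illustration of the kind of learning rule the abstract describes, here is a minimal NumPy sketch of recurrent backpropagation on a small asymmetric network: the graded-neuron dynamics are relaxed to a fixed point, a second linear (adjoint) system is relaxed the same way to propagate the output error, and the weights are updated by gradient descent. The network size, tanh response function, Euler integrator, output-unit indices, and all numeric constants are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8                        # number of units (illustrative choice)
out = np.array([6, 7])       # units treated as outputs (assumption)

W = 0.1 * rng.standard_normal((n, n))   # asymmetric weights: no W = W^T constraint
I = np.zeros(n)
I[:2] = 1.0                  # external input clamped on two units (assumption)

g = np.tanh                  # graded (sigmoid-like) response function
dg = lambda u: 1.0 - np.tanh(u) ** 2

def relax(f, x0, dt=0.1, steps=2000):
    """Euler-integrate dx/dt = f(x); assumes the dynamics settle to a stable fixed point."""
    x = x0
    for _ in range(steps):
        x = x + dt * f(x)
    return x

# Forward pass: relax dx/dt = -x + g(Wx) + I to its fixed point x*.
x_star = relax(lambda x: -x + g(W @ x) + I, np.zeros(n))

# Error on the output units only.
target = np.array([0.5, -0.5])
e = np.zeros(n)
e[out] = target - x_star[out]

# Backward pass: the adjoint y* solves (I - W^T diag(g'(u))) y = e,
# obtained by relaxing a second, purely linear dynamical system.
u = W @ x_star
y_star = relax(lambda y: -y + W.T @ (dg(u) * y) + e, np.zeros(n))

# Weight update: gradient descent on the squared output error,
# an outer product of the adjoint signal and the fixed-point activity.
eta = 0.1
W += eta * np.outer(dg(u) * y_star, x_star)
```

Because the backward pass is itself a relaxation of a local linear system rather than an unrolling through time or a separate teacher network, this sketch also suggests why such a scheme can be architecturally simpler than a master/slave construction.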
Publication Info
- Year: 1987
- Type: article
- Volume: 59
- Issue: 19
- Pages: 2229-2232
- Citations: 949
- Access: Closed
Identifiers
- DOI: 10.1103/physrevlett.59.2229