Abstract

A general method for deriving backpropagation algorithms for networks with recurrent and higher-order connections is introduced. The propagation of activation in these networks is determined by dissipative differential equations, and the error signal is backpropagated by integrating an associated differential equation. The method is introduced by applying it to the recurrent generalization of the feedforward backpropagation network, and is then extended to higher-order networks and to a constrained dynamical system for training a content-addressable memory. The essential feature of the adaptive algorithms is that the adaptive equation has a simple outer-product form. Preliminary experiments suggest that learning can occur very rapidly in networks with recurrent connections. The continuous formalism also makes the new approach well suited to implementation in VLSI.
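
For readers who want the mechanics the abstract alludes to, the sketch below is a minimal NumPy illustration of the fixed-point recurrent backpropagation scheme commonly attributed to this paper: activations relax under a dissipative ODE, the error relaxes under an associated adjoint ODE, and the weight change is an outer product of the two relaxed states. The sigmoid nonlinearity, the forward-Euler relaxation, the step sizes, and all function names are illustrative assumptions, not details taken from the paper itself.

```python
import numpy as np

def g(u):
    """Logistic sigmoid activation (an illustrative choice)."""
    return 1.0 / (1.0 + np.exp(-u))

def g_prime(u):
    s = g(u)
    return s * (1.0 - s)

def relax(f, x0, dt=0.1, steps=1000):
    """Forward-Euler integration of dx/dt = f(x) until it has (approximately) settled."""
    x = x0.copy()
    for _ in range(steps):
        x = x + dt * f(x)
    return x

def recurrent_backprop_step(W, I, target, out_idx, lr=0.1):
    """One weight update for a fully recurrent network with external input I.

    Forward pass:  relax  dx/dt = -x + g(W x) + I   to its fixed point x*.
    Error pass:    relax  dz/dt = -z + W^T (g'(u) * z) + J   to z*,
                   where u = W x* and J injects the output error.
    Update:        dW_rs = lr * g'(u_r) * z*_r * x*_s   (outer-product form).
    """
    n = W.shape[0]
    x_star = relax(lambda x: -x + g(W @ x) + I, np.zeros(n))
    u = W @ x_star
    J = np.zeros(n)
    J[out_idx] = target - x_star[out_idx]            # error at output units only
    z_star = relax(lambda z: -z + W.T @ (g_prime(u) * z) + J, np.zeros(n))
    W += lr * np.outer(g_prime(u) * z_star, x_star)  # outer-product update
    return W, 0.5 * np.sum(J ** 2)

# Toy usage: clamp two "input" units, read error from two "output" units.
rng = np.random.default_rng(0)
n = 6
W = 0.1 * rng.standard_normal((n, n))
I = np.zeros(n)
I[[0, 1]] = [0.5, -0.5]
for _ in range(200):
    W, err = recurrent_backprop_step(W, I, target=np.array([0.9, 0.1]),
                                     out_idx=np.array([4, 5]))
```

Note how the update for each weight touches only quantities local to the two units it connects, which is what makes the outer-product form attractive for VLSI implementation.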

Keywords

Backpropagation, Computer science, Dissipative system, Artificial neural network, Generalization, Feed forward, Feedforward neural network, Recurrent neural network, Differential equation, Algorithm, Artificial intelligence, Theoretical computer science, Mathematics, Control engineering, Mathematical analysis, Engineering

Related Publications

Long Short-Term Memory

Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We brief...

1997 · Neural Computation · 90,535 citations

Network In Network

Abstract: We propose a novel deep network structure called "Network In Network" (NIN) to enhance model discriminability for local patches within the receptive field. The conventional con...

2014 · arXiv (Cornell University) · 1,037 citations

Publication Info

Year: 1987
Type: Article
Pages: 602-611
Citations: 123
Access: Closed

Citation Metrics

123 citations (OpenAlex)

Cite This

Fernando J. Pineda (1987). Generalization of Back-Propagation to Recurrent and Higher Order Neural Networks. Neural Information Processing Systems, 602-611.