Abstract
This special issue illustrates both the scientific trends of early work on recurrent neural networks and the mathematics of training when at least some recurrent terms of the network derivatives are non-zero. What follows is a brief description of each paper, organized into two parts: the first contains the papers that are mainly theoretical, and the second contains the papers that are mainly applications. Papers are ordered alphabetically by first author.
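As a minimal illustrative sketch (the notation here is ours, not drawn from any paper in the issue), consider a simple recurrent state update $h_t = f(W h_{t-1} + U x_t)$ with total loss $\mathcal{L} = \sum_t \mathcal{L}_t$. Backpropagation through time then gives

$$
\frac{\partial \mathcal{L}}{\partial W}
= \sum_{t=1}^{T} \frac{\partial \mathcal{L}}{\partial h_t}\,
  \frac{\partial^{+} h_t}{\partial W},
\qquad
\frac{\partial \mathcal{L}}{\partial h_t}
= \frac{\partial \mathcal{L}_t}{\partial h_t}
+ \frac{\partial \mathcal{L}}{\partial h_{t+1}}\,
  \frac{\partial h_{t+1}}{\partial h_t},
$$

where $\partial^{+}$ denotes the immediate partial derivative and the Jacobian $\partial h_{t+1}/\partial h_t = \operatorname{diag}\!\big(f'(\cdot)\big)\,W$ is precisely the recurrent term that is identically zero in a purely feedforward network.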
Publication Info
- Year: 1994
- Type: article
- Volume: 5
- Issue: 2
- Pages: 153-156
- Citations: 199
- Access: Closed
Identifiers
- DOI: 10.1109/tnn.1994.8753425