Related Publications
Back Propagation is Sensitive to Initial Conditions
This paper explores the effect of initial weight selection on feed-forward networks learning simple functions with the back-propagation technique. We first demonstrate, through ...
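For illustration only (this is not the paper's experiment): the sketch below trains the same small feed-forward network on XOR with plain back-propagation from several random weight initializations and prints the final errors, showing how the outcome depends on the starting weights. The network size, seeds, learning rate, and weight scales are arbitrary choices for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR inputs and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def train(seed, scale, epochs=5000, lr=0.5):
    rng = np.random.default_rng(seed)
    W1 = rng.uniform(-scale, scale, (2, 2))   # input -> hidden weights
    b1 = np.zeros(2)
    W2 = rng.uniform(-scale, scale, (2, 1))   # hidden -> output weights
    b2 = np.zeros(1)
    for _ in range(epochs):
        H = sigmoid(X @ W1 + b1)              # forward pass
        Y = sigmoid(H @ W2 + b2)
        dY = (Y - T) * Y * (1 - Y)            # output delta (squared error, sigmoid)
        dH = (dY @ W2.T) * H * (1 - H)        # hidden delta
        W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(axis=0)
        W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(axis=0)
    return float(((Y - T) ** 2).mean())

# Same task, same algorithm; only the initial conditions differ.
for seed in range(5):
    for scale in (0.1, 1.0, 3.0):
        print(f"seed={seed} scale={scale} final MSE={train(seed, scale):.4f}")
```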
Conjugate gradient algorithm for efficient training of artificial neural networks
A novel approach is presented for the training of multilayer feedforward neural networks, using a conjugate gradient algorithm incorporating an appropriate line search algorithm...
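As a rough sketch of the general approach rather than the paper's specific algorithm, the snippet below flattens the weights of a tiny network and hands the squared-error objective plus its back-propagated gradient to SciPy's nonlinear conjugate-gradient optimizer (scipy.optimize.minimize with method="CG"), which performs its own line search internally. The network shape, targets, and optimizer choice are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize

# XOR with +/-1 targets and tanh units.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[-1], [1], [1], [-1]], dtype=float)

def unpack(w):
    W1 = w[0:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8].reshape(2, 1); b2 = w[8:9]
    return W1, b1, W2, b2

def loss_and_grad(w):
    W1, b1, W2, b2 = unpack(w)
    H = np.tanh(X @ W1 + b1)                 # forward pass
    Y = np.tanh(H @ W2 + b2)
    E = Y - T
    loss = 0.5 * float((E ** 2).sum())
    dY = E * (1 - Y ** 2)                    # back-propagated deltas
    dH = (dY @ W2.T) * (1 - H ** 2)
    grad = np.concatenate([(X.T @ dH).ravel(), dH.sum(axis=0),
                           (H.T @ dY).ravel(), dY.sum(axis=0)])
    return loss, grad

w0 = np.random.default_rng(0).uniform(-0.5, 0.5, 9)
result = minimize(loss_and_grad, w0, jac=True, method="CG")
print("converged:", result.success, "final loss:", result.fun)
```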
Generalization of Back propagation to Recurrent and Higher Order Neural Networks
A general method for deriving backpropagation algorithms for recurrent and higher-order networks is introduced. The propagation of activation in these networks is ...
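Purely to make error propagation through recurrent connections concrete (the cited paper derives a more general scheme that is not reproduced here), below is a plain backpropagation-through-time loop for a tiny vanilla recurrent network on an invented running-sum task; all sizes, the task, and the learning rate are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, T_len = 1, 4, 8
Wxh = rng.normal(0, 0.5, (n_in, n_hid))       # input -> hidden
Whh = rng.normal(0, 0.5, (n_hid, n_hid))      # hidden -> hidden (recurrent)
Why = rng.normal(0, 0.5, (n_hid, 1))          # hidden -> output

xs = rng.normal(size=(T_len, n_in))
targets = np.cumsum(xs, axis=0)               # toy task: running sum of the input

for step in range(500):
    hs = [np.zeros(n_hid)]
    ys = []
    for t in range(T_len):                    # forward pass through time
        h = np.tanh(xs[t] @ Wxh + hs[-1] @ Whh)
        hs.append(h)
        ys.append(h @ Why)
    dWxh = np.zeros_like(Wxh); dWhh = np.zeros_like(Whh); dWhy = np.zeros_like(Why)
    dh_next = np.zeros(n_hid)
    for t in reversed(range(T_len)):          # backward pass through time
        dy = ys[t] - targets[t]
        dWhy += np.outer(hs[t + 1], dy)
        dh = Why @ dy + dh_next               # error from output and from the future
        dz = dh * (1 - hs[t + 1] ** 2)
        dWxh += np.outer(xs[t], dz)
        dWhh += np.outer(hs[t], dz)
        dh_next = Whh @ dz
    for W, dW in ((Wxh, dWxh), (Whh, dWhh), (Why, dWhy)):
        W -= 0.01 * dW                        # gradient step on all weight matrices

final_loss = float(np.sum([(y - tgt) ** 2 for y, tgt in zip(ys, targets)]))
print("final loss:", final_loss)
```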
A direct adaptive method for faster backpropagation learning: the RPROP algorithm
A learning algorithm for multilayer feedforward networks, RPROP (resilient propagation), is proposed. To overcome the inherent disadvantages of pure gradient-descent, RPROP perf...
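A minimal sketch of the sign-based scheme the abstract describes, not the paper's reference implementation: each weight keeps its own step size, which grows while the gradient sign stays the same and shrinks when it flips, and only the sign of the gradient, never its magnitude, sets the update direction. The function name, hyperparameter values, and the toy quadratic are assumptions for the example.

```python
import numpy as np

def rprop_update(w, grad, prev_grad, step,
                 eta_plus=1.2, eta_minus=0.5, step_max=50.0, step_min=1e-6):
    """One RPROP-style update on flat parameter/gradient arrays."""
    sign_change = grad * prev_grad
    # Grow the step while the gradient sign is stable, shrink it on a flip.
    step = np.where(sign_change > 0, np.minimum(step * eta_plus, step_max), step)
    step = np.where(sign_change < 0, np.maximum(step * eta_minus, step_min), step)
    grad = np.where(sign_change < 0, 0.0, grad)   # common variant: skip the update right after a flip
    w = w - np.sign(grad) * step                  # only the sign drives the move
    return w, grad, step

# Toy usage: minimize a badly scaled quadratic, where sign-based steps help.
w = np.array([5.0, -3.0])
prev_grad = np.zeros_like(w)
step = np.full_like(w, 0.1)
for _ in range(100):
    grad = np.array([2 * w[0], 2000 * w[1]])      # gradient of w0^2 + 1000*w1^2
    w, prev_grad, step = rprop_update(w, grad, prev_grad, step)
print("solution (should be near zero):", w)
```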
Recent advances in physical reservoir computing: A review
Reservoir computing is a computational framework suited for temporal/sequential data processing. It is derived from several recurrent neural network models, including echo state...
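To make the reservoir-computing idea concrete, here is a hedged echo-state-network sketch, one of the recurrent models the abstract mentions: a fixed random recurrent reservoir is driven by the input signal, and only a linear readout is trained, here by ridge regression on a toy one-step-ahead prediction task. The reservoir size, spectral radius, and ridge penalty are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T_len = 100, 1000
u = np.sin(0.2 * np.arange(T_len + 1))            # toy input signal
target = u[1:]                                    # task: one-step-ahead prediction

W_in = rng.uniform(-0.5, 0.5, N)                  # fixed input weights
W = rng.normal(0, 1.0, (N, N))                    # fixed recurrent reservoir
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

# Drive the reservoir and collect its states.
states = np.zeros((T_len, N))
x = np.zeros(N)
for t in range(T_len):
    x = np.tanh(W_in * u[t] + W @ x)
    states[t] = x

# Train only the linear readout with ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(N), states.T @ target)
pred = states @ W_out
print("train MSE:", float(((pred - target) ** 2).mean()))
```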
Publication Info
- Year: 1990
- Type: book-chapter
- Pages: 100-109
- Citations: 48
- Access: Closed
Identifiers
- DOI: 10.1007/3-540-52255-7_31