Related Publications
Universal Approximation using Incremental Constructive Feedforward Networks with Random Hidden Nodes
According to conventional neural network theories, single-hidden-layer feedforward networks (SLFNs) with additive or radial basis function (RBF) hidden nodes are universal appro...
Conjugate gradient algorithm for efficient training of artificial neural networks
A novel approach is presented for the training of multilayer feedforward neural networks, using a conjugate gradient algorithm incorporating an appropriate line search algorithm...
Back Propagation is Sensitive to Initial Conditions
This paper explores the effect of initial weight selection on feed-forward networks learning simple functions with the back-propagation technique. We first demonstrate, through ...
An adaptive least squares algorithm for the efficient training of artificial neural networks
A novel learning algorithm is developed for the training of multilayer feedforward neural networks, based on a modification of the Marquardt-Levenberg least-squares optimization...
A direct adaptive method for faster backpropagation learning: the RPROP algorithm
A learning algorithm for multilayer feedforward networks, RPROP (resilient propagation), is proposed. To overcome the inherent disadvantages of pure gradient-descent, RPROP perf...
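The core idea the RPROP abstract describes, adapting each weight's step size from the sign of its gradient rather than its magnitude, can be sketched as follows. This is a minimal illustration on a toy quadratic, not the paper's exact algorithm; the function names and hyperparameter defaults (`eta_plus = 1.2`, `eta_minus = 0.5`, etc.) are assumptions chosen to match commonly cited RPROP settings.

```python
import numpy as np

def rprop(grad_fn, w, steps=100, delta0=0.1,
          eta_plus=1.2, eta_minus=0.5,
          delta_min=1e-6, delta_max=50.0):
    """Minimal RPROP-style update: per-weight step sizes grow when the
    gradient sign is stable and shrink when it flips; only the sign of
    the gradient (never its magnitude) determines the weight step."""
    delta = np.full_like(w, delta0)        # per-weight step sizes
    prev_grad = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)
        agree = prev_grad * g              # >0: same sign, <0: sign flipped
        delta = np.where(agree > 0, np.minimum(delta * eta_plus, delta_max), delta)
        delta = np.where(agree < 0, np.maximum(delta * eta_minus, delta_min), delta)
        g = np.where(agree < 0, 0.0, g)    # skip the update right after a sign change
        w = w - np.sign(g) * delta
        prev_grad = g
    return w

# Toy problem: minimize f(w) = (w - 3)^2, whose gradient is 2(w - 3).
w_opt = rprop(lambda w: 2.0 * (w - 3.0), np.array([0.0, 10.0]))
```

Because only the gradient's sign is used, the method sidesteps the vanishing step sizes that plague pure gradient descent when gradient magnitudes are small.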
Publication Info
- Year: 1989
- Type: article
- Citations: 9346
- Access: Closed
Identifiers
- DOI: 10.5555/70405.70408