Related Publications
Universal Approximation using Incremental Constructive Feedforward Networks with Random Hidden Nodes
According to conventional neural network theories, single-hidden-layer feedforward networks (SLFNs) with additive or radial basis function (RBF) hidden nodes are universal appro...
Using random weights to train multilayer networks of hard-limiting units
A gradient descent algorithm suitable for training multilayer feedforward networks of processing units with hard-limiting output functions is presented. The conventional backpro...
A direct adaptive method for faster backpropagation learning: the RPROP algorithm
A learning algorithm for multilayer feedforward networks, RPROP (resilient propagation), is proposed. To overcome the inherent disadvantages of pure gradient-descent, RPROP perf...
Training feedforward networks with the Marquardt algorithm
The Marquardt algorithm for nonlinear least squares is presented and is incorporated into the backpropagation algorithm for training feedforward neural networks. The algorithm i...
Publication Info
- Year: 1990
- Type: article
- Volume: 3
- Issue: 5
- Pages: 551-560
- Citations: 2076
- Access: Closed
Identifiers
- DOI: 10.1016/0893-6080(90)90005-6