Related Publications
Universal Approximation using Incremental Constructive Feedforward Networks with Random Hidden Nodes
According to conventional neural network theories, single-hidden-layer feedforward networks (SLFNs) with additive or radial basis function (RBF) hidden nodes are universal appro...
Learning nonlinear operators via DeepONet based on the universal approximation theorem of operators
It is widely known that neural networks (NNs) are universal approximators of continuous functions. However, a less known but powerful result is that a NN with a single hidden la...
Back Propagation is Sensitive to Initial Conditions
This paper explores the effect of initial weight selection on feed-forward networks learning simple functions with the back-propagation technique. We first demonstrate, through ...
Computing with Neural Circuits: A Model
A new conceptual framework and a minimization principle together provide an understanding of computation in model neural circuits. The circuits consist of nonlinear graded-respo...
Publication Info
- Year: 1989
- Type: article
- Volume: 2
- Issue: 5
- Pages: 359-366
- Citations: 20245
- Access: Closed
Identifiers
- DOI: 10.1016/0893-6080(89)90020-8