Abstract
A learning algorithm is presented that uses internal representations, which are continuous random variables, for the training of multilayer networks whose neurons have Heaviside characteristics. This algorithm is an improvement in that it is applicable to networks with any number of layers of variable weights and does not require 'bit flipping' on the internal representations to reduce output error. The algorithm is extended to apply to recurrent networks. Some illustrative results are given.
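To make the setting concrete, the sketch below (not the authors' algorithm) shows a tiny two-layer network with Heaviside units computing XOR. The weights are hand-chosen and purely illustrative. Because the Heaviside function has zero derivative almost everywhere, plain backpropagation cannot train such a network, which is what motivates treating the binary hidden activations (the "internal representations") as continuous random variables during training:

```python
import numpy as np

# Heaviside (hard-threshold) activation: outputs 0/1 and has zero
# derivative almost everywhere, so gradient-based training cannot
# propagate error through it directly.
def heaviside(x):
    return (x >= 0).astype(float)

# Forward pass of a two-layer Heaviside network. The hidden vector h
# is the binary "internal representation" that methods like the one
# in the abstract replace with continuous random variables.
def forward(x, W1, b1, W2, b2):
    h = heaviside(W1 @ x + b1)   # binary internal representation
    y = heaviside(W2 @ h + b2)   # binary output
    return h, y

# Hand-chosen weights solving XOR: hidden unit 0 computes OR,
# hidden unit 1 computes AND, and the output is OR AND NOT AND.
W1 = np.array([[1.0, 1.0], [1.0, 1.0]])
b1 = np.array([-0.5, -1.5])
W2 = np.array([[1.0, -1.0]])
b2 = np.array([-0.5])

for x, target in [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]:
    _, y = forward(np.array(x, dtype=float), W1, b1, W2, b2)
    assert y[0] == target  # network reproduces XOR
```

The hand-set weights sidestep the training problem the paper addresses; the point is only that every signal passing between layers is binary, so no useful gradient exists with respect to the weights.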
Related Publications
Inserting rules into recurrent neural networks
The authors present a method that incorporates a priori knowledge in the training of recurrent neural networks. This a priori knowledge can be interpreted as hints about the pro...
Self-organising multilayer topographic mappings
Minimization of distortion measures requires multilayer mappings to be topographic. The author shows this only for tree-like multilayer networks. He also shows how to modify the...
Links between Markov models and multilayer perceptrons
The statistical use of a particular classic form of a connectionist system, the multilayer perceptron (MLP), is described in the context of the recognition of continuous speech....
Enhancing supervised learning algorithms via self-organization
A neural network processing scheme is proposed which utilizes a self-organizing Kohonen feature map as the front end to a feedforward classifier network. The results of a series...
Neural network ensembles
Several means for improving the performance and training of neural networks for classification are proposed. Crossvalidation is used as a tool for optimizing network parameters ...
Publication Info
- Year: 2002
- Type: article
- Volume: 2
- Pages: 1812-1817
- Citations: 1
- Access: Closed
Identifiers
- DOI: 10.1109/icnn.1993.298832