Abstract
Artificial neural net models have been studied for many years in the hope of achieving human-like performance in the fields of speech and image recognition. These models are composed of many nonlinear computational elements operating in parallel and arranged in patterns reminiscent of biological neural nets. Computational elements or nodes are connected via weights that are typically adapted during use to improve performance. There has been a recent resurgence in the field of artificial neural nets caused by new net topologies and algorithms, analog VLSI implementation techniques, and the belief that massive parallelism is essential for high-performance speech and image recognition. This paper provides an introduction to the field of artificial neural nets by reviewing six important neural net models that can be used for pattern classification. These nets are highly parallel building blocks that illustrate neural net components and design principles and can be used to construct more complex systems. In addition to describing these nets, a major emphasis is placed on exploring how some existing classification and clustering algorithms can be performed using simple neuron-like components. Single-layer nets can implement algorithms required by Gaussian maximum-likelihood classifiers and optimum minimum-error classifiers for binary patterns corrupted by noise. More generally, the decision regions required by any classification algorithm can be generated in a straightforward manner by three-layer feed-forward nets.
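The abstract's claim that a single-layer net can implement a Gaussian maximum-likelihood classifier can be sketched concretely: for Gaussian classes with a shared identity covariance, the maximum-likelihood rule reduces to a linear discriminant, which is exactly the weighted-sum-plus-bias each node in a single-layer net computes. The sketch below is illustrative only; the class means and dimensions are assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative class means (assumed, not from the paper).
means = np.array([[0.0, 0.0],   # class 0 mean
                  [3.0, 3.0]])  # class 1 mean

# With shared identity covariance and equal priors, node i computes
# w_i . x + b_i with w_i = mean_i and b_i = -|mean_i|^2 / 2; the class
# with the largest node output is the maximum-likelihood choice.
weights = means
biases = -0.5 * np.sum(means ** 2, axis=1)

def classify(x):
    """Return the index of the node with the largest activation."""
    return int(np.argmax(weights @ x + biases))

print(classify(np.array([0.5, 0.2])))  # near class-0 mean -> 0
print(classify(np.array([2.8, 3.1])))  # near class-1 mean -> 1
```

Each node here is one neuron-like element: a dot product of inputs with fixed weights plus a bias, followed by a competitive "pick the maximum" stage, which is the structure the paper attributes to single-layer classifier nets.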
Publication Info
- Year: 1987
- Type: article
- Volume: 4
- Issue: 2
- Pages: 4-22
- Citations: 7829
- Access: Closed
Identifiers
- DOI: 10.1109/massp.1987.1165576