Improving the learning speed of 2-layer neural networks by choosing initial values of the adaptive weights

1990 IJCNN International Joint Conference on Neural Networks · 1,376 citations

Abstract

The authors describe how a two-layer neural network can approximate any nonlinear function by forming a union of piecewise linear segments. A method is given for picking initial weights for the network to decrease training time. The authors have used the method to initialize adaptive weights over a large number of different training problems and have achieved major improvements in learning speed in every case. The improvement is best when a large number of hidden units is used with a complicated desired response. The authors have used the method to train the truck backer-upper and were able to decrease the training time from about two days to four hours.
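
The abstract does not reproduce the initialization rule itself, but the scheme it describes is commonly implemented roughly as follows: draw small random hidden-layer weights, rescale each hidden unit's weight vector to a magnitude of about 0.7·H^(1/n) (H hidden units, n inputs), and draw biases uniformly over the same range so that the units' linear segments are spread across the input region. The sketch below follows that standard formulation; the exact constants and ranges are assumptions taken from common implementations, not quoted from the paper.

```python
import numpy as np

def nguyen_widrow_init(n_inputs, n_hidden, rng=None):
    """Hidden-layer weight/bias initialization in the Nguyen-Widrow style.

    Returns (W, b) with W of shape (n_hidden, n_inputs) and b of shape (n_hidden,).
    The scale factor and uniform ranges follow the commonly used formulation
    of the method (an assumption, not taken verbatim from the paper).
    """
    rng = np.random.default_rng() if rng is None else rng

    # Scale factor: beta = 0.7 * H^(1/n), with H hidden units and n inputs.
    beta = 0.7 * n_hidden ** (1.0 / n_inputs)

    # Start from small random weights, then rescale each hidden unit's weight
    # vector so its Euclidean norm equals beta. This spreads the units'
    # active (approximately linear) regions across the input space.
    W = rng.uniform(-0.5, 0.5, size=(n_hidden, n_inputs))
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    W = beta * W / norms

    # Biases drawn uniformly in [-beta, beta] shift the segments so they
    # cover the input range instead of all crossing the origin.
    b = rng.uniform(-beta, beta, size=n_hidden)
    return W, b

if __name__ == "__main__":
    W, b = nguyen_widrow_init(n_inputs=2, n_hidden=20, rng=np.random.default_rng(0))
    print(W.shape, b.shape, np.linalg.norm(W, axis=1)[:3])  # norms ~= beta
```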

Keywords

Artificial neural network · Piecewise linear function · Computer science · Layer (electronics) · Artificial intelligence · Nonlinear system · Function · Activation function · Piecewise · Training · Algorithm · Control theory · Mathematics

Publication Info

Year: 1990
Type: Article
Pages: 21-26, vol. 3
Citations: 1376
Access: Closed

Citation Metrics

OpenAlex: 1376
Influential: 52
CrossRef: 731

Cite This

Derrick Nguyen, Bernard Widrow (1990). Improving the learning speed of 2-layer neural networks by choosing initial values of the adaptive weights. 1990 IJCNN International Joint Conference on Neural Networks, vol. 3, pp. 21-26. https://doi.org/10.1109/ijcnn.1990.137819

Identifiers

DOI: 10.1109/ijcnn.1990.137819
