Abstract

The size of the time intervals between events conveys information essential for numerous sequential tasks such as motor control and rhythm detection. While hidden Markov models tend to ignore this information, recurrent neural networks (RNNs) can in principle learn to make use of it. We focus on long short-term memory (LSTM) because it usually outperforms other RNNs. Surprisingly, LSTM augmented by "peephole connections" from its internal cells to its multiplicative gates can learn the fine distinction between sequences of spikes separated by either 50 or 49 discrete time steps, without the help of any short training exemplars. Without external resets or teacher forcing, and without loss of performance on tasks reported earlier, our LSTM variant also learns to generate very stable sequences of highly nonlinear, precisely timed spikes. This makes LSTM a promising approach for real-world tasks that require networks to time and count.
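
The "peephole connections" mentioned in the abstract give each multiplicative gate direct access to the internal cell state, so the gates can open and close at precise points within a delay. As a rough sketch (our own NumPy illustration, not code from the paper; the function and parameter names are hypothetical), one step of a peephole LSTM cell might look like:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def peephole_lstm_step(x, h_prev, c_prev, W, P, b):
    """One step of an LSTM cell with peephole connections (illustrative).

    W: weight matrices over [x; h_prev] for gates 'i', 'f', 'o' and cell input 'g'
    P: diagonal peephole weight vectors letting the gates "see" the cell state
    b: bias vectors
    """
    z = np.concatenate([x, h_prev])
    # Input and forget gates peek at the previous cell state c_prev ...
    i = sigmoid(W['i'] @ z + P['i'] * c_prev + b['i'])
    f = sigmoid(W['f'] @ z + P['f'] * c_prev + b['f'])
    g = np.tanh(W['g'] @ z + b['g'])      # candidate cell input (no peephole)
    c = f * c_prev + i * g                # updated internal cell state
    # ... while the output gate peeks at the freshly updated state c.
    o = sigmoid(W['o'] @ z + P['o'] * c + b['o'])
    h = o * np.tanh(c)                    # gated cell output
    return h, c
```

Letting the output gate see the updated cell state is what allows the cell to react, e.g. emit a spike, at the exact step its internal counter reaches a learned threshold.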

Keywords

Computer science, Recurrent neural network, Focus (optics), Hidden Markov model, Artificial intelligence, Multiplicative function, Forcing (mathematics), Speech recognition, Machine learning, Artificial neural network, Mathematics

Related Publications

Long Short-Term Memory

Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We brief...

1997 · Neural Computation · 90,535 citations
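
The "decaying error backflow" in this teaser can be made concrete with a toy computation (our own illustration, not from either paper): in a simple recurrent unit, the backpropagated error is multiplied by w · f'(a) at every time step, so over long intervals it shrinks (or blows up) geometrically, whereas an LSTM memory cell with its gate near 1 carries error back with a factor of roughly 1 per step.

```python
import math

# Error flowing back through a 1-unit vanilla recurrent net scales by
# w * tanh'(a) per step; even in the best case tanh'(0) = 1, it decays
# geometrically whenever |w| < 1.  (Weight value chosen arbitrarily.)
w, steps, grad = 0.9, 100, 1.0
for _ in range(steps):
    grad *= w * (1.0 - math.tanh(0.0) ** 2)   # = w in this best case
print(f"error signal after {steps} steps: {grad:.2e}")  # ~2.66e-05
```

The LSTM cell's constant error carousel sidesteps this decay: the cell state's gated self-connection has an effective factor near 1, so the error signal survives delays like the 50-step spike intervals in the paper above.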

Publication Info

Year: 2000
Type: article
Pages: 189-194, vol. 3
Citations: 627
Access: Closed

Citation Metrics

OpenAlex: 627
Influential: 27
CrossRef: 380

Cite This

Felix A. Gers, Jürgen Schmidhuber (2000). Recurrent nets that time and count. Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000): Neural Computing: New Challenges and Perspectives for the New Millennium, 189-194, vol. 3. https://doi.org/10.1109/ijcnn.2000.861302

Identifiers

DOI: 10.1109/ijcnn.2000.861302
