Abstract
The authors present a method that incorporates a priori knowledge into the training of recurrent neural networks. This a priori knowledge can be interpreted as hints about the problem to be learned; the hints are encoded as rules, which are then inserted into the neural network. The authors demonstrate the approach by training recurrent neural networks with inserted rules to recognize regular languages from grammatical string examples. Because the recurrent networks have second-order connections, rule insertion is a straightforward mapping of rules into weights and neurons. Simulations show that training recurrent networks with different amounts of partial knowledge to recognize simple grammars improves training times by orders of magnitude, even when only a small fraction of all transitions are inserted as rules. In addition, there appears to be no loss in generalization performance.
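The abstract's claim that "rule insertion is a straightforward mapping of rules into weights and neurons" can be illustrated with a minimal sketch. Assuming a sigmoid second-order network of the form S_i(t+1) = g(Σ_jk W[i,j,k]·S_j(t)·I_k(t)), a known DFA transition δ(src, sym) = dst is programmed by setting a large positive weight toward the target state neuron and a large negative weight on the source state neuron. The class name, the strength parameter H, and the initialization details below are illustrative, not taken from the paper:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SecondOrderRNN:
    """Second-order recurrent network:
    S_i(t+1) = g( sum_{j,k} W[i,j,k] * S_j(t) * I_k(t) ),
    where S is the state-neuron vector and I is a one-hot input symbol."""

    def __init__(self, n_states, n_symbols, scale=0.1, seed=0):
        rng = np.random.default_rng(seed)
        # Small random weights; inserted rules overwrite selected entries.
        self.W = rng.uniform(-scale, scale, (n_states, n_states, n_symbols))

    def insert_rule(self, src, sym, dst, H=6.0):
        """Program the DFA transition delta(src, sym) = dst as a rule."""
        # Large positive weight drives the target state neuron on.
        self.W[dst, src, sym] = +H
        # Large negative weight drives the source state neuron off
        # (skipped for self-loops, where src stays on).
        if dst != src:
            self.W[src, src, sym] = -H

    def step(self, S, sym):
        """Advance the state vector by one input symbol."""
        I = np.zeros(self.W.shape[2])
        I[sym] = 1.0
        return sigmoid(np.einsum('ijk,j,k->i', self.W, S, I))
```

As a usage example, a two-state parity automaton (symbol 1 flips the state, symbol 0 keeps it) can be fully inserted, after which the untrained network already tracks parity with near-saturated state neurons:

```python
net = SecondOrderRNN(n_states=2, n_symbols=2)
for src, sym, dst in [(0, 1, 1), (1, 1, 0), (0, 0, 0), (1, 0, 1)]:
    net.insert_rule(src, sym, dst)

S = np.array([1.0, 0.0])      # start in state 0 ("even")
S = net.step(S, 1)            # one '1' -> odd:  S[1] near 1, S[0] near 0
S = net.step(S, 1)            # two '1's -> even: S[0] near 1, S[1] near 0
```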
Publication Info
- Year: 2003
- Type: article
- Volume: 1
- Pages: 13-22
- Citations: 28
- Access: Closed
Identifiers
- DOI: 10.1109/nnsp.1992.253712