Abstract

The statistical use of a particular classic form of a connectionist system, the multilayer perceptron (MLP), is described in the context of the recognition of continuous speech. A discriminant hidden Markov model (HMM) is defined, and it is shown how a particular MLP with contextual and extra feedback input units can be considered as a general form of such a Markov model. A link is established between these discriminant HMMs, trained along the Viterbi algorithm, and any other approach based on least-mean-square minimization of an error function (LMSE). It is shown theoretically and experimentally that the outputs of the MLP (when trained along the LMSE or the entropy criterion) approximate the probability distribution over output classes conditioned on the input, i.e. the maximum a posteriori probabilities. Results of a series of speech recognition experiments are reported. The possibility of embedding MLPs into HMMs is described. Relations with other recurrent networks are also explained.
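
The central claim that LMSE-trained MLP outputs approximate class posteriors can be illustrated with a small numerical sketch. The following example is not taken from the paper; the data, architecture, and hyperparameters are illustrative assumptions. A one-hidden-layer MLP is trained by least-mean-square error on one-hot targets for two unit-variance Gaussian classes centred at -1 and +1, for which the true posterior is known analytically: P(y=1|x) = sigmoid(2x).

```python
# Minimal sketch (not the authors' code): an MLP trained with LMSE on one-hot
# class targets produces outputs that approximate P(class | input).
# All names, data, and hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Two equiprobable Gaussian classes; the analytic posterior is sigmoid(2x).
n = 4000
y = rng.integers(0, 2, n)                          # class labels 0/1
x = rng.normal(np.where(y == 1, 1.0, -1.0), 1.0).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer, single sigmoid output, full-batch gradient descent on MSE.
h, lr = 16, 1.0
W1 = rng.normal(0, 0.5, (1, h)); b1 = np.zeros(h)
W2 = rng.normal(0, 0.5, (h, 1)); b2 = np.zeros(1)

for _ in range(5000):
    a1 = np.tanh(x @ W1 + b1)                      # hidden activations
    out = sigmoid(a1 @ W2 + b2)                    # network output in (0, 1)
    err = out - y.reshape(-1, 1)                   # LMSE gradient signal
    d_out = err * out * (1 - out)                  # backprop through sigmoid
    d_a1 = (d_out @ W2.T) * (1 - a1 ** 2)          # backprop through tanh
    W2 -= lr * a1.T @ d_out / n; b2 -= lr * d_out.mean(0)
    W1 -= lr * x.T @ d_a1 / n;   b1 -= lr * d_a1.mean(0)

# Compare the trained network's output with the analytic posterior P(y=1 | x).
xs = np.linspace(-3, 3, 7).reshape(-1, 1)
net = sigmoid(np.tanh(xs @ W1 + b1) @ W2 + b2).ravel()
true = sigmoid(2 * xs).ravel()
for xv, p_net, p_true in zip(xs.ravel(), net, true):
    print(f"x={xv:+.1f}  MLP output={p_net:.3f}  P(y=1|x)={p_true:.3f}")
```

After training, the printed MLP outputs should track the analytic posterior closely; it is this property that allows MLP outputs to be used as estimators of local probabilities when embedding MLPs into HMMs.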

Keywords

Hidden Markov model, Viterbi algorithm, Pattern recognition, Computer science, Artificial intelligence, Maximum-entropy Markov model, Multilayer perceptron, Discriminant, Connectionism, Perceptron, Markov chain, Artificial neural network, Markov model, Markov process, Speech recognition, Entropy, Context, Maximum a posteriori estimation, Machine learning, Mathematics, Variable-order Markov model, Maximum likelihood, Statistics

Publication Info

Year: 1990
Type: Article
Volume: 12
Issue: 12
Pages: 1167-1178
Citations: 340
Access: Closed

Citation Metrics

340 citations (OpenAlex)

Cite This

H. Bourlard, C. Wellekens (1990). Links between Markov models and multilayer perceptrons. IEEE Transactions on Pattern Analysis and Machine Intelligence, 12(12), 1167-1178. https://doi.org/10.1109/34.62605

Identifiers

DOI
10.1109/34.62605