Abstract

Discovering the structure inherent in a set of patterns is a fundamental aim of statistical inference or learning. One fruitful approach is to build a parameterized stochastic generative model, independent draws from which are likely to produce the patterns. For all but the simplest generative models, each pattern can be generated in exponentially many ways. It is thus intractable to adjust the parameters to maximize the probability of the observed patterns. We describe a way of finessing this combinatorial explosion by maximizing an easily computed lower bound on the probability of the observations. Our method can be viewed as a form of hierarchical self-supervised learning that may relate to the function of bottom-up and top-down cortical processing pathways.
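
The "easily computed lower bound" is the negative variational free energy: for a recognition distribution q(α|d) over the explanations α of a pattern d, log p(d) ≥ Σ_α q(α|d)[log p(α, d) − log q(α|d)], with equality when q matches the true posterior. One way the authors optimize both the generative and recognition models is their companion wake-sleep algorithm. The sketch below is a minimal single-hidden-layer illustration in NumPy, not the paper's implementation; the layer sizes, learning rate, and toy prototype patterns are invented for the example.

```python
# Minimal single-hidden-layer Helmholtz machine trained with
# wake-sleep. Illustrative sketch only: sizes, learning rate,
# and data are arbitrary choices, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample(p):
    # Draw binary units from their Bernoulli probabilities.
    return (rng.random(p.shape) < p).astype(float)

n_vis, n_hid, lr = 8, 4, 0.05

# Recognition (bottom-up) parameters: q(h | d).
R, r_bias = rng.normal(0, 0.1, (n_vis, n_hid)), np.zeros(n_hid)
# Generative (top-down) parameters: p(h) and p(d | h).
g_bias_h = np.zeros(n_hid)                           # prior over hidden units
G, g_bias_d = rng.normal(0, 0.1, (n_hid, n_vis)), np.zeros(n_vis)

def wake_sleep_step(d):
    global R, r_bias, G, g_bias_d, g_bias_h
    # --- Wake phase: recognize, then fit the generative model. ---
    h = sample(sigmoid(d @ R + r_bias))              # h ~ q(h | d)
    g_bias_h += lr * (h - sigmoid(g_bias_h))         # fit prior p(h)
    p_d = sigmoid(h @ G + g_bias_d)                  # reconstruction probs
    G += lr * np.outer(h, d - p_d)                   # fit p(d | h)
    g_bias_d += lr * (d - p_d)
    # --- Sleep phase: dream, then fit the recognition model. ---
    h_dream = sample(sigmoid(g_bias_h))              # h ~ p(h)
    d_dream = sample(sigmoid(h_dream @ G + g_bias_d))  # d ~ p(d | h)
    q_h = sigmoid(d_dream @ R + r_bias)
    R += lr * np.outer(d_dream, h_dream - q_h)       # fit q(h | d)
    r_bias += lr * (h_dream - q_h)

# Toy usage: patterns are noisy copies of two prototypes.
protos = np.array([[1, 1, 1, 1, 0, 0, 0, 0],
                   [0, 0, 0, 0, 1, 1, 1, 1]], dtype=float)
for step in range(5000):
    d = protos[rng.integers(2)].copy()
    flip = rng.integers(n_vis)
    d[flip] = 1 - d[flip]                            # one bit of noise
    wake_sleep_step(d)
```

In the wake phase the generative (top-down) weights learn to reconstruct the data from the recognized hidden state; in the sleep phase the recognition (bottom-up) weights learn to invert a "dream" sampled from the generative model. Each update is a purely local delta rule, which is what makes the bound cheap to optimize.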

Keywords

Generative grammar, Parameterized complexity, Generative model, Computer science, Set (abstract data type), Inference, Artificial intelligence, Machine learning, Statistical inference, Function (biology), Theoretical computer science, Mathematics, Algorithm

Publication Info

Year: 1995
Type: Article
Volume: 7
Issue: 5
Pages: 889-904
Citations: 1187 (OpenAlex)
Access: Closed

Cite This

Peter Dayan, Geoffrey E. Hinton, Radford M. Neal, & Richard S. Zemel (1995). The Helmholtz Machine. Neural Computation, 7(5), 889-904. https://doi.org/10.1162/neco.1995.7.5.889

Identifiers

DOI
10.1162/neco.1995.7.5.889