Abstract
For neural networks with a wide class of weight priors, it can be shown that in the limit of an infinite number of hidden units, the prior over functions tends to a Gaussian process. In this article, analytic forms are derived for the covariance functions of the Gaussian processes corresponding to networks with sigmoidal and Gaussian hidden units. This allows predictions to be made efficiently using networks with an infinite number of hidden units and shows, somewhat paradoxically, that it may be easier to carry out Bayesian prediction with infinite networks than with finite ones.
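Since the abstract states that the covariance function for sigmoidal hidden units has an analytic form, the sketch below illustrates how prediction with such an infinite network reduces to standard Gaussian process regression. This is a minimal sketch assuming the arcsin-form "erf" kernel reported for sigmoidal units; the function names (`erf_kernel`, `gp_predict`) and the hyperparameter values are illustrative assumptions, not the paper's own code.

```python
# Sketch: GP regression with the infinite-network "erf" covariance function.
# The kernel form follows the analytic expression for sigmoidal (erf) hidden
# units; helper names and hyperparameters here are assumptions for illustration.
import numpy as np

def erf_kernel(X1, X2, sigma_w=10.0, sigma_b=10.0):
    """Covariance of an infinite network of erf (sigmoidal) hidden units.

    Inputs are augmented with a bias input; sigma_w and sigma_b are assumed
    prior variances for the input-to-hidden weights and biases.
    """
    A1 = np.hstack([np.ones((X1.shape[0], 1)), X1])  # prepend bias input
    A2 = np.hstack([np.ones((X2.shape[0], 1)), X2])
    Sigma = np.diag([sigma_b] + [sigma_w] * X1.shape[1])
    cross = 2.0 * A1 @ Sigma @ A2.T
    n1 = 1.0 + 2.0 * np.sum(A1 @ Sigma * A1, axis=1)  # row-wise x~' Sigma x~
    n2 = 1.0 + 2.0 * np.sum(A2 @ Sigma * A2, axis=1)
    return (2.0 / np.pi) * np.arcsin(cross / np.sqrt(np.outer(n1, n2)))

def gp_predict(X_train, y_train, X_test, noise=1e-2):
    """Standard GP regression mean and variance under the erf kernel."""
    K = erf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_star = erf_kernel(X_test, X_train)
    mean = K_star @ np.linalg.solve(K, y_train)
    var = np.diag(erf_kernel(X_test, X_test)) - np.sum(
        K_star * np.linalg.solve(K, K_star.T).T, axis=1)
    return mean, var

# Toy usage: regress a 1-D function under the infinite-network prior.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(20, 1))
y = np.tanh(X[:, 0]) + 0.1 * rng.standard_normal(20)
X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
mu, var = gp_predict(X, y, X_test)
```

Because the covariance function is available in closed form, the cost of prediction is set by the number of training points, not the (infinite) number of hidden units, which is the efficiency the abstract refers to.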
Related Publications
- Rectified Linear Units Improve Restricted Boltzmann Machines: Restricted Boltzmann machines were developed using binary stochastic hidden units. These can be generalized by replacing each binary unit by an infinite number of copies that al...
- A Fast Learning Algorithm for Deep Belief Nets: We show how to use “complementary priors” to eliminate the explaining-away effects that make inference difficult in densely connected belief nets that have many hidden layers. U...
- Gaussian regression and optimal finite dimensional linear models: The problem of regression under Gaussian assumptions is treated generally. The relationship between Bayesian prediction, regularization and smoothing is elucidated. The ideal re...
- Finite-Dimensional Approximation of Gaussian Processes: Gaussian process (GP) prediction suffers from O(n³) scaling with the data set size n. By using a finite-dimensional basis to approximate the GP predictor, the computational comp...
- Practical Bayesian Density Estimation Using Mixtures of Normals: Mixtures of normals provide a flexible model for estimating densities in a Bayesian framework. There are some difficulties with this model, however. First, standard ref...
Publication Info
- Year: 1998
- Type: article
- Volume: 10
- Issue: 5
- Pages: 1203-1216
- Citations: 150
- Access: Closed
Identifiers
- DOI: 10.1162/089976698300017412