A Practical Bayesian Framework for Backpropagation Networks

1992 Neural Computation 2,841 citations

Abstract

A quantitative and practical Bayesian framework is described for learning of mappings in feedforward networks. The framework makes possible (1) objective comparisons between solutions using alternative network architectures, (2) objective stopping rules for network pruning or growing procedures, (3) objective choice of magnitude and type of weight decay terms or additive regularizers (for penalizing large weights, etc.), (4) a measure of the effective number of well-determined parameters in a model, (5) quantified estimates of the error bars on network parameters and on network output, and (6) objective comparisons with alternative learning and interpolation models such as splines and radial basis functions. The Bayesian "evidence" automatically embodies "Occam's razor," penalizing overflexible and overcomplex models. The Bayesian approach helps detect poor underlying assumptions in learning models. For learning models well matched to a problem, a good correlation between generalization ability and the Bayesian evidence is obtained.
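For orientation, here is a minimal numpy sketch (not the paper's code) of the evidence-framework quantities the abstract lists: the effective number of well-determined parameters, the evidence-based re-estimation of the weight-decay and noise hyperparameters, the log evidence with its Occam factor, and error bars on a prediction. It is computed for a linear-in-parameters toy model, where the Gaussian approximation the paper applies to backpropagation networks is exact; the feature map, toy data, and initial hyperparameter values are illustrative assumptions.

```python
# Sketch of MacKay's evidence-framework quantities for a toy linear model
# y = Phi @ w.  Feature map, data, and initial alpha/beta are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (illustrative only).
x = np.linspace(-1.0, 1.0, 30)
t = np.sin(3.0 * x) + 0.1 * rng.standard_normal(x.size)

# Polynomial design matrix with k basis functions (= k weights).
k = 8
Phi = np.vander(x, k, increasing=True)          # shape (N, k)
N = x.size

alpha, beta = 1.0, 25.0                         # prior precision, noise precision

for _ in range(20):
    # Posterior mode w_MP and Hessian A = beta * Phi^T Phi + alpha * I.
    H = beta * Phi.T @ Phi
    A = H + alpha * np.eye(k)
    w_mp = np.linalg.solve(A, beta * Phi.T @ t)

    E_W = 0.5 * w_mp @ w_mp                     # weight-decay (prior) term
    E_D = 0.5 * np.sum((t - Phi @ w_mp) ** 2)   # data misfit

    # Effective number of well-determined parameters:
    #   gamma = sum_i lambda_i / (lambda_i + alpha), lambda_i eigenvalues of H.
    lam = np.linalg.eigvalsh(H)
    gamma = np.sum(lam / (lam + alpha))

    # Evidence-based re-estimation of the hyperparameters.
    alpha = gamma / (2.0 * E_W)
    beta = (N - gamma) / (2.0 * E_D)

# Final fit at the converged hyperparameters.
H = beta * Phi.T @ Phi
A = H + alpha * np.eye(k)
w_mp = np.linalg.solve(A, beta * Phi.T @ t)
E_W = 0.5 * w_mp @ w_mp
E_D = 0.5 * np.sum((t - Phi @ w_mp) ** 2)

# Log evidence; the -0.5 * log det A term is the Occam factor that
# penalizes overflexible models.
_, logdetA = np.linalg.slogdet(A)
log_evidence = (-alpha * E_W - beta * E_D - 0.5 * logdetA
                + 0.5 * k * np.log(alpha) + 0.5 * N * np.log(beta)
                - 0.5 * N * np.log(2.0 * np.pi))

# Error bar on a prediction at x* = 0.5: var = 1/beta + phi^T A^{-1} phi.
phi_star = np.vander(np.array([0.5]), k, increasing=True)[0]
pred_var = 1.0 / beta + phi_star @ np.linalg.solve(A, phi_star)

print(f"gamma={gamma:.2f}  alpha={alpha:.3f}  beta={beta:.1f}  "
      f"log evidence={log_evidence:.1f}  error bar={np.sqrt(pred_var):.3f}")
```

Comparing the log evidence across alternative values of k (or across different basis sets) illustrates the objective model comparison the abstract describes: more flexible models pay a larger Occam penalty unless the extra flexibility is supported by the data.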

Keywords

Pruning, Occam, Computer science, Generalization, Variable-order Bayesian network, Backpropagation, Bayesian probability, Artificial intelligence, Bayesian network, Machine learning, Dynamic Bayesian network, Measure (data warehouse), Interpolation (computer graphics), Artificial neural network, Occam's razor, Algorithm, Mathematics, Bayesian inference, Data mining, Statistics

Related Publications

Multiple Imputation after 18+ Years

Abstract Multiple imputation was designed to handle the problem of missing data in public-use data bases where the data-base constructor and the ultimate user are distinct entit...

1996 Journal of the American Statistical Association 2846 citations

Pruning algorithms-a survey

A rule of thumb for obtaining good generalization in systems trained by examples is that one should use the smallest system that will fit the data. Unfortunately, it usually is ...

1993 IEEE Transactions on Neural Networks 1695 citations

Publication Info

Year: 1992
Type: article
Volume: 4
Issue: 3
Pages: 448-472
Citations: 2841
Access: Closed

Citation Metrics

Citations (OpenAlex): 2841

Cite This

David J. C. MacKay (1992). A Practical Bayesian Framework for Backpropagation Networks. Neural Computation, 4(3), 448-472. https://doi.org/10.1162/neco.1992.4.3.448

Identifiers

DOI
10.1162/neco.1992.4.3.448