Abstract

A formal Bayesian analysis of a mixture model usually leads to intractable calculations, since the posterior distribution takes into account all the partitions of the sample. We present approximation methods which evaluate the posterior distribution and Bayes estimators by Gibbs sampling, relying on the missing data structure of the mixture model. The data augmentation method is shown to converge geometrically, since a duality principle transfers properties from the discrete missing data chain to the parameters. The fully conditional Gibbs alternative is shown to be ergodic and geometric convergence is established in the normal case. We also consider non-informative approximations associated with improper priors, assuming that the sample corresponds exactly to a k-component mixture.
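The data-augmentation scheme described above alternates between sampling the missing component allocations given the current parameters and sampling the parameters given the completed data. A minimal sketch for a two-component normal mixture with known unit variances is given below; the prior choices (normal priors on the means, a uniform prior on the weight) and all variable names are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate data from a two-component normal mixture:
# weight 0.4 on N(-2, 1), weight 0.6 on N(2, 1)
n = 200
z_true = rng.random(n) < 0.4
x = np.where(z_true, rng.normal(-2.0, 1.0, n), rng.normal(2.0, 1.0, n))

def gibbs_mixture(x, iters=2000, burn=500):
    """Data-augmentation Gibbs sampler for a 2-component normal mixture
    with known unit variances, N(0, 10) priors on the means and a
    uniform Beta(1, 1) prior on the mixing weight (illustrative choices)."""
    n = len(x)
    mu = np.array([-1.0, 1.0])   # initial component means
    p = 0.5                      # initial weight of component 0
    draws = []
    for t in range(iters):
        # Step 1: sample the missing allocations z | mu, p, x
        w0 = p * np.exp(-0.5 * (x - mu[0]) ** 2)
        w1 = (1.0 - p) * np.exp(-0.5 * (x - mu[1]) ** 2)
        z = rng.random(n) < w1 / (w0 + w1)   # True -> component 1
        # Step 2: sample the parameters | z, x (conjugate updates)
        for k, mask in enumerate([~z, z]):
            nk = mask.sum()
            var = 1.0 / (nk + 1.0 / 10.0)     # posterior variance of mu_k
            mean = var * x[mask].sum()        # posterior mean of mu_k
            mu[k] = rng.normal(mean, np.sqrt(var))
        p = rng.beta(1 + (~z).sum(), 1 + z.sum())
        if t >= burn:
            draws.append((mu.copy(), p))
    return draws

draws = gibbs_mixture(x)
mu_hat = np.mean([d[0] for d in draws], axis=0)
print(mu_hat)  # posterior mean estimates, roughly (-2, 2) up to label switching
```

The two-block structure mirrors the duality the abstract mentions: the allocation vector z moves on a finite state space, and the parameter draws inherit their convergence behaviour from that discrete chain.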

Keywords

Bayesian probability, Mathematics, Statistics, Sampling (signal processing), Estimation, Bayes estimator, Computer science, Applied mathematics, Engineering


Publication Info

Year: 1994
Type: article
Volume: 56
Issue: 2
Pages: 363-375
Citations: 904
Access: Closed

Cite This

Jean Diebolt, Christian P. Robert (1994). Estimation of Finite Mixture Distributions Through Bayesian Sampling. Journal of the Royal Statistical Society: Series B (Methodological), 56(2), 363-375. https://doi.org/10.1111/j.2517-6161.1994.tb01985.x

Identifiers

DOI
10.1111/j.2517-6161.1994.tb01985.x