Abstract

In the past ten years there has been a dramatic increase of interest in the Bayesian analysis of finite mixture models. This is primarily because of the emergence of Markov chain Monte Carlo (MCMC) methods. While MCMC provides a convenient way to draw inference from complicated statistical models, there are many, perhaps underappreciated, problems associated with the MCMC analysis of mixtures. The problems are mainly caused by the nonidentifiability of the components under symmetric priors, which leads to so-called label switching in the MCMC output. This means that ergodic averages of component-specific quantities will be identical and thus useless for inference. We review the solutions to the label switching problem, such as artificial identifiability constraints, relabelling algorithms and label-invariant loss functions. We also review various MCMC sampling schemes that have been suggested for mixture models and discuss posterior sensitivity to prior specification.
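The label switching phenomenon described above, and the simplest remedy the abstract mentions (an artificial identifiability constraint), can be illustrated with a small sketch. The setup below is hypothetical and not taken from the paper: it fakes MCMC output for a two-component mixture with means near 0 and 5 whose labels swap at random, then relabels each draw by ordering the means.

```python
# Illustrative sketch of label switching, assuming a two-component mixture
# with well-separated means near 0 and 5. Values and setup are hypothetical.
import random

random.seed(0)

# Pretend MCMC output: each draw gives the two component means, but the
# component labels flip at random (the symmetry behind label switching).
draws = []
for _ in range(10_000):
    mu = [random.gauss(0.0, 0.1), random.gauss(5.0, 0.1)]
    if random.random() < 0.5:  # symmetric posterior: labels swap freely
        mu.reverse()
    draws.append(mu)

# Naive ergodic averages: both components average to about 2.5, so the
# component-specific estimates are identical and useless for inference.
naive = [sum(d[k] for d in draws) / len(draws) for k in (0, 1)]

# Artificial identifiability constraint: relabel each draw so mu_1 < mu_2.
relabelled = [sorted(d) for d in draws]
constrained = [sum(d[k] for d in relabelled) / len(relabelled) for k in (0, 1)]

print(naive)        # both averages close to 2.5
print(constrained)  # close to 0.0 and 5.0
```

The constraint works here only because the means are well separated; the paper discusses why such constraints can fail in harder problems, motivating relabelling algorithms and label-invariant loss functions instead.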

Keywords

Markov chain Monte Carlo; Identifiability; Prior probability; Computer science; Bayesian inference; Bayesian probability; Gibbs sampling; Markov chain; Algorithm; Inference; Artificial intelligence; Mathematics; Machine learning

Publication Info

Year: 2005
Type: article
Volume: 20
Issue: 1
Citations: 671
Access: Closed

Citation Metrics

671 citations (OpenAlex)

Cite This

Ajay Jasra, Chris Holmes, David A. Stephens (2005). Markov Chain Monte Carlo Methods and the Label Switching Problem in Bayesian Mixture Modeling. Statistical Science, 20(1). https://doi.org/10.1214/088342305000000016

Identifiers

DOI: 10.1214/088342305000000016