Using the Observations to Estimate the Prior Distribution

Journal of the Royal Statistical Society Series B (Statistical Methodology), 1965 · 4 citations

Abstract

A parameter μ is to be estimated from an unbiased, unit-variance measurement x. The prior distribution of μ is symmetrical about zero with an unknown scale parameter σ. It is claimed that the observation x itself gives information about σ, contained in the likelihood distribution p(dx | σ). An estimator μ̂ = γx is found, based on the posterior distribution of μ conditional on the most probable value of σ. For a prior distribution of arbitrary shape, γ is zero when x² ≤ 1 and approaches 1 − x⁻² when x² ≥ 1. A minimum mean-square estimator is also found, based on an estimate of the prior variance; independently of the shape of the prior distribution, it gives the same result as the first estimator gives for a normal prior.
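
The shrinkage rule quoted in the abstract can be made concrete with a small numerical sketch. The paper treats priors of arbitrary shape; the sketch below assumes only the special case of a normal prior μ ~ N(0, σ²) with x | μ ~ N(μ, 1), so that x is marginally N(0, 1 + σ²). The value of σ² most probable given the single observation x is then max(0, x² − 1), and substituting it into the normal posterior mean yields μ̂ = γx with γ = max(0, 1 − x⁻²): zero for x² ≤ 1 and approaching 1 − x⁻² as x² grows, in line with the behaviour stated above. The code and function names are illustrative only and are not taken from the paper.

```python
import numpy as np


def gamma_factor(x):
    """Shrinkage factor gamma in mu_hat = gamma * x (normal-prior sketch).

    Assumptions (illustrative, not the paper's general case):
      x | mu ~ N(mu, 1)  and  mu ~ N(0, sigma^2), so marginally
      x ~ N(0, 1 + sigma^2).
    The sigma^2 most probable given the single observation x is
    max(0, x^2 - 1); plugging it into the normal posterior mean gives
    gamma = max(0, 1 - 1/x^2): zero for x^2 <= 1, approaching 1 - x^{-2}
    for large x^2.
    """
    x = np.asarray(x, dtype=float)
    v_hat = np.maximum(x**2, 1.0)   # most probable marginal variance 1 + sigma_hat^2
    return 1.0 - 1.0 / v_hat        # gamma = sigma_hat^2 / (1 + sigma_hat^2)


def estimate_mu(x):
    """Point estimate mu_hat = gamma(x) * x."""
    return gamma_factor(x) * x


if __name__ == "__main__":
    for x in (0.5, 1.0, 2.0, 5.0):
        g = float(gamma_factor(x))
        print(f"x = {x:4.1f}   gamma = {g:.3f}   mu_hat = {g * x:.3f}")
```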

Keywords

Mathematics, Statistics, Estimator, Prior probability, Bias of an estimator, Minimum-variance unbiased estimator, Distribution (mathematics), Variance (accounting), Scale parameter, Shape parameter, Normal distribution, Scale (ratio), Posterior predictive distribution, Trimmed estimator, Bayesian linear regression, Mathematical analysis, Physics, Bayesian probability, Bayesian inference

Publication Info

Year: 1965
Type: Article
Volume: 27
Issue: 1
Pages: 17-27
Citations: 4
Access: Closed

Citation Metrics

Citations (OpenAlex): 4

Cite This

M. Clutton-Brock (1965). Using the Observations to Estimate the Prior Distribution. Journal of the Royal Statistical Society Series B (Statistical Methodology), 27(1), 17-27. https://doi.org/10.1111/j.2517-6161.1965.tb00582.x

Identifiers

DOI: 10.1111/j.2517-6161.1965.tb00582.x