Sankhya: The Indian Journal of Statistics

2006, Volume 68, Pt. 4, 542--553

Bayesian Inference via a Minimization Rule

S.G. Walker, University of Kent, Canterbury, UK

SUMMARY. In this paper, we consider the Bayesian posterior distribution as the solution to a minimization rule, first observed by Zellner (1988). The expression to be minimized is a mixture of two pieces: one involving the prior distribution, which is minimized by the prior itself, and the other involving the data, which is minimized by the measure putting all its mass on the maximum likelihood estimator. From this perspective on the posterior distribution, Bayesian model selection and the search for an objective prior distribution can be viewed in a way which differs from the usual Bayesian approaches.
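Zellner's (1988) observation can be illustrated numerically. On a discrete parameter grid, the posterior minimizes the functional F(q) = KL(q, prior) - E_q[log likelihood] over probability vectors q, and the attained minimum equals minus the log marginal likelihood. The binomial grid below is a made-up toy example for illustration only; it is not taken from the paper.

```python
import numpy as np

# Toy setting (hypothetical, for illustration): a binomial likelihood with
# success probability theta restricted to a finite grid, under a uniform prior.
theta = np.linspace(0.05, 0.95, 19)        # grid of candidate parameter values
prior = np.full(theta.size, 1.0 / theta.size)
x, n = 7, 10                               # made-up data: 7 successes in 10 trials
loglik = x * np.log(theta) + (n - x) * np.log1p(-theta)

def zellner_F(q):
    """Zellner's functional: KL(q || prior) minus the expected log-likelihood."""
    return float(np.sum(q * (np.log(q) - np.log(prior))) - np.sum(q * loglik))

# The usual Bayes posterior, computed directly.
posterior = prior * np.exp(loglik)
evidence = posterior.sum()                 # marginal likelihood on the grid
posterior /= evidence

# Any other candidate distribution attains a strictly larger value of F.
perturbed = posterior.copy()
perturbed[0] += 0.05
perturbed /= perturbed.sum()

assert zellner_F(posterior) < zellner_F(prior)
assert zellner_F(posterior) < zellner_F(perturbed)
# At the minimizer, F equals minus the log marginal likelihood.
assert abs(zellner_F(posterior) + np.log(evidence)) < 1e-10
```

The identity F(q) = KL(q, posterior) - log m(x), where m(x) is the marginal likelihood, shows why the posterior is the unique minimizer: the KL term vanishes exactly at q = posterior.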

AMS (2000) subject classification. Primary 62F15.

Key words and phrases. Kullback-Leibler divergence, maximum likelihood estimator, minimization, model selection, objective prior, prior, posterior.
