Marginal Likelihood from the Gibbs Output
Journal of the American Statistical Association
Vol. 90, No. 432 (Dec., 1995), pp. 1313-1321
Stable URL: http://www.jstor.org/stable/2291521
Page Count: 9
In the context of Bayes estimation via Gibbs sampling, with or without data augmentation, a simple approach is developed for computing the marginal density of the sample data (marginal likelihood) given parameter draws from the posterior distribution. Consequently, Bayes factors for model comparisons can be routinely computed as a by-product of the simulation. Hitherto, this calculation has proved extremely challenging. Our approach exploits the fact that the marginal density can be expressed as the prior times the likelihood function over the posterior density. This simple identity holds for any parameter value. An estimate of the posterior density is shown to be available if all complete conditional densities used in the Gibbs sampler have closed-form expressions. To improve accuracy, the posterior density is estimated at a high-density point, and the numerical standard error of the resulting estimate is derived. The ideas are applied to probit regression and finite mixture models.
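The identity at the heart of the abstract, m(y) = p(y | θ*) p(θ*) / p(θ* | y), can be checked in a toy setting. The sketch below (all numerical values are hypothetical, chosen for illustration) uses a normal model with known variance and a conjugate normal prior, where the posterior ordinate and the marginal likelihood are both available in closed form, so the identity can be verified directly; in the paper's setting the posterior ordinate is instead estimated from the Gibbs output.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy model (hypothetical parameters): y_i ~ N(mu, sigma2) with sigma2 known,
# and a conjugate prior mu ~ N(mu0, tau2).
sigma2, mu0, tau2 = 1.0, 0.0, 4.0
y = rng.normal(0.5, np.sqrt(sigma2), size=50)
n, ybar = len(y), y.mean()

# Closed-form posterior for mu (normal-normal conjugacy).
post_var = 1.0 / (n / sigma2 + 1.0 / tau2)
post_mean = post_var * (n * ybar / sigma2 + mu0 / tau2)

# Evaluate log m(y) = log p(y | mu*) + log p(mu*) - log p(mu* | y)
# at a high-density point mu*; here we take the posterior mean.
mu_star = post_mean
log_lik = stats.norm.logpdf(y, mu_star, np.sqrt(sigma2)).sum()
log_prior = stats.norm.logpdf(mu_star, mu0, np.sqrt(tau2))
log_post = stats.norm.logpdf(mu_star, post_mean, np.sqrt(post_var))
log_marglik_identity = log_lik + log_prior - log_post

# Analytic marginal likelihood for comparison: integrating mu out gives
# y ~ N(mu0 * 1, sigma2 * I + tau2 * 1 1').
cov = sigma2 * np.eye(n) + tau2 * np.ones((n, n))
log_marglik_exact = stats.multivariate_normal.logpdf(y, mu0 * np.ones(n), cov)

print(log_marglik_identity, log_marglik_exact)
```

Because the identity holds at any parameter value, the two quantities agree up to floating-point error; evaluating at a high-density point matters only when the posterior ordinate must be estimated from simulation draws, as in the paper.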
Journal of the American Statistical Association © 1995 American Statistical Association