Marginal Likelihood from the Gibbs Output

Siddhartha Chib
Journal of the American Statistical Association
Vol. 90, No. 432 (Dec., 1995), pp. 1313-1321
DOI: 10.2307/2291521
Stable URL: http://www.jstor.org/stable/2291521
Page Count: 9

Abstract

In the context of Bayes estimation via Gibbs sampling, with or without data augmentation, a simple approach is developed for computing the marginal density of the sample data (marginal likelihood) given parameter draws from the posterior distribution. Consequently, Bayes factors for model comparisons can be routinely computed as a by-product of the simulation. Hitherto, this calculation has proved extremely challenging. Our approach exploits the fact that the marginal density can be expressed as the prior times the likelihood function over the posterior density. This simple identity holds for any parameter value. An estimate of the posterior density is shown to be available if all complete conditional densities used in the Gibbs sampler have closed-form expressions. To improve accuracy, the posterior density is estimated at a high density point, and the numerical standard error of the resulting estimate is derived. The ideas are applied to probit regression and finite mixture models.
