Abstract
Ever-increasing computational power, along with ever-more sophisticated statistical computing techniques, is making it possible to fit ever-more complex statistical models. Among the more computationally intensive methods, the Gibbs sampler is popular because of its simplicity and power to effectively generate samples from a high-dimensional probability distribution. Despite its simple implementation and description, however, the Gibbs sampler is criticized for its sometimes slow convergence, especially when it is used to fit highly structured complex models. Here we present partially collapsed Gibbs sampling strategies that improve convergence by capitalizing on a set of functionally incompatible conditional distributions. Such incompatibility generally is avoided in the construction of a Gibbs sampler, because the resulting convergence properties are not well understood. We introduce three basic tools (marginalization, permutation, and trimming) that allow us to transform a Gibbs sampler into a partially collapsed Gibbs sampler with known stationary distribution and faster convergence.
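To make the idea concrete, the sketch below contrasts an ordinary two-step Gibbs sampler with a collapsed variant for a toy bivariate normal target. The target, the correlation value, and the function names are assumptions chosen purely for illustration; this is not the samplers analyzed in the paper, only a hint at how reducing the conditioning in one step lowers autocorrelation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target (an assumption for illustration, not from the paper):
# (X, Y) bivariate normal, zero means, unit variances, correlation rho.
rho = 0.99

def ordinary_gibbs(n_iter, x0=0.0, y0=0.0):
    """Standard two-step Gibbs sampler: draw X | Y, then Y | X."""
    x, y = x0, y0
    draws = np.empty((n_iter, 2))
    for t in range(n_iter):
        x = rng.normal(rho * y, np.sqrt(1 - rho**2))  # X | Y
        y = rng.normal(rho * x, np.sqrt(1 - rho**2))  # Y | X
        draws[t] = x, y
    return draws

def collapsed_gibbs(n_iter):
    """Collapsed variant: Y is integrated out of the X-update, so X is
    drawn from its marginal and only the Y-step still conditions."""
    draws = np.empty((n_iter, 2))
    for t in range(n_iter):
        x = rng.normal(0.0, 1.0)                      # X (Y marginalized out)
        y = rng.normal(rho * x, np.sqrt(1 - rho**2))  # Y | X
        draws[t] = x, y
    return draws

# Lag-1 autocorrelation of X illustrates the gain: near rho**2 for the
# ordinary sampler, near zero for the collapsed one.
for name, s in [("ordinary", ordinary_gibbs(20000)),
                ("collapsed", collapsed_gibbs(20000))]:
    x = s[:, 0]
    acf1 = np.corrcoef(x[:-1], x[1:])[0, 1]
    print(f"{name:9s} lag-1 ACF of X: {acf1:.3f}")
```

Loosely speaking, marginalization replaces a step's conditional with one that integrates a component out, trimming removes the draw that thereby becomes redundant, and permutation reorders the steps so the reduced sampler retains the target as its stationary distribution; the toy above is only meant to suggest why such reduced conditioning shortens autocorrelation times.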
| Original language | English |
| --- | --- |
| Pages (from-to) | 790-796 |
| Number of pages | 7 |
| Journal | Journal of the American Statistical Association |
| Volume | 103 |
| Issue number | 482 |
| DOIs | |
| Publication status | Published - 2008 Jun |
Bibliographical note
Funding Information: David A. van Dyk is Professor, Department of Statistics, University of California Irvine, Irvine, CA 92697 (E-mail: [email protected]). Taeyoung Park is Assistant Professor, Department of Statistics, University of Pittsburgh, Pittsburgh, PA 15260 (E-mail: [email protected]). This project was supported in part by National Science Foundation grants DMS-01-04129, DMS-04-38240, and DMS-04-06085 and by National Aeronautics and Space Administration contracts NAS8-39073 and NAS8-03060 (CXC).
All Science Journal Classification (ASJC) codes
- Statistics and Probability
- Statistics, Probability and Uncertainty