Tavis Abrahamsen, Duke University
Convergence analysis of MCMC samplers for Bayesian linear mixed models with p > N

For the Bayesian version of the general linear mixed model (GLMM), it is common to assign conditionally conjugate priors to the unknown model parameters. This results in a posterior density that can be explored using a simple two-block Gibbs sampler. It has been shown that when the priors are proper and the X matrix has full column rank, the Markov chains underlying these Gibbs samplers are almost always geometrically ergodic. We generalize this result by allowing improper priors on the variance components and, more importantly, by removing all assumptions on the X matrix. We also analyze a Bayesian GLMM in which the standard multivariate normal prior on the fixed-effects coefficients is replaced with a Normal-Gamma shrinkage prior. In this case, our convergence results are for a hybrid sampler that uses both a deterministic step and a random scan step.
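To make the setup concrete, here is a minimal sketch of such a two-block Gibbs sampler, assuming the Gaussian model y = Xβ + Zu + ε with ε ~ N(0, σ²_e I), u ~ N(0, σ²_u I), a proper N(0, τ² I) prior on β, and inverse-gamma priors on both variance components; the function name, hyperparameters, and starting values are illustrative choices, not taken from the talk.

```python
import numpy as np

def two_block_gibbs(y, X, Z, n_iter=5000, tau2_beta=100.0,
                    a_e=2.0, b_e=1.0, a_u=2.0, b_u=1.0, seed=0):
    """Two-block Gibbs sampler for y = X beta + Z u + eps with
    eps ~ N(0, sig2_e I), u ~ N(0, sig2_u I), beta ~ N(0, tau2_beta I),
    and inverse-gamma priors on the variance components."""
    rng = np.random.default_rng(seed)
    N, p = X.shape
    q = Z.shape[1]
    W = np.hstack([X, Z])                      # joint design for theta = (beta, u)
    sig2_e, sig2_u = 1.0, 1.0                  # arbitrary starting values
    draws = np.empty((n_iter, p + q + 2))
    for t in range(n_iter):
        # Block 1: theta | variances, y is multivariate normal.
        prior_prec = np.concatenate([np.full(p, 1.0 / tau2_beta),
                                     np.full(q, 1.0 / sig2_u)])
        V_inv = W.T @ W / sig2_e + np.diag(prior_prec)
        m = np.linalg.solve(V_inv, W.T @ y / sig2_e)
        L = np.linalg.cholesky(V_inv)          # V_inv = L L'
        theta = m + np.linalg.solve(L.T, rng.standard_normal(p + q))
        # Block 2: variances | theta, y are independent inverse gammas.
        resid = y - W @ theta
        sig2_e = 1.0 / rng.gamma(a_e + N / 2.0, 1.0 / (b_e + resid @ resid / 2.0))
        u = theta[p:]
        sig2_u = 1.0 / rng.gamma(a_u + q / 2.0, 1.0 / (b_u + u @ u / 2.0))
        draws[t] = np.concatenate([theta, [sig2_e, sig2_u]])
    return draws
```

Because the prior on β is proper, the posterior precision matrix of (β, u) stays positive definite even when p > N and X lacks full column rank, which is exactly the regime the title highlights.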
For both of these models, we derive easily satisfied conditions guaranteeing geometric ergodicity for the corresponding MCMC algorithms. Geometric ergodicity plays a key role in establishing central limit theorems (CLTs) for MCMC-based estimators. Thus, our results are important from a practical standpoint since all of the standard methods of calculating valid asymptotic standard errors are based on the existence of a CLT.
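As a concrete instance of this practical point, here is a minimal batch-means estimator of the Monte Carlo standard error (a sketch; the function name and default batch count are arbitrary choices, not from the talk). Estimators of this type are asymptotically valid precisely when the chain satisfies a CLT, which geometric ergodicity, together with a moment condition, delivers.

```python
import numpy as np

def batch_means_se(x, n_batches=30):
    """Standard error of the MCMC sample mean via nonoverlapping batch means."""
    b = len(x) // n_batches                     # batch size; any remainder is dropped
    means = np.asarray(x)[:b * n_batches].reshape(n_batches, b).mean(axis=1)
    var_hat = b * means.var(ddof=1)             # estimates the asymptotic variance in the CLT
    return np.sqrt(var_hat / (b * n_batches))
```

For example, batch_means_se(draws[:, -2]) would give a standard error for the posterior-mean estimate of σ²_e from the sampler sketched above.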