We derive conditions that guarantee the stability of Markov chains in the presence of stochastic model error. To do this, we adapt existing theory on the convergence of perturbed stable Markov chains under roundoff error. We apply the results to Markov Chain Monte Carlo (MCMC) algorithms, which are widely used to construct a stochastic process whose limiting distribution is the unknown distribution of interest in a given problem. For example, they are used for Bayesian calibration, which is the problem of determining a distribution for the parameters of a physical model from noisy observations of the model's output using a Bayesian approach. In practice, errors in the computer simulations affect the MCMC samples and, in particular, can have a significant effect on convergence and accuracy. Finally, we also model the error of perturbed MCMC samples using a time series model.
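The perturbed-MCMC setting described above can be illustrated with a minimal sketch (a hypothetical example, not the algorithms analyzed in the thesis): a Metropolis sampler in which every evaluation of the log-target carries stochastic noise, modeling inexact computer simulations of the forward model. The target here is assumed to be a standard normal, and the noise level `sigma_err` is an illustrative parameter.

```python
import math
import random

def log_target(x):
    # Log-density of the target distribution (standard normal, up to a constant).
    return -0.5 * x * x

def perturbed_log_target(x, sigma_err):
    # Each evaluation is corrupted by stochastic model error, so the
    # chain only targets an approximation of the intended distribution.
    return log_target(x) + random.gauss(0.0, sigma_err)

def perturbed_metropolis(n_steps, sigma_err=0.05, step=1.0, seed=0):
    # Random-walk Metropolis whose accept/reject decision uses the
    # perturbed (noisy) log-target evaluations.
    random.seed(seed)
    x = 0.0
    samples = []
    for _ in range(n_steps):
        y = x + random.gauss(0.0, step)
        log_alpha = (perturbed_log_target(y, sigma_err)
                     - perturbed_log_target(x, sigma_err))
        if math.log(random.random()) < log_alpha:
            x = y
        samples.append(x)
    return samples

samples = perturbed_metropolis(20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

For small `sigma_err` the empirical mean and variance remain close to those of the unperturbed target, which is the kind of stability the perturbation theory is meant to guarantee.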
Thesis advisor: Estep, Donald