October 21, 2013

Simulation III: Monte Carlo and Markov Chain Monte Carlo (Introduction to Statistical Computing)

Lecture 16: The Monte Carlo principle for numerical integrals: write your integral as an expectation, take a sample. Examples. Importance sampling: draw from a distribution other than the one you really want, then weight the sample values. Markov chain Monte Carlo for sampling from a distribution we do not completely know: the Metropolis algorithm. Gibbs sampling. Bayesian inference via MCMC.
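A minimal sketch of the first three ideas in that summary, not taken from the lecture itself: plain Monte Carlo integration, importance sampling with reweighting, and a random-walk Metropolis sampler. The particular integrand, target, proposal scale, and seed below are my own illustrative choices (NumPy assumed).

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Monte Carlo principle: write the integral as an expectation, take a sample.
    # Example: integral of exp(-x^2) over [0,1] = E[exp(-U^2)] for U ~ Uniform(0,1).
    u = rng.uniform(0.0, 1.0, size=n)
    mc_estimate = np.mean(np.exp(-u**2))

    # Importance sampling: draw from a different distribution q, then weight each
    # draw by p/q.  Here p is a standard normal, q is a wider normal (sd = 2),
    # and we estimate E_p[X^2], which is exactly 1.
    x_q = rng.normal(0.0, 2.0, size=n)
    p_density = np.exp(-x_q**2 / 2) / np.sqrt(2 * np.pi)
    q_density = np.exp(-x_q**2 / 8) / np.sqrt(2 * np.pi * 4)
    is_estimate = np.mean(x_q**2 * (p_density / q_density))

    # Metropolis algorithm: sample from a density known only up to a constant
    # (here an unnormalized standard normal) with a Gaussian random-walk proposal.
    def log_target(x):
        return -x**2 / 2

    x, chain = 0.0, np.empty(n)
    for t in range(n):
        proposal = x + rng.normal(0.0, 1.0)
        # accept with probability min(1, target(proposal) / target(current))
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        chain[t] = x

    print(mc_estimate, is_estimate, chain.mean(), chain.var())

The first number should be near 0.7468, the second near 1, and the chain's mean and variance near 0 and 1, up to Monte Carlo error.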

Readings: Handouts on Markov Chains and Monte Carlo, and on Markov Chain Monte Carlo

Optional readings: Charles Geyer, "Practical Markov Chain Monte Carlo", Statistical Science 7 (1992): 473--483; "One Long Run"; "Burn-In is Unnecessary"; "On the Bogosity of MCMC Diagnostics".

Update, 22 December: If you do read Geyer, it's also worth reading two posts by Andrew Gelman (A Centipede Many Times Over and A Tale of Two Discussion Papers), and Gelman and Rubin's "Inference from Iterative Simulation Using Multiple Sequences" (Statistical Science 7 (1992): 457--472). Thanks to Andy for reminding me (politely!) about these pieces.


Posted at October 21, 2013 13:58 | permanent link
