Lecture 16: The Monte Carlo principle for numerical integrals: write your integral as an expectation, take a sample. Examples. Importance sampling: draw from a distribution other than the one you really want, then weight the sampled values. Markov chain Monte Carlo for sampling from a distribution we do not completely know: the Metropolis algorithm. Gibbs sampling. Bayesian inference via MCMC.
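A minimal sketch of the first two ideas above: the Monte Carlo principle (write the integral as an expectation and average over a sample) and importance sampling (draw from a convenient proposal, then reweight by the ratio of densities). The particular integrals and the proposal distribution here are illustrative choices, not from the lecture.

```python
import random
import math

random.seed(42)

# Plain Monte Carlo: estimate I = integral of exp(-x^2) over [0, 1] by
# writing it as E[exp(-U^2)] for U ~ Uniform(0, 1) and averaging.
n = 100_000
plain = sum(math.exp(-random.random() ** 2) for _ in range(n)) / n

# Importance sampling: estimate the tail probability P(X > 3) for
# X ~ N(0, 1). Draws from N(0, 1) almost never land past 3, so instead
# draw from a proposal N(3, 1) and weight each draw by p(x) / q(x).
def p(x):  # target density: standard normal
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def q(x):  # proposal density: normal centered at 3
    return math.exp(-(x - 3) ** 2 / 2) / math.sqrt(2 * math.pi)

draws = [random.gauss(3, 1) for _ in range(n)]
tail = sum((x > 3) * p(x) / q(x) for x in draws) / n

print(plain)  # close to the true value, sqrt(pi)/2 * erf(1) ~ 0.7468
print(tail)   # close to P(Z > 3) ~ 0.00135
```

The payoff of the reweighting is variance reduction: a naive estimate of P(X > 3) from 100,000 standard-normal draws would see only a handful of exceedances, while the shifted proposal puts half its draws in the region that matters.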
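The Metropolis algorithm mentioned above can likewise be sketched in a few lines: a random-walk proposal plus an accept/reject step in which the unknown normalizing constant cancels. The target here (an unnormalized standard normal) and the step size are illustrative assumptions, chosen so the output can be checked against known moments.

```python
import random
import math

random.seed(0)

# Target density known only up to a constant: proportional to exp(-x^2/2),
# i.e. a standard normal whose normalizer we pretend not to know.
def unnorm(x):
    return math.exp(-x * x / 2)

def metropolis(n_steps, step=1.0, x0=0.0):
    x, chain = x0, []
    for _ in range(n_steps):
        prop = x + random.gauss(0, step)  # symmetric random-walk proposal
        # Accept with probability min(1, target(prop)/target(x)); the
        # normalizing constant cancels in the ratio, which is the point.
        if random.random() < unnorm(prop) / unnorm(x):
            x = prop
        chain.append(x)  # on rejection, the chain repeats the old state
    return chain

chain = metropolis(50_000)
burned = chain[5_000:]  # discard burn-in before taking averages
mean = sum(burned) / len(burned)
var = sum((v - mean) ** 2 for v in burned) / len(burned)
print(mean, var)  # should be near 0 and 1 for the standard normal target
```

Note that successive states are correlated, so the effective sample size is well below the chain length; this is exactly the kind of issue the multiple-chains diagnostics discussed in the update below are meant to catch.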
Update, 22 December: If you do read Geyer, it's also worth reading two posts by Andrew Gelman (A Centipede Many Times Over and A Tale of Two Discussion Papers), and Gelman and Rubin's "Inference from Iterative Simulation Using Multiple Sequences" (Statistical Science 7 (1992): 457--472). Thanks to Andy for reminding me (politely!) about these pieces.
Posted at October 21, 2013 13:58 | permanent link