Markov chain Monte Carlo

Content for Tuesday, October 17, 2023

Content

  • Efficient proposals for Metropolis–Hastings and tuning
  • Gibbs sampling and data augmentation
  • Bayesian workflow and diagnostics for MCMC
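To make the first topic concrete, here is a minimal sketch of the Metropolis-adjusted Langevin algorithm (MALA), one of the gradient-informed proposals covered in this chapter. The course material uses R; this Python/NumPy version, the step size `eps`, and the standard normal example target are illustrative assumptions only.

```python
import numpy as np

def mala(log_post, grad_log_post, x0, n_iter=5000, eps=0.9, seed=0):
    """Metropolis-adjusted Langevin algorithm, one-dimensional sketch.

    The proposal drifts along the gradient of the log target and is
    corrected by a Metropolis-Hastings acceptance step.
    """
    rng = np.random.default_rng(seed)
    x = x0
    chain = np.empty(n_iter)
    n_accept = 0
    for i in range(n_iter):
        # Langevin proposal: gradient drift plus Gaussian noise
        mean_fwd = x + 0.5 * eps**2 * grad_log_post(x)
        prop = mean_fwd + eps * rng.standard_normal()
        # Mean of the reverse proposal, needed because q is asymmetric
        mean_bwd = prop + 0.5 * eps**2 * grad_log_post(prop)
        log_alpha = (log_post(prop) - log_post(x)
                     - (x - mean_bwd)**2 / (2 * eps**2)
                     + (prop - mean_fwd)**2 / (2 * eps**2))
        if np.log(rng.uniform()) < log_alpha:
            x = prop
            n_accept += 1
        chain[i] = x
    return chain, n_accept / n_iter

# Illustrative target: standard normal, log density -x^2/2 up to a constant
chain, accept_rate = mala(lambda x: -0.5 * x**2, lambda x: -x, x0=0.0)
```

Tuning amounts to choosing `eps` so the acceptance rate sits near the values recommended in the course notes; too small a step gives a slowly mixing chain, too large a step gets most proposals rejected.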

Learning objectives

At the end of the chapter, students should be able to

  • implement MALA and Gibbs sampling
  • derive the conditional distributions of a model for Gibbs sampling
  • diagnose performance of MCMC algorithms and implement potential remedies
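For the second objective, deriving full conditionals is easiest to see on a toy model. The sketch below, an assumption for illustration rather than course code, runs a Gibbs sampler for a bivariate normal with correlation `rho`, whose full conditionals are known in closed form: x | y ~ N(rho*y, 1 - rho^2), and symmetrically for y.

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_iter=5000, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each update draws one coordinate from its full conditional
    given the current value of the other coordinate.
    """
    rng = np.random.default_rng(seed)
    cond_sd = np.sqrt(1.0 - rho**2)  # conditional standard deviation
    x, y = 0.0, 0.0
    draws = np.empty((n_iter, 2))
    for i in range(n_iter):
        x = rho * y + cond_sd * rng.standard_normal()  # x | y
        y = rho * x + cond_sd * rng.standard_normal()  # y | x
        draws[i] = x, y
    return draws

draws = gibbs_bivariate_normal(rho=0.5)
```

The example also illustrates the diagnostics objective: the stronger the correlation `rho`, the higher the autocorrelation of the chain and the lower its effective sample size, which is exactly the behaviour trace plots and effective-sample-size diagnostics are meant to reveal.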

Readings

Warning

These readings should be completed before class to ensure timely understanding and to let us discuss the concepts together through various examples and case studies; the course notes are the strict minimum.

Complementary readings

Warning

Complementary readings are additional, non-required sources of information that may serve as useful substitutes. They sometimes go beyond the scope of what we cover and provide more detail.

  • Albert (2009), chapters 6 and 10 (several examples)
  • McElreath (2020), chapter 9 (non-technical)

Slides


References

Albert, J. (2009). Bayesian computation with R (2nd ed.). Springer. https://doi.org/10.1007/978-0-387-92298-0
Geyer, C. J. (2011). Introduction to Markov chain Monte Carlo. In Handbook of Markov chain Monte Carlo. CRC Press. https://doi.org/10.1201/b10905-3
McElreath, R. (2020). Statistical rethinking: A Bayesian course with examples in R and Stan (2nd ed.). Chapman & Hall/CRC.