Expectation propagation

Content

  • Newton method and sequences of Gaussian approximations
  • Expectation propagation (EP)

Learning objectives

At the end of the chapter, students should be able to

  • implement expectation propagation for simple problems
  • explain the differences between EP, the Laplace approximation, and variational inference
  • identify the circumstances under which EP can become unstable

Readings

  • Dehaene & Barthelmé (2018), sections 1–2
  • Section 10.7 of Bishop (2006)
  • Cseke & Heskes (2011)

Complementary readings

Slides


Code

  • R script of expectation propagation for logistic regression
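As a complement to the linked R script, the following is a minimal Python sketch of EP for a deliberately simple setting: a scalar Bayesian probit model (prior w ~ N(0, 1), likelihood Φ(y_i x_i w)), where the moment-matching step has a closed form. This is a hypothetical illustration written for this page, not the course script; the function name `ep_probit` and all parameters are assumptions.

```python
import math

def pdf(z):   # standard normal density
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def cdf(z):   # standard normal distribution function, via erf
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def ep_probit(x, y, prior_var=1.0, n_sweeps=50):
    """EP for the scalar probit model: w ~ N(0, prior_var),
    p(y_i | w) = cdf(y_i * x_i * w).  Each likelihood site is replaced
    by an unnormalized Gaussian with natural parameters (tau_i, nu_i);
    the global approximation is the prior times the product of sites."""
    n = len(x)
    tau_site = [0.0] * n          # site precisions
    nu_site = [0.0] * n           # site precision-times-mean terms
    tau0 = 1.0 / prior_var        # prior precision (prior mean is 0)
    for _ in range(n_sweeps):
        for i in range(n):
            tau = tau0 + sum(tau_site)
            nu = sum(nu_site)
            # cavity distribution: remove site i from the approximation
            tau_cav = tau - tau_site[i]
            nu_cav = nu - nu_site[i]
            m, v = nu_cav / tau_cav, 1.0 / tau_cav
            # moment-match the tilted distribution cdf(c*w) * N(w; m, v)
            c = y[i] * x[i]
            denom = math.sqrt(1.0 + c * c * v)
            z = c * m / denom
            r = pdf(z) / cdf(z)
            m_hat = m + c * v * r / denom
            v_hat = v - (c * c * v * v * r) / (1.0 + c * c * v) * (z + r)
            # convert the matched moments back into a site update
            tau_site[i] = 1.0 / v_hat - tau_cav
            nu_site[i] = m_hat / v_hat - nu_cav
    tau = tau0 + sum(tau_site)
    nu = sum(nu_site)
    return nu / tau, 1.0 / tau    # posterior mean and variance of w
```

With all labels positive (e.g. `ep_probit([1.0, 2.0, 0.5], [1, 1, 1])`) the posterior mean comes out positive and the posterior variance smaller than the prior variance, as the log-concave probit sites can only add precision.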

References

Bishop, C. M. (2006). Pattern recognition and machine learning (information science and statistics). Springer-Verlag.
Cseke, B., & Heskes, T. (2011). Approximate marginals in latent Gaussian models. Journal of Machine Learning Research, 12(13), 417–454. http://jmlr.org/papers/v12/cseke11a.html
Dehaene, G., & Barthelmé, S. (2018). Expectation propagation in the large data limit. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 80(1), 199–217. https://doi.org/10.1111/rssb.12241
Minka, T. P. (2001). A family of algorithms for approximate Bayesian inference [PhD thesis, Massachusetts Institute of Technology]. http://hdl.handle.net/1721.1/86583