Intermediate Bayesian Modeling builds on the material from STATS 477 / 577 by providing a deeper exploration of Bayesian inference and Monte Carlo methods. This course includes proofs as well as more detailed mathematical treatments of key Bayesian results. The Metropolis-Hastings algorithm is introduced, and students are guided through programming their own sampling algorithms.
Room: Science & Math Learning Center, Rm. 356
Time: Tuesday & Thursday, 2:00pm – 3:15pm
Prerequisites: STATS 477 / 577 – Introduction to Bayesian Modeling
Syllabus: fa19syllabus.pdf
The material here is derived from a series of lectures De Finetti gave at the Henri Poincaré Institute (IHP) in 1935. De Finetti's Theorem is expounded and proven in Chapter 3, with further discussion of exchangeability continuing in Chapters 4 and 5. Earlier, in Chapters 1 and 2, De Finetti gives his own development of the notion of probability, which makes for entertaining reading but is not of critical importance for this class. Chapter 6 provides a philosophical summation of De Finetti's perspective and of how this work fits into it.
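For reference, here is the standard modern statement of the theorem's binary case, in notation closer to what we use in class than to De Finetti's own:

```latex
% De Finetti's representation theorem (binary case): an infinite
% exchangeable 0/1 sequence is a mixture of i.i.d. Bernoulli sequences.
Let $X_1, X_2, \dots$ be an infinite exchangeable sequence of
$\{0,1\}$-valued random variables. Then there exists a probability
measure $\mu$ on $[0,1]$ such that, for every $n$ and every
$(x_1, \dots, x_n) \in \{0,1\}^n$, with $s = \sum_{i=1}^{n} x_i$,
\[
  P(X_1 = x_1, \dots, X_n = x_n)
    = \int_0^1 \theta^{s} (1 - \theta)^{\,n - s} \, d\mu(\theta).
\]
```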
Chapter 3 of my dissertation includes a review of many of the model selection tools we've discussed in class, including a proof of the asymptotic equivalence of DIC and AIC under certain conditions. If you're looking for a good textual review of what I've done in lecture, this should work well.
UIowa's Joe Cavanaugh is an expert on model selection and information criteria. He is particularly good at distilling difficult mathematical arguments into easy-to-follow derivations. In the papers above, he lays out the detailed theoretical justifications for AIC and BIC in a way that should be accessible to well-prepared students of statistics.
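For quick reference alongside these papers, the standard forms of the two criteria, with $k$ estimated parameters, sample size $n$, and maximized likelihood $L(\hat{\theta})$, are:

```latex
% Standard forms of the two criteria; smaller is better in both cases.
\[
  \mathrm{AIC} = -2 \log L(\hat{\theta}) + 2k,
  \qquad
  \mathrm{BIC} = -2 \log L(\hat{\theta}) + k \log n.
\]
% AIC estimates expected Kullback--Leibler discrepancy; BIC arises from
% a Laplace approximation to -2 times the log marginal likelihood.
```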
Celeux et al. (2006) provide an in-depth discussion of how DIC can be operationalized for missing data models (including mixed models). Aside from its application to DIC, this is a good read purely for how it asks the reader to think harder about the deeper structure of missing data models.
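For orientation, the baseline quantities Celeux et al. modify are those of Spiegelhalter et al. (2002). Note that under the regularity conditions mentioned above, $p_D$ approaches the parameter count $k$, which is why DIC behaves like AIC asymptotically:

```latex
% Deviance Information Criterion (Spiegelhalter et al., 2002), with
% deviance D(theta) = -2 log p(y | theta) and bars denoting posterior means.
\[
  p_D = \overline{D(\theta)} - D(\bar{\theta}),
  \qquad
  \mathrm{DIC} = \overline{D(\theta)} + p_D
               = 2\,\overline{D(\theta)} - D(\bar{\theta}).
\]
```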
Chi Feng, a research assistant at MIT's computational design laboratory, created an interactive gallery for visualizing different Monte Carlo sampling algorithms. Many of the algorithms demonstrated here are ones we haven't talked about in class, but this can help us visualize the behavior of the Metropolis algorithm as well as Hamiltonian Monte Carlo.
One of my Bayes students, Mustafa Salman, forked Chi Feng's original code and added a rejection-rate display in the top-left corner. This lets us view the same demonstrations while also seeing how changes to the tuning parameters affect the rejection rate, which is useful for understanding optimality issues. (GitHub code available here)
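If you want to connect the demos to your own code, below is a minimal random-walk Metropolis sketch in Python that also tracks the rejection rate, in the spirit of the modified demo. This is my own illustration, not code from either repository; the standard-normal target and the `step_size` default are chosen just to make the tuning effect visible.

```python
import numpy as np

def metropolis(log_target, x0, n_samples=10_000, step_size=1.0, seed=0):
    """Random-walk Metropolis; returns the samples and the rejection rate."""
    rng = np.random.default_rng(seed)
    x, log_p = x0, log_target(x0)
    samples = np.empty(n_samples)
    rejections = 0
    for i in range(n_samples):
        proposal = x + step_size * rng.normal()   # symmetric Gaussian proposal
        log_p_prop = log_target(proposal)
        # Accept with probability min(1, target(proposal) / target(x)).
        if np.log(rng.uniform()) < log_p_prop - log_p:
            x, log_p = proposal, log_p_prop
        else:
            rejections += 1
        samples[i] = x
    return samples, rejections / n_samples

# Standard normal target; try step_size = 0.1, 2.4, and 20 and watch the
# rejection rate respond, just as in the demo's top-left display.
samples, rej_rate = metropolis(lambda x: -0.5 * x**2, x0=0.0, step_size=2.4)
print(f"rejection rate: {rej_rate:.2f}")
```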
Homework 1 (tex) – Due 26 September (Solutions available)
Homework 2 (tex) – Due 17 October (Solutions available)
Homework 3 (tex) – Due 7 November (Solutions available)
Homework 4 (tex) – Not Assigned (Solutions available)
Instructions (tex) – Due 13 December by 12pm (noon)
citations.csv – Data file for final project
Office: SMLC 328
Fall 2019 Office Hours: