Math 459: Bayesian Statistics
Spring 2016

Instructor: Todd Kuffner

Grader: Wei Wang

Lecture: Tuesday and Thursday, 11:30am-1:00pm, Psychology 249

Office Hours: Monday 3:00-4:00pm, Tuesday/Thursday 1:05-2:00pm in Room 18, Cupples I

Course Overview: This course introduces Bayesian statistical theory and practice. The material will be presented at a level suitable for advanced undergraduate and master's degree students. Topics include: foundations and principles of Bayesian inference, comparisons with frequentist procedures, prior specification, selected computational methods (Markov Chain Monte Carlo), empirical Bayes, Bayesian linear regression and Bayesian model selection. Time permitting, additional topics may be selected by the instructor, such as approximate Bayesian computation or Bayesian nonparametric inference. Emphasis will be given to applications using R.

Prerequisite: It is assumed that students are already familiar with probability at the level of Math 493, and have learned the core concepts of statistical inference. The latter requirement is ideally satisfied by Math 494, but other courses are acceptable. Familiarity with R is essential. A course in computer programming would be helpful. Knowledge of multivariate calculus and linear algebra at the level of Math 233 and Math 309, respectively, is assumed.

Piazza: Make sure to enroll in this course on Piazza.

Textbook: You are encouraged, but not required, to obtain a copy of:

Computing: Familiarity with R is required. Many tutorials are available on the R Project website: on the left side under Documentation, select Contributed to see a list of contributed tutorials. A list of Bayesian R packages is maintained in the CRAN Bayesian task view.

Grades: Homework 30%, each Midterm 20%, Final 30%

Exams: 2 midterms and 1 final.

Homework: The lowest homework grade will be dropped. Assignments will include applied (using R), theoretical, and philosophical (essay-based) problems. I expect to assign roughly one homework for every three lectures. Homework is due at the beginning of class on the specified due date. All homework must be submitted to the instructor to receive credit; homework submitted to the grader will not be accepted without prior approval. See the policy on late homework below.

Final Course Grade: The letter grade for the course will be determined according to the final numerical grade on a 0-100 scale:

A+: [98, 100]    A: [93, 98)    A-: [90, 93)
B+: [87, 90)     B: [83, 87)    B-: [80, 83)
C+: [77, 80)     C: [73, 77)    C-: [70, 73)
D+: [67, 70)     D: [63, 67)    D-: [60, 63)

Course Schedule: Future topics are tentative and subject to change; the schedule will be updated to reflect the actual topics covered.
Week 1
Theme: What is Bayesian inference?
Principles and Examples of Bayesian Methods;
Review of MLE; Bayesian estimation for scalar parameter models
R examples: binomial and exponential with conjugate priors
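As a preview of the Week 1 conjugate example: with a binomial likelihood and a Beta prior, the posterior is again a Beta distribution with updated parameters. A minimal sketch (shown in Python for self-containment; course assignments use R, and the prior parameters and data here are purely illustrative):

```python
# Beta-Binomial conjugacy: prior Beta(a, b) plus binomial data with
# `successes` successes and `failures` failures gives posterior
# Beta(a + successes, b + failures).

def beta_binomial_posterior(a, b, successes, failures):
    """Return the posterior Beta parameters after observing binomial data."""
    return a + successes, b + failures

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Illustrative numbers: prior Beta(2, 2); data: 7 successes in 10 trials.
a_post, b_post = beta_binomial_posterior(2, 2, successes=7, failures=3)
# Posterior is Beta(9, 5); its mean 9/14 sits between the prior mean (1/2)
# and the sample proportion (7/10), as conjugate updating predicts.
print(a_post, b_post, beta_mean(a_post, b_post))
```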
Week 2
Theme: What is Bayesian inference?
MLE and Bayesian estimation for vector-parameter models; likelihood inference; marginal posterior; interval estimates: credible sets

Week 3
Theme: Why Bayesian inference?
Philosophy of science; falsifiability; inductive reasoning; interpretations of probability; current controversies (Ioannidis on why most published research findings are false, the backlash against null-hypothesis significance testing)

Decision theory; components of statistical decision problems; risk; loss functions; criteria for optimal decision rules; admissibility, minimaxity, unbiasedness, Bayes risk
Week 4
Theme: Why Bayesian inference?
Deriving Bayes rules from principles of decision theory; Bayes and admissibility/minimaxity; least favorable priors; HPD intervals; Stein's paradox; James-Stein estimation; empirical Bayes interpretation
Week 5
Theme: Specifying the prior.
Conjugacy; objective Bayes; empirical Bayes; invariance and Jeffreys prior; Kullback-Leibler divergence and the reference prior; probability matching priors; background on Fisher information, orthogonality, prior and posterior independence, exchangeability
Week 6
Theme: Asymptotic Analysis
Review of deterministic concepts; stochastic convergence; stochastic orders of magnitude; Khintchine's WLLN and Kolmogorov's SLLN; classical CLT; continuous mapping and Lévy continuity theorems; uniform convergence; Pólya's theorem; Berry-Esseen theorem; Scheffé's lemma

Midterm Exam 1
Week 7
Theme: Large Sample Properties for Parametric Bayes
Review of likelihood asymptotics; multivariate normal distribution; posterior consistency; Bernstein-von Mises theorem; sufficiency, conditionality and likelihood principles
Week 8
Theme: Motivations & Tools for Approximate Bayesian Inference
Liouville's theorem with bits of complex analysis (holomorphic, meromorphic functions, special functions); Risch algorithm; Monte Carlo methods; random number generation; importance/rejection sampling
Week 9
Spring Break
Week 10
Theme: Computation
Gibbs sampling, Metropolis-Hastings; reversible jump; convergence diagnostics
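To give a flavor of the samplers covered this week, here is a hedged sketch of random-walk Metropolis-Hastings (in Python for self-containment; course work uses R, and the standard-normal target, step size, and iteration count are arbitrary illustrative choices):

```python
import math
import random

def metropolis_hastings(log_target, x0, n_iter, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings with symmetric Gaussian proposals.

    Because the proposal is symmetric, the Hastings ratio reduces to the
    ratio of target densities; we accept with probability
    min(1, target(x') / target(x)), computed on the log scale.
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_iter):
        proposal = x + rng.gauss(0.0, step)
        log_ratio = log_target(proposal) - log_target(x)
        if rng.random() < math.exp(min(0.0, log_ratio)):
            x = proposal
        samples.append(x)
    return samples

# Toy target: standard normal log-density (up to an additive constant).
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_iter=20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
# mean should be near 0 and var near 1 for this target.
```

Convergence diagnostics (trace plots, effective sample size) would be applied to `samples` in practice; this sketch omits burn-in and thinning for brevity.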
Week 11
Theme: Linear regression.
Prior specification, estimation, inference, model diagnostics and model comparison
Week 12
Theme: Model Comparison and Hypothesis Testing
Bayes factors; Laplace approximation; MCMC estimation of marginal likelihood; relationship to classical approaches

Midterm Exam 2
Week 13
Theme: Generalized linear models.
Review of posterior predictive distributions; generalized linear models; Bayesian binomial and Poisson regression in rstan
Week 14
Theme: Hierarchical linear models.
Hierarchical linear models; empirical Bayes connections; random and mixed effects models
Week 15
Theme: Bayesian nonparametric models.
Concepts; prior specification; Dirichlet process; stick-breaking representation
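The stick-breaking representation listed above can be sketched in a few lines: draw v_k ~ Beta(1, alpha) and set w_k = v_k * prod_{j<k} (1 - v_j). A minimal truncated version (in Python for self-containment; the concentration parameter and truncation level below are arbitrary illustrative choices):

```python
import random

def stick_breaking_weights(alpha, n_atoms, seed=0):
    """Truncated stick-breaking construction of Dirichlet process weights.

    Each step breaks off a Beta(1, alpha)-distributed fraction of the
    remaining stick; smaller alpha concentrates mass on the first few atoms.
    """
    rng = random.Random(seed)
    remaining = 1.0
    weights = []
    for _ in range(n_atoms):
        v = rng.betavariate(1.0, alpha)
        weights.append(remaining * v)
        remaining *= 1.0 - v
    return weights

w = stick_breaking_weights(alpha=2.0, n_atoms=50)
# Weights are positive and sum to just under 1; the truncation leaves the
# (tiny) unallocated mass `remaining` on the discarded tail.
```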
Reading Period
Office Hours by appointment.
Final Exam
See Piazza for details.

Other Course Policies: Students are encouraged to review the Faculty of Arts & Sciences policies.