Impact Evaluation (PPHS 617, Winter 2020)

Co-taught with Arijit Nandi


This course will cover methods for estimating the effects of social interventions on health outcomes. We will introduce and provide the intuition for conducting impact evaluation studies in public and population health and discuss recent developments. We will define causal policy effects within the potential outcomes framework and contrast methods for describing the association between an intervention and a health outcome with methods for estimating causal effects. We will introduce and formally define policy-relevant research questions based on specific causal contrasts. The course format will be a mixture of didactic lectures, in-class discussions of published evaluations, homework assignments, and student presentations. In the lecture component of the class, we will introduce approaches for estimating the effects of social interventions, classified broadly as experimental or observational (quasi-experimental). The section on experimental research will cover the use of randomized trials and cluster randomized trials for impact evaluation. The section on observational designs will describe quasi-experimental techniques, including interrupted time series, difference-in-differences, instrumental variables, synthetic control, and regression discontinuity designs.
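As a minimal illustration of a causal contrast under the potential outcomes framework (this is a hypothetical simulation, not part of the assigned material): each unit has two potential outcomes, Y(0) and Y(1), but we observe only one, and randomization lets the difference in observed group means identify the average treatment effect. The outcome scale and the true effect of +5 below are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Potential outcomes: Y(0) without the intervention, Y(1) with it.
y0 = rng.normal(50, 10, n)   # hypothetical baseline health score
y1 = y0 + 5                  # assumed true effect of the intervention: +5

# Randomly assign treatment; only one potential outcome is observed per unit.
t = rng.integers(0, 2, n)
y_obs = np.where(t == 1, y1, y0)

# Under randomization, the difference in observed means estimates the ATE.
ate_hat = y_obs[t == 1].mean() - y_obs[t == 0].mean()
print(round(ate_hat, 2))  # close to the true average treatment effect of 5
```

The same contrast cannot be computed unit by unit, since each unit's unobserved potential outcome is missing; randomization is what makes the group-mean comparison causal.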

Reference text

There is no required text; readings will be assigned for each session. Several readings will come from Impact Evaluation in Practice (2nd Edition), which is downloadable from:

Course format

The course will be a combination of didactic lectures, practicums, presentations, and roundtable sessions. The reading load will be substantial, as we aim to expose you to the diverse set of conceptual and methodological issues relevant to impact evaluation.


Attendance/participation: 20%

Course attendance is mandatory. Your grade will be based on participation in general class activities, practicums, discussions of readings, and debate. Part of your participation will depend on your preparation. This means showing up for each class having completed the assigned readings. This is especially true for practicum sessions: students are expected to begin the practicum assignments, including any readings, in advance of the session. In class, participation means asking questions about anything in the readings, lectures, or discussion that seems unclear or objectionable, offering arguments and responses, respectfully listening to the arguments and responses of others, and generally being a good citizen.

Student protocol and presentation: 50% (40% for written protocol and 10% for presentation)

Over the course of the term, students will work in groups of 2-3 to develop a protocol for a pragmatic intervention study and evaluation. For this project you will work with your group to develop an impact evaluation study designed to estimate the effect of a public policy intervention on a health-related outcome. You should draw on the course material and apply one of the evaluation methods covered. The design of your evaluation should be original, meaning the intervention or the evaluation method must be novel; you cannot replicate an existing impact evaluation (simply changing the outcomes in an existing evaluation study is not sufficient). For the written component, each group will submit an evaluation protocol using guidelines that will be provided. Only one submission is allowed per group, and we expect group members to contribute equally to the project. Each student will also participate in a short group presentation of their protocol and will answer questions from their peers and the instructors. Students will be evaluated on their ability to demonstrate mastery of the core conceptual and methodological material presented in class, and on their ability to communicate their ideas and respond to critiques of their work.

Homework assignments: 30%

Advanced statistical techniques and their assumptions become clearer with hands-on experience and practice. To provide this, we will give two homework assignments during the term that involve data analysis and interpretation of results. You can use any software package to complete the assignments, and we will provide technical support as best we can. You are allowed to work in groups of up to three, with one submission per group.
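To give a flavor of the kind of analysis the assignments involve, here is a minimal simulated sketch of one method covered in the course, difference-in-differences. All numbers (group sizes, baseline levels, the true policy effect of +3) are invented for illustration; the assignments themselves will use data and guidelines provided in class.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Hypothetical two-group, two-period setting.
group = rng.integers(0, 2, n)   # 1 = treated group, 0 = control group
post = rng.integers(0, 2, n)    # 1 = post-intervention period

# Outcome with a group difference, a common time trend, and a true
# policy effect of +3 for the treated group in the post period.
y = (40 + 2.0 * group + 1.5 * post
     + 3.0 * group * post
     + rng.normal(0, 5, n))

# Difference-in-differences:
# (treated post - treated pre) - (control post - control pre)
did = ((y[(group == 1) & (post == 1)].mean()
        - y[(group == 1) & (post == 0)].mean())
       - (y[(group == 0) & (post == 1)].mean()
          - y[(group == 0) & (post == 0)].mean()))
print(round(did, 2))  # close to the true policy effect of 3
```

The estimator removes both the fixed group difference and the common time trend, recovering the policy effect under the parallel-trends assumption that the course will discuss in detail.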

Detailed course outline

Readings and additional resources for each session will be posted on the course website:


6 Jan: Lecture 1. Why evaluate? (Nandi and Harper)
8 Jan: Lecture 2. Counterfactuals and causal policy effects (Nandi)

Module 1. Experimental methods for impact evaluation

13 Jan: Lecture 3. Introduction to Randomized Controlled Trials (RCTs) (Harper)
15 Jan: Lecture 4. Randomization (Nandi)
20 Jan: Lecture 5. Cluster RCTs (Nandi)
22 Jan: Lecture 6. Study procedures (Nandi)
27 Jan: Practicum 1. Design of RCTs
29 Jan: Lecture 7. Analysis of RCTs (Harper)
3 Feb: Practicum 2. Analysis of RCTs *Homework 1 assigned
5 Feb: Lecture 8. Threats to validity and limitations of RCTs (Harper)

Module 2. Quasi-experimental methods for impact evaluation

10 Feb: Research-to-policy roundtable
12 Feb: Lecture 9. Introduction to quasi-experimental designs (Harper)
17 Feb: Lecture 10. Pre-post and interrupted time series designs (Nandi)
19 Feb: Practicum 3. Pre-post and interrupted time series designs
24 Feb: Lecture 11. Difference-in-differences (Harper)
26 Feb: Practicum 4. Difference-in-differences
2 Mar: No class (Winter break)
4 Mar: No class (Winter break)
9 Mar: Lecture 12. Synthetic controls (Nandi)
11 Mar: Practicum 5. Synthetic controls
16 Mar: Lecture 13. Instrumental variables (Harper)
18 Mar: Practicum 6. Instrumental variables
23 Mar: Lecture 14. Regression discontinuity (Nandi)
25 Mar: Practicum 7. Regression discontinuity
30 Mar: Lecture 15. Reporting, research dissemination, and recap (Harper)


1 Apr: Presentation of student protocols (I)
6 Apr: Presentation of student protocols (II)
8 Apr: Presentation of student protocols (III)