Syllabus
Administration
• Rachel Glennerster, Executive Director, Abdul Latif Jameel Poverty Action Lab (JPAL)
• Marc Shotland, Associate Director of Training, Abdul Latif Jameel Poverty Action Lab (JPAL)
• Course website: https://www.edx.org/course/designing-running-randomized-evaluations-mitx-jpal102x
Course Description:
A randomized evaluation, also known as a randomized controlled trial (RCT), field experiment, or field trial, is a type of impact evaluation that uses random assignment to allocate resources, run programs, or apply policies as part of the study design. This course provides step-by-step training on how to design and conduct an RCT for social programs. You will learn why and when to conduct RCTs and the key components of a well-designed RCT. In addition, the course offers practical guidance on implementing an RCT in the field, including questionnaire design, piloting, quality control, and data collection and management. It also covers common practices for ensuring research transparency.
Students who are taking this class in pursuit of the MicroMasters credential will also have to
take an in-person, proctored exam.
To be eligible to register for the proctored exam, you first have to pass the online component of
this class on edX. Your final MicroMasters course grade will be calculated as follows:
- edX course: 40%
- Proctored exam: 60%
Academic Honesty:
We take academic honesty very seriously. You must complete all graded materials by yourself
and not engage in any activity that would dishonestly improve your results, or improve or hurt
the results of others. You may use only one user account and not let anyone else use your
username and/or password. Having two user accounts registered in this course constitutes
cheating. We will be monitoring this diligently. Should we become aware of any suspicious
activity, we reserve the right to remove course credit, not award a MicroMasters certificate,
revoke a MicroMasters certificate, ban you from future MITx Economics classes, and exclude
you from consideration for admission to the MIT blended Master’s program in Data, Economics,
and Development Policy without warning.
test your understanding of the material. You will also have to complete a minimum of one assignment per week (a case study, an exercise, or a set of quiz questions). You will have access to the presentation slides to follow along during the lectures. Expect to spend approximately 8–12 hours per week watching the lectures, doing the readings, and completing the assignments.
Week 4: February 27, 2017
7. Sampling and Randomization
• Lecture sequence
• Problem set: Quiz questions
• Optional Reading: The Power of Survey Design, Chapter 4
8. Sample Size and Power
• Lecture sequence
• Reading: Running Randomized Evaluations, Chapter 6
• Problem set: Quiz questions
9. Practical Tips: Sampling and Sample Size
• Lecture sequence
• Readings: Berry et al. (2016); Bruhn and McKenzie (2008); Crepon et al. (2014)
• Problem set: ‘Calculating Power using Statistical Software’ Exercise and Quiz questions
• Optional Reading: Imbens (2011)
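The power-calculation exercise above uses the course's own statistical-software materials. As a rough outside sketch of the computation it refers to, the standard two-sample formula for the required sample size at a given minimum detectable effect (expressed in standard-deviation units) can be written in Python; the function name and default values here are illustrative, not the course's.

```python
import math
from statistics import NormalDist

def sample_size_per_arm(mde_sd, alpha=0.05, power=0.80):
    """Minimum sample size per arm for a two-sided, two-sample test of
    means with equal allocation. mde_sd is the minimum detectable
    effect in standard deviations (illustrative defaults: 5% test,
    80% power)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value of the test
    z_beta = NormalDist().inv_cdf(power)           # quantile for desired power
    n = 2 * (z_alpha + z_beta) ** 2 / mde_sd ** 2
    return math.ceil(n)

# Halving the detectable effect roughly quadruples the required sample.
print(sample_size_per_arm(0.2))  # ~393 units per arm for a 0.2 SD effect
print(sample_size_per_arm(0.1))  # ~1570 units per arm for a 0.1 SD effect
```

The inverse-square relationship between the minimum detectable effect and the sample size is the central practical lesson of a power calculation: detecting effects half as large costs four times the sample.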
• Readings: McKenzie and Woodruff (2016); Woodruff et al. (2007)
• Problem set: Quiz questions
15. Measuring Networks
• Lecture sequence
• Problem set: Quiz questions
• Problem set: Quiz questions
24. Collecting High-Quality Data: Accurate Data
• Lecture sequence
• Problem set: ‘Back checks and High frequency checks using Statistical Software’ Exercise
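The back-check and high-frequency-check exercise above uses the course's own materials. As a generic illustration of what automated high-frequency checks do, one might scan each day's incoming survey submissions for duplicate IDs, out-of-range values, and missing data, as in this Python/pandas sketch; all column names and thresholds below are hypothetical, not from the course.

```python
import pandas as pd

def high_frequency_checks(df, id_col, range_checks):
    """Return a dict of data-quality flags: duplicated respondent IDs,
    IDs with out-of-range values, and per-column missing rates.
    range_checks maps a column name to its plausible (low, high) bounds."""
    issues = {}
    dups = df[df.duplicated(subset=id_col, keep=False)]
    issues["duplicate_ids"] = dups[id_col].tolist()
    for col, (low, high) in range_checks.items():
        bad = df[(df[col] < low) | (df[col] > high)]
        issues[f"out_of_range_{col}"] = bad[id_col].tolist()
    issues["missing_rates"] = df.isna().mean().round(2).to_dict()
    return issues

# Hypothetical day of submissions: household 2 appears twice,
# one reported age is implausible, and one age is missing.
day1 = pd.DataFrame({"hhid": [1, 2, 2, 4],
                     "age": [34.0, 210.0, 45.0, None]})
print(high_frequency_checks(day1, "hhid", {"age": (0, 110)}))
```

Checks like these complement back checks, in which a subsample of respondents is re-surveyed and their answers compared against the original responses.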
References
Alderman, H., J. Das, and V. Rao. 2013. Conducting Ethical Economic Research: Complications from the Field. World Bank Policy Research Working Paper 6446. doi: 10.1596/1813-9450-6446
Berry, Jim, Marc Shotland, and Rukmini Banerji. 2016. The Impact of Maternal Literacy and Participation Programs: Evidence from a Randomized Evaluation in India. American Economic Journal: Applied Economics, forthcoming.
Bruhn, Miriam and David McKenzie. 2008. In Pursuit of Balance: Randomization in Practice in Development Economics. © World Bank. http://elibrary.worldbank.org/doi/abs/10.1596/1813-9450-4752 License: CC BY 3.0 IGO
Crepon, Bruno, Florencia Devoto, Esther Duflo, and William Pariente. 2014. Estimating the impact of microcredit on those who take it up: Evidence from a randomized experiment in Morocco. American Economic Journal: Applied Economics 7(1): 123–150.
Das, Jishnu and Tristan Zajonc. 2008. India Shining and Bharat Drowning: Comparing Two Indian States to the Worldwide Distribution in Mathematics Achievement. Policy Research Working Paper No. 4644. World Bank, Washington, DC. © World Bank. https://openknowledge.worldbank.org/handle/10986/6668 License: CC BY 3.0 Unported
Deaton, Angus and Salman Zaidi. 2002. Guidelines for Constructing Consumption Aggregates for Welfare Analysis. LSMS Working Paper No. 135. World Bank. © World Bank. https://openknowledge.worldbank.org/handle/10986/14101 License: CC BY 3.0 IGO
Dhaliwal, Iqbal, Esther Duflo, Rachel Glennerster, and Caitlin Tulloch. 2012. Comparative Cost-Effectiveness Analysis to Inform Policy in Developing Countries: A General Framework with Applications for Education. Abdul Latif Jameel Poverty Action Lab, MIT.
Feeney, Laura, Jason Bauman, Julia Chabrier, Geeti Mehra, and Michelle Woodford. 2015. Using Administrative Data for Randomized Evaluations. JPAL North America.
Gentzkow, Matthew and Jesse M. Shapiro. 2014. Code and Data for the Social Sciences: A
Practitioner’s Guide. University of Chicago, mimeo
Gertler, Paul J., Sebastian Martinez, Patrick Premand, Laura B. Rawlings, and Christel M. J.
Vermeersch. 2016. Impact Evaluation in Practice, second edition. Washington, DC: Inter-
American Development Bank and World Bank. doi:10.1596/978-1-4648-0779-4
Glennerster, Rachel. 2016. The Practicalities of Running Randomized Evaluations: Partnerships, Measurement, Ethics, and Transparency. In A. V. Banerjee and E. Duflo (Eds.), Handbook of Field Experiments, forthcoming.
Gneezy, Uri and Alex Imas. 2016. Lab in the Field: Measuring Preferences in the Wild. In A. V. Banerjee and E. Duflo (Eds.), Handbook of Field Experiments, forthcoming.
Grosh, Margaret and Paul Glewwe. 2000. Designing Household Survey Questionnaires for
Developing Countries: Lessons from 15 Years of the Living Standards Measurement Study.
Volumes 1, 2, and 3. The World Bank.
Imbens, Guido. 2011. Experimental Design for Unit and Cluster Randomized Trials. International Initiative of Impact Evaluations (3ie). Washington, DC. http://cyrussamii.com/wp-content/uploads/2011/06/Imbens_June_8_paper.pdf
Iarossi, Giuseppe. 2006. The Power of Survey Design: A User's Guide for Managing Surveys, Interpreting Results, and Influencing Respondents. Washington, DC: World Bank. © World Bank. https://openknowledge.worldbank.org/handle/10986/6975 License: CC BY 3.0 IGO
National Academy of Sciences. 2000. Biological and Clinical Data Collection in Population Surveys in Less Developed Countries. Summary of a meeting held by MEASURE Evaluation, January 24–25, 2000.
Olken, Benjamin A. 2015. Promises and Perils of Pre-Analysis Plans. Journal of Economic Perspectives 29(3): 61–80.
Pollock, H., E. Chuang, and S. Wykstra. 2015. Best Practices for Data and Code Management. Innovations for Poverty Action.
UNESCO Institute for Statistics (UIS) (2016). Understanding What Works in Oral Reading
Assessments: Recommendations from Donors, Implementers and Practitioners. Montreal:
UNESCO Institute for Statistics
Woodruff, Christopher, David McKenzie, and Suresh de Mel. 2007. Measuring Microenterprise Profits: Don’t ask how the sausage is made. World Bank Policy Research Working Paper 4229. doi: 10.1596/1813-9450-4229