May 2012
Dear Colleague,
Good methodologies and careful research are critical to generating credible evidence from impact evaluations. This month’s update addresses other factors that are essential complements for translating that evidence into action: we report mixed progress in Latin America on promoting the use of impact evaluations for policy, a guide on low-cost evaluations, and a video that makes a complicated evaluation technique look easy.
In preparing this update, I realized that this series is in its seventh year. If you’d like to find out how it all got started, check out the Evaluation Gap Initiative’s site. Or browse through our archive of Evaluation Gap Updates to see the items we’ve highlighted over the years for being insightful, useful, or simply amusing.
Regards,
William D. Savedoff
Senior Fellow
Center for Global Development
Institutionalizing evidence for policy: bad news, good news
First, the bad news. In 2010, we reported that Argentinean representatives led by Eduardo Amadeo had introduced legislation to institutionalize impact evaluations of public programs. This month, we learned that Argentina’s congressional leadership has officially dropped the proposed legislation, though Amadeo says he plans to continue his efforts. Good news, though, from the Pacific coast. Two Peruvian ministries have created the Comisión Quipu – an organization charged with improving the empirical basis for public policies. The commission includes representatives from the government, domestic universities and international experts. It will get technical assistance from J-PAL, Innovations for Poverty Action and Soluciones Empresariales Contra la Pobreza (SEP). According to J-PAL’s announcement, the commission is named for the quipu which the Incas used to measure and track their empire’s finances and demographics.
Image: Ministro de Economia, Peru
How much does a good evaluation cost?
The cost of rigorous evaluations depends on many things, including the question being studied, the context, and the required level of precision. So why do people complain about the cost of studies as a general principle when costs vary so much? And why discuss costs without considering the benefits of the information the studies generate? The Coalition for Evidence-Based Policy contributes some evidence about costs to this debate in “Rigorous Program Evaluations on a Budget: How Low-Cost Randomized Controlled Trials Are Possible in Many Areas of Social Policy.” Their brief guide describes five well-conducted, low-cost studies that ranged from $50,000 to $300,000. The introduction of random assignment comprised only a small portion of these costs (between $0 and $20,000), and all five studies produced practical and useful evidence for public policy.
Explaining rigorous evaluations well is part of the challenge …
… but this video from the International Growth Centre shows it can be done. In the video, Karthik Muralidharan (University of California, San Diego) and Nishith Prakash (University of Connecticut) explain how they measured the impact of a program in Bihar, India that gave bicycles to girls as a way to promote increases in high school enrolment. Though the study results are preliminary, the method seems robust. Muralidharan and Prakash control for other factors by contrasting the change in enrolment for girls over time to the change in enrolment for boys within Bihar. They then go one step further by contrasting that difference with the comparison between girls and boys in a neighboring state that did not have the bicycle program. Smart research design; excellent explanation of results.
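The two-step comparison described above is a triple-difference design. A minimal numerical sketch, using entirely hypothetical enrolment rates (not the study’s actual data), shows how the arithmetic works:

```python
# Hypothetical enrolment rates (share of the cohort enrolled), before and
# after the bicycle program. All figures are invented for illustration only.
rates = {
    # (state, group): (before, after)
    ("bihar", "girls"): (0.30, 0.45),     # program state, treated group
    ("bihar", "boys"): (0.50, 0.55),      # program state, comparison group
    ("neighbor", "girls"): (0.35, 0.42),  # no-program state
    ("neighbor", "boys"): (0.52, 0.56),
}

def change(state, group):
    """Change in enrolment over time for one state-group cell."""
    before, after = rates[(state, group)]
    return after - before

# Step 1: within each state, compare girls' change to boys' change.
# This nets out statewide trends that affect both sexes.
gap_bihar = change("bihar", "girls") - change("bihar", "boys")
gap_neighbor = change("neighbor", "girls") - change("neighbor", "boys")

# Step 2: subtract the neighboring state's girl-boy gap, which nets out
# any broader trend specific to girls that has nothing to do with bicycles.
program_effect = gap_bihar - gap_neighbor
print(round(program_effect, 2))  # -> 0.07, i.e. a 7 percentage-point effect
```

With these made-up numbers, girls in Bihar gained 15 points while boys gained 5, but girls elsewhere were already gaining 3 points more than boys, so the estimate attributable to the program is 10 − 3 = 7 percentage points.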
Image: Flickr user Avram Iancu / CC
Resources
- A call for proposals to conduct impact evaluations has been issued by the World Bank on a range of topics in health, sanitation, and education. Due June 4, 2012.
- “Evidence from the One Laptop per Child Program” in Peru shows a substantial increase in access to laptops but limited impact on enrolment and learning.
- Jim Manzi’s book Uncontrolled: The Surprising Payoff of Trial-and-Error for Business, Politics, and Society calls for governments to do more experimentation and evaluation. It inspired comments from David Brooks of the New York Times and a blog by Markus Goldstein at the World Bank.
- Online access to J-PAL's 2011 Executive Education Course is now available from MIT Open Courseware, including all lecture notes, exercises, and case studies.
- David Roodman lists new studies about the impact of microfinance on poverty that have been completed since he finished his book on the topic, Due Diligence.
- Melissa Kelly shares her insights on the challenges of implementing an impact evaluation of an early childhood education program in Mozambique involving Save the Children.
- In “How much do our impacts cost?” Alaka Holla notes that policy decisions require information on the costs as well as benefits of public programs, yet the quality of cost data is often neglected.