Evaluation Gap Update
Greetings from the Center for Global Development! This is the first issue of the Evaluation Gap Update. The Global Health Policy Research Network's initiative on Closing the Evaluation Gap addresses a key problem in global development: we have too few solid answers about "what works" in social development programs. When Will We Ever Learn? Recommendations to Improve Social Development through Enhanced Impact Evaluation proposes solving the problem through the creation of an "Impact Evaluation Club." I'll be sending you the Evaluation Gap Update periodically for the duration of the initiative, with news on our progress and on impact evaluation more broadly.
In this issue:
- Consultation Ends January 31
- World Bank, USAID Studies Urge More and Better Evaluation
- Campbell Colloquium February Event on Producing Systematic Reviews of Evidence
- Additional Resources on the Evaluation Gap
- New Frequently Asked Questions about Practical Application of CGD Proposal
Have you sent us your input yet on the Evaluation Gap proposal? The consultation phase ends on January 31, 2006. Before then, we hope you will take 10 minutes to complete our survey or comment on our report. Your ideas and opinions are important to us: we will be finalizing the report in February based on the feedback we receive during the consultation phase.
World Bank, USAID Studies Urge More and Better Evaluation
In a new World Bank report, Improving the World Bank's Development Effectiveness: What Does Evaluation Show?, the Operations Evaluation Department uses recent evaluations to assess the development effectiveness of the World Bank and how it could be improved. The report recognizes the need to "throw light on the impact of projects" and indicates that "M&E is so weak that little is known of outcomes."
A USAID study, Evaluation of Recent USAID Evaluation Experience (pdf), demonstrates a similar gap in evaluation. While recognizing that "USAID's success in making a difference in people's lives…requires learning about what works and what doesn't and applying those lessons," the report identifies constraints:
- "staff do not see any Agency-wide incentive to advance learning through evaluations,"
- "the monitoring/evaluation balance has swung too much toward performance monitoring,"
- "partner organizations are learning from experience; USAID is not," and
- "USAID evaluations employed very weak evaluation methods."
Campbell Colloquium February Event on Producing Systematic Reviews of Evidence
Judith M. Gueron, President Emerita of MDRC and member of the Evaluation Gap Working Group, will give the keynote address "Building Evidence: What It Takes and What It Yields" at the 6th International Campbell Collaboration Colloquium in February 2006 in Los Angeles, California. Other working group members, William D. Savedoff and David I. Levine, will be joined by Robert Boruch of the Wharton School in a roundtable discussion reviewing the current state of impact evaluation work in developing countries, including obstacles to and proposals for improvement.
Additional Resources on the Evaluation Gap
- Learning What Works—and What Doesn't: Building Learning into the Global Aid Industry (pdf): Evaluation Gap Working Group member David I. Levine explains how learning about effective development projects is inhibited by the structure of the aid process; incentives faced by policymakers; and limited analytical capacities in relevant agencies.
- Throwing Good Money After Bad: A common error misleads foundations and policymakers (pdf): Judith M. Gueron briefly explains a common policymaker trap, being misled by outcome measures, and how to avoid it with better impact measures.
- Interpreting the evidence: choosing between randomised and non-randomised studies, an article in the BMJ, summarizes the pros and cons of methods for evaluating the impact of health programs.
New Frequently Asked Questions about Practical Application of CGD Proposal
In response to common questions about the practical application of the Impact Evaluation Club, we have posted answers to the following new FAQs:
17. Whom will the evaluation club serve?
18. Who will engage in the practical application of evaluations?
19. What role do we envision for NGOs?
20. Who will be doing the evaluations?
If you have news or highlights that you would like us to share in future mailings, please send them to Jessica Gottlieb (email@example.com). If you do not want to continue receiving updates about the Evaluation Gap initiative, please use the unsubscribe function at the bottom of this letter.
Thanks and regards,