
Publications


August 7, 2013

Context Matters for Size: Why External Validity Claims and Development Practice Don't Mix - Working Paper 336

In this paper we examine how policymakers and practitioners should interpret the impact evaluation literature when presented with conflicting experimental and non-experimental estimates of the same intervention across varying contexts. We show three things. First, as is well known, non-experimental estimates of a treatment effect comprise a causal treatment effect and a bias term due to endogenous selection into treatment. When non-experimental estimates vary across contexts, any claim for external validity of an experimental result must assume that (a) treatment effects are constant across contexts, while (b) selection processes vary across contexts. This assumption is rarely stated or defended in systematic reviews of evidence. Second, as an illustration of these issues, we examine two thoroughly researched literatures in the economics of education—class size effects and gains from private schooling—which provide experimental and non-experimental estimates of causal effects from the same context and across multiple contexts.

April 18, 2011

Toward Results-Based Social Policy Design and Implementation - Working Paper 249

This paper analyzes why, in the realm of social policy, too little evidence is perceived to be produced and used on the impact of specific policies and programs on human development. The author proposes developing Results-Based Social Policy Design and Implementation systems that focus public attention on better outcomes.

Miguel Székely
August 21, 2008

A Little Less Talk: Six Steps to Get Some Action from the Accra Agenda

In September 2008, official aid donors and recipients will meet in Accra, Ghana, to discuss how to make development assistance more effective. CGD president Nancy Birdsall and co-author Kate Vyborny suggest that advocates of better aid who want a win at Accra should stop haggling over broad conceptual issues and focus instead on securing a public commitment from donors to one or more concrete steps that improve aid effectiveness and hold donors accountable.

Kate Vyborny
May 31, 2006

Learning from Development: The Case for an International Council to Catalyze Independent Impact Evaluations of Social Sector Interventions

This brief outlines the problems that inhibit learning in social development programs, describes the characteristics of a collective international solution, and shows how the international community can accelerate progress by learning what works in social policy. It draws heavily on the work of CGD's Evaluation Gap Working Group and a year-long process of consultation with policymakers, social program managers, and evaluation experts around the world.

May 31, 2006

When Will We Ever Learn? Improving Lives Through Impact Evaluation

Each year billions of dollars are spent on thousands of programs to improve health, education, and other social sector outcomes in the developing world. But very few programs benefit from studies that could determine whether or not they actually made a difference. This absence of evidence is an urgent problem: it not only wastes money but also denies poor people crucial support to improve their lives.

The Evaluation Gap Working Group