MCC Turns Research into Learning

In its 15 years of operation, the Millennium Challenge Corporation (MCC) has, in many ways, set a new standard for evaluation and learning. Its commitment to conducting rigorous, independent evaluations for the vast majority of its funded activities is central to its results-focused model.

But conducting an evaluation is only the first step. Evaluations only have value if they’re used. While money spent on a development program may or may not be well spent—and evaluation can help tell you if it is—money spent on an evaluation that’s not used to inform future programming (or evaluation practice) is essentially wasted.

MCC is taking steps to make sure its significant investment in evaluation is money well spent. As MCC’s evaluation portfolio has grown to include nearly 200 studies, the agency has turned its attention to ensuring that its evaluation findings are understood and applied, both within MCC (program staff, leadership) and among relevant external stakeholders (congressional staff, host country governments and other local actors, the US-based development policy community). MCC recognized that for learning to take place, it had to go beyond its long-term commitment to simply publishing evaluations online. The studies—including their summaries—could be hard to find, and their length and technical language made them inaccessible to nontechnical and/or time-constrained audiences (i.e., almost all program staff and policymakers).

This week, MCC launched the first of a new series of Evaluation Briefs, a product line that will go a long way toward making evaluation findings more accessible. Each brief is just four pages long, with key findings up front. And they’re not the standard puff piece/success story fare we’re all used to seeing from US aid agencies. In fact, the production process both starts and ends with the independent evaluator to ensure fidelity to the evaluation’s findings. The result is that the briefs are refreshingly frank not only about the results that were achieved but also about where programs fell short. This is the stuff that’s important for learning.

Of course, it’s always challenging to create a single product that will speak to the diverse needs of all the various nontechnical audiences MCC might want to reach. But the way the new briefs capture and convey key messages that might otherwise be lost becomes an important entry point for further, more tailored conversations. One could also envision value in going beyond summaries of individual studies to create sector-wide syntheses of multiple evaluation findings. MCC has done this in a limited way with some of its Principles Into Practice papers. Hopefully the production of Evaluation Briefs will open the door for more of these.

This effort is just getting started. So far MCC has released 11 briefs—looking at electricity in Malawi, girls’ education in Burkina Faso, water supply in Tanzania, and policy reform in Honduras, among others. But it has committed to creating them for its 150+ remaining evaluations—and future evaluations, too. MCC deserves a lot of credit for this new effort. Once again, it demonstrates that a commitment to results and learning is at the core of its mandate and model.


CGD blog posts reflect the views of the authors, drawing on prior research and experience in their areas of expertise. CGD is a nonpartisan, independent organization and does not take institutional positions.
