
Global Health Policy Blog

At the end of 2018, the world had seven development impact bonds (DIBs) and more in the pipeline. Yet questions remain about the potential of DIBs—still a new financing instrument—compared to other pay-for-performance arrangements. With a dearth of standard and transparent evaluation methods, it is difficult to reliably assess whether DIBs are good value for money, and if they are, how best to structure them to maximize their efficiency.

In a recent paper, we collated publicly available information on the seven DIBs active worldwide at the time of publication and focused on the implementation processes of the three with targeted health outcomes (see table below). While these three DIBs aim to impact the health of at least 31,600 people with a total of $25.2 million in upfront funding and another $38.1 million in outcome funding, publicly available information on their estimated impact and value for money is lacking, as we discuss in our paper.

Health DIBs: Some key characteristics

Utkrisht Impact Bond
  Launch date: May 2018; years of funding: 3
  Intervention: Support improved quality of maternal and newborn care.
  Outcomes: Health facilities reach accredited quality standards.
  Upfront funding: US$3.5M; outcome funding (max): US$8M

Humanitarian Impact Bond
  Launch date: Sept 2017; years of funding: 5
  Intervention: Provide physical rehabilitation services.
  Outcomes: Construct new facilities in Mali, Nigeria, and DRC; improve the ratio of staff to persons regaining mobility.
  Upfront funding: US$19.7M; outcome funding (max): US$27.6M

Cameroon Cataract Bond
  Launch date: Jan 2018; years of funding: 5
  Intervention: Support improved access to eye surgery at a new hospital.
  Outcomes: Provide high-quality and sustainable eye surgeries, particularly for the poorest patients.
  Upfront funding: US$2M; outcome funding (max): US$2.5M

Following the publication of our paper, we received feedback from colleagues at the UBS Optimus Foundation (UBS-OF) highlighting discrepancies in the Utkrisht case study included in our report. The information we originally used for our paper came from the Convergence, Palladium, and Bertha Centre case study, in line with our methodology of using publicly available secondary sources of information. That case study drew on interviews with DIB stakeholders, planning documents, and earlier case studies—parts of which are now out of date because of a later-than-expected launch of the DIB. We have chosen to update our paper based on information that came through our communications with UBS-OF—sourced from final contracts between stakeholders that would otherwise only be obtainable through a freedom of information request—because we believe it is important to share this pertinent information.

In summary, UBS-OF is an independent, nonprofit grant-making foundation—founded by but separate from UBS Group AG—that committed upfront funding of up to $3.5 million for the Utkrisht Impact Bond, with an internal rate of return (IRR) capped at 8 percent. Implementation costs, separate from the upfront funding, were budgeted at $6.2 million. Any surplus return above the 8 percent IRR paid to UBS-OF will be distributed among the DIB’s two implementation partners (HLFPPT and PSI) and its management partner, Palladium. That surplus is capped at 15 percent of the $6.2 million in implementation costs; it serves as an incentive payment in exchange for the implementation partners receiving only 80 percent of the budgeted program costs during the program, with the remaining 20 percent withheld until the end. All funds that UBS-OF receives back will be recycled to support children’s health and education, though the foundation does not operate an “evergreen fund.”
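To make the payment mechanics above concrete, the following sketch computes the capped investor payout and the partners’ surplus share using the figures reported for the Utkrisht DIB. The annual-compounding treatment of the 8 percent IRR, the function names, and the payment logic are illustrative assumptions on our part; the actual contracts may define the return and the surplus split differently.

```python
# Illustrative sketch of the Utkrisht DIB payment mechanics.
# Dollar figures come from the text; the compounding convention is an assumption.

UPFRONT_FUNDING = 3_500_000       # UBS-OF upfront commitment (USD)
IRR_CAP = 0.08                    # internal rate of return cap for UBS-OF
IMPLEMENTATION_BUDGET = 6_200_000 # budgeted implementation costs (USD)
SURPLUS_CAP_RATE = 0.15           # surplus capped at 15% of implementation costs


def investor_payout(outcome_payment: float, years: float) -> float:
    """UBS-OF's payout: the outcome payment, capped at an 8% IRR
    (assumed annual compounding) on the upfront funding."""
    capped_return = UPFRONT_FUNDING * (1 + IRR_CAP) ** years
    return min(outcome_payment, capped_return)


def partner_surplus(outcome_payment: float, years: float) -> float:
    """Surplus above the capped investor return, itself capped at
    15% of the implementation budget, to be shared among HLFPPT,
    PSI, and Palladium."""
    capped_return = UPFRONT_FUNDING * (1 + IRR_CAP) ** years
    surplus = max(0.0, outcome_payment - capped_return)
    return min(surplus, SURPLUS_CAP_RATE * IMPLEMENTATION_BUDGET)
```

For example, if the full $8 million maximum outcome funding were paid after three years, the investor payout under these assumptions would be about $4.41 million ($3.5 million compounded at 8 percent), and the partners’ surplus would hit its $930,000 cap (15 percent of $6.2 million).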

Building a common understanding of DIBs

Uncovering these issues and discrepancies again highlighted the lack of comprehensive, available information about DIBs—and drove home our paper’s recommendations. DIBs, like other results-based financing mechanisms, aim to align development funding with improved outcomes and to increase the accountability of development spending. While a given DIB links outcomes directly with funding for a single project, much more could be done to develop a common understanding of how effective DIBs are, or can be, at achieving these aims more broadly.

A simple starting point would be standardization of terminology and processes—an issue that arose in some of the discrepancies we discussed, but one we did not get to in our policy paper. Progress is being made thanks to the Impact Bonds Working Group, which is developing a series of recommendations for the design of future DIBs. One of the working group’s subgroups is focused on an Impact Bond Toolkit that would “mainstream terminology, standardize business processes, provide templates for contracts, design memos, investor reports, due diligence, and best practice principles for Impact Bond procurement and calculation of risk/return”—a commendable first step toward understanding and comparing DIBs. The GO Lab also recently launched a consortium to develop a global knowledge platform for impact bonds.

All of this underscores the relevance of the recommendations in our paper, notably:

Create and use consistent reporting and evaluation guidelines and evaluation methods. Three questions in particular should be addressed in the design, delivery, and post hoc assessment of a DIB:

  1. Does the intervention made possible through, or incentivised by, the DIB work, and is it good value for money in the given setting? So, what does the evidence base tell us about the comparative cost-effectiveness of the intervention the DIB is designed to deliver?

  2. Was the DIB successful? In other words, did the DIB do what it set out to do—pay out when results were delivered, and not when they were not? So, were the outcome targets selected so that they measure what they ought to, did the measurement happen, and was appropriate action taken?

  3. Is the DIB the best policy option given the circumstances, rather than a concessional loan, a grant, or no intervention at all? So, is the DIB good value for money compared to other results-based financing structures?

There is currently no collectively agreed approach to answering these questions, each of which would require a different evaluation design. While DIBs by design require independent ex-post evaluation to verify whether the target outcomes were achieved (question 2), they do not require commissioning independent research or other ex-post means of impact evaluation, nor ex-ante assessments of incentivised activities (question 1). DIBs could benefit from standardized manuals and checklists that build on existing guidance for evaluating feasibility and value for money at these different stages—such as the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) Checklist and the International Decision Support Initiative (iDSI) Reference Case for Economic Evaluation, which could be useful for questions 1 and 3—while accounting additionally for DIB-specific elements such as outcome-based payments, transaction costs, and investor returns.

Linked to the above, allocate funding to evaluate impact and value for money, building an evidence base for what works and what doesn’t (see CGD papers from 2013 and 2018). Anecdotal evidence suggests a general aversion to spending even more money on a DIB for evaluations, which are often perceived as too expensive. However, evaluations of the design and implementation of DIBs can be kept sensible in cost and scope and can take constrained resources into account. Our paper recommends including project evaluations in DIB contracts or commissioning independent reviews of DIBs as two ways to explicitly direct funding to impact and value-for-money evaluation.

Publish planning and evaluation documents. It is difficult to take stock and draw conclusions about DIBs because of poor publication rates of relevant information. There is thus a need for a commitment from DIB investors, service providers/implementing partners, and outcome funders to share more comprehensive information that explains why and how a DIB is used. While there are several project databases for impact bonds, data are often incomplete, have uncertain update schedules, and are not straightforward to export for analysis. If more comprehensive information were to be made publicly available—akin to the social impact bonds and pay for success databases in the UK and US, respectively—it could immensely assist stakeholders in the design and implementation processes for future DIBs.

Without collective understanding of the potential advantages and pitfalls of DIBs, new DIBs are at risk of not being optimally designed, and therefore also not being good value for money.

For further details on our work and recommendations, see here for the full policy paper.

Disclaimer

CGD blog posts reflect the views of the authors drawing on prior research and experience in their areas of expertise. CGD does not take institutional positions.