This post is joint with Julie Walz.

On September 26, the Office of the Inspector General for USAID issued a blistering evaluation of USAID's activities in Haiti. The report focuses on implementation of the Haiti Recovery Initiative (HRI), which supports short- and medium-term reconstruction projects. Overall, the audit states that the work is “not on track” and identifies areas for improvement, including monitoring and evaluation, community involvement, technical assistance, and the need for environmental reviews. These are some of the same themes we highlighted in our CGD Policy Paper, "Haiti: Where Has All the Money Gone?" There, we proposed three ways to improve the use of taxpayer dollars in Haiti:

1. Requiring more and better independent evaluations from NGOs and private contractors.

2. Publishing data through the International Aid Transparency Initiative.

3. Piloting competitive bidding and working with the local private sector.

Apparently, USAID's auditors agree with us, for the most part. In particular, they single out Chemonics, a privately owned, for-profit contractor that has received nearly $150 million in Haiti between January 2010 and March 2012, including a $53 million, 18-month contract for the second phase of HRI (known as HRI-II). As of May 3, 2012, USAID’s Office of Transition Initiatives (OTI) had obligated $46.5 million and disbursed $23 million of this $53 million contract. This is the second time Chemonics’ work in Haiti has been questioned; a 2010 audit focused on the firm’s cash-for-work program and found that the contractor did not hire thousands of Haitians as planned. Chemonics has also been widely criticized for wasting aid in Afghanistan. In a written response to the audit, OTI said that it agrees with all of the recommendations in the report. At the time of writing, there has been no response from Chemonics.

Lack of monitoring and evaluation

"Chemonics officials said they decided not to prepare a formal [monitoring and evaluation] plan because the activities were so different from each other that monitoring and evaluation plans needed to be tailored to the activity level. At the time of the audit, HRI-II had 141 activities in progress. Because each activity had its own individual monitoring and evaluation plan, Chemonics was tracking 141 different plans, but had no single plan for the entire program. After an activity was completed, the Chemonics staff performed a final evaluation. OTI and Chemonics officials said they would use final evaluations, comparisons of “clusters” of similar activities, and “thematic reviews to provide lessons learned and a feedback loop” to guide future activities." (4)

This is probably the best excuse we have heard for not doing an evaluation. The report goes on to say that, a year and a half after the quake, Chemonics hired one full-time M&E officer to monitor this $53 million contract. Needless to say, one person was not sufficient to generate any meaningful evaluation.

Chemonics’ implementation plans also did not have enough information for OTI to judge whether or not the program was on track:

"… the plan for providing Haiti’s Parliament with temporary offices and meeting space consisted of “The subcontractor will be responsible for the following: 1) Assembling and installation of steel- framed structures; 2) Connection of utilities.” No dates or estimated timelines were included." (5)

How can one evaluate progress when there isn’t even a timeline for completion?

Lack of clearly defined indicators

The report raises another issue that is a crucial first step for any evaluation: defining unambiguous indicators that accurately measure the goals of the program, rather than intermediate inputs such as expenditures.

“Some of the performance indicators Chemonics developed were not well-defined. One activity provided computer equipment, chalkboards, benches, chairs, desks, and school kits consisting of backpacks and school supplies to two public schools in target communities. Yet a performance indicator for that activity measured the number of students who returned to school.” (4)

An engineering study to improve roads in one town was measured by the “number of reconstructed national governing institutions and systems that receive USG assistance to incorporate principles that support democracy and government legitimacy.” (4)

There is no clear connection between point A (the actual project) and point B (the effect Chemonics measured); in fact, the second example is simply confusing. Accurate evaluations cannot be conducted with such a dramatic mismatch between program inputs and intended accomplishments.

Lack of local involvement

We have written before that many Haitians felt excluded from the relief effort and were often not consulted about reconstruction. Beltway contractors came in with their own supplies and labor, often displacing local firms with similar subcontracting capabilities. The audit finds the same pattern with Chemonics:

“Chemonics used contractors from Port-au-Prince to implement a number of activities in Cap-Haitien and Saint-Marc; these contractors brought their own people to do the jobs instead of hiring locals. As a result, residents saw jobs in their neighborhoods being done by outsiders, and without an understanding of the activities, they did not see how anyone local benefitted.” (6)

Moving Forward

USAID's auditors have provided detailed recommendations for better monitoring and evaluation; it is now up to program staff to implement them. And we still believe that publishing data through IATI and piloting competitive bidding would help track the money and progress in Haiti. To do this, USAID will likely need to invest more resources (including staff time) in ensuring that NGOs, private contractors, and other recipients of taxpayer funds actually deliver the services they have promised. The people of Haiti and the United States deserve better than what they have been getting so far.