Replication Repeatedly Repressed

October 15, 2010
As you may know, I trained as a mathematician, not an economist. Almost all of my serious work in economics has been that of an annoyingly demanding consumer trying to decide which studies to believe. That has led me to replicate influential studies of the impact of foreign aid, financial system growth, and microcredit---that is, to rerun the original methods on the original data in order to put the results under an econometric microscope. Most of the people whose work I have replicated have been gracious and helpful. Some have been resistant, even belligerent. Overall, it's not a great way to make friends.

So the abstract of this 2009 report by Bruce McCullough and Ross McKitrick on academia's lack of transparency resonated with me:
Empirical research in academic journals is often cited as the basis for public policy decisions, in part because people think that the journals have checked the accuracy of the research. Yet such work is rarely subjected to independent checks for accuracy during the peer review process, and the data and computational methods are so seldom disclosed that post-publication verification is equally rare. This study argues that researchers and journals have allowed habits of secrecy to persist that severely inhibit independent replication. Non-disclosure of essential research materials may have deleterious scientific consequences, but our concern herein is something different: the possible negative effects on public policy formation. When a piece of academic research takes on a public role, such as becoming the basis for public policy decisions, practices that obstruct independent replication, such as refusal to disclose data, or the concealment of details about computational methods, prevent the proper functioning of the scientific process and can lead to poor public decision making.
Not only do authors often keep their data and computer programs secret, but journals, whose job it is to assure quality, let them get away with it. For example, it took two relatively gargantuan efforts---Jonathan Morduch's in the late 1990s, and mine (joining Jonathan) more recently---just to check the math in the Pitt and Khandker paper claiming that microcredit reduced poverty in Bangladesh. And it's pretty clear now that the math was wrong. But the study was in the prestigious Journal of Political Economy, so it was entirely understandable for Muhammad Yunus and others to cite it.

McCullough and McKitrick describe almost a dozen instances in which resistance to replication helped flawed research sway public policy. There is a political thrust to the report, which is unfortunate since the message ought to transcend the left-right divide. It's great to scrutinize studies that seem to show the world warming, for instance, but surely climate skeptics also have skeletons in their closets that ought to be outed. This asymmetric curiosity fuses a methodological point with a political one, subtly conveying the view that only science that buttresses government activism deserves extra scrutiny.

Still, I think the report makes a strong point about the need for transparency.

Disclaimer

CGD blog posts reflect the views of the authors, drawing on prior research and experience in their areas of expertise. CGD is a nonpartisan, independent organization and does not take institutional positions.