This is a joint post with Julia Clark.
This Thursday, the World Bank will host the unveiling of the latest edition of the best-known ranking of think tanks, which is produced by the University of Pennsylvania. The public event will reveal whether the Brookings Institution has lost its hold on "Think Tank of the Year," which tanks made the top 50 worldwide, which are best in Latin America, and so on.
As with the Oscars, the verdicts of the Global Go To Think Tank (GGTTT) Index are rendered not on the basis of performance measurement but on the perceptions of those in the business, in this case hundreds of journalists, policymakers, and think tank employees. That approach may be one reason expert perceptions of the GGTTT index itself have tended to be highly critical (here, here, here, here, here). Among the concerns: the opacity of the ranking process, the inclusion of institutions that are not think tanks in any usual sense of that term, and fundamental doubts about what it means for a member of such a diverse class to be "the best." Yet there's no doubt the results turn heads each year. If the criticisms from think tank experts are right, then the GGTTT may be distorting the behavior of think tanks, as they strive to raise their standing on dubious metrics, as well as misleading think tank funders.
The combination of continuing criticism and continuing interest made us wonder: could we do better, or at least illustrate the possibility of doing better? Last November we posted indicators of think tank "profile," looking at how often a tank's work is reported, cited, downloaded, or followed. Here, after tweaking and updating the indicators, we blend them into a single index for ranking, drawing on methods we have honed over ten years of producing the Commitment to Development Index.
Let us be clear: CGD does not intend to enter the tank-ranking business long-term. As a think tank, we have a dog in this fight and lack the necessary objectivity. Indeed, we acknowledge that our focus on public profile may bias the results in CGD's favor, since public outreach is central to our strategy. Moreover, as we wrote last time, web page hits, media mentions, and scholarly citations are just a subset of the characteristics that can make a think tank effective. Thus the "profile" in our title. Some tanks succeed precisely by flying below the radar. Our purpose is to stimulate and improve the discourse around think tank performance.
These caveats notwithstanding, like many in the think tank community, we are data geeks, committed to the use of objective evidence in our research. We believe that any effort to rank the tanks should begin by gathering the best available data. In the spirit of offering some much-needed competition to the GGTTT, let’s take a closer look at the data.
We start with an update of the last post's first table, covering noted US institutions. Major changes in our data collection include a switch from Google News to Nexis for media mention counts, the latter having proved more stable from week to week; the use of academic citations of papers published in 2010 instead of 2012, since 2012 is too recent for many citations to have accrued; and the exclusion of Human Rights Watch and the NBER, which are not think tanks despite being listed as such in the GGTTT:
Indicators of aggregate profile for major US think tanks
Next we turn those numbers into scores by multiplying each performance indicator column by a scaling factor chosen so that an average performer scores exactly 5 (and a twice-average one gets a 10). This puts most scores on an intuitive 0--10 scale. Finally, we average the five categories to get the overall scores in the last column of the table below.
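For readers who want to see the mechanics, the scoring step above can be sketched in a few lines of Python. The tank names and indicator values here are made up for illustration; they are not our actual data or our full set of five indicators.

```python
# Illustrative data: three hypothetical tanks, three hypothetical indicators.
raw = {
    "Tank A": {"media": 120, "citations": 300, "web": 50},
    "Tank B": {"media": 80,  "citations": 100, "web": 90},
    "Tank C": {"media": 40,  "citations": 200, "web": 10},
}
indicators = ["media", "citations", "web"]

# Scale each indicator column so that an average performer scores exactly 5
# (and a twice-average performer scores 10).
scores = {}
for ind in indicators:
    mean = sum(raw[t][ind] for t in raw) / len(raw)
    for t in raw:
        scores.setdefault(t, {})[ind] = 5 * raw[t][ind] / mean

# Overall score: the simple (equal-weight) average across indicator scores.
overall = {t: sum(s.values()) / len(s) for t, s in scores.items()}
```

By construction, each indicator column averages 5 after scaling, so the overall scores do too, which is what makes 5 a natural benchmark for "average performer."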
The conservative stalwarts Cato and Heritage perform best in social media and web presence, while Brookings leads in media mentions and scholarly citations---but as we'll explain in a moment, we don't view these rankings as our most useful results, preferring to order another way:
Scores of aggregate profile for major US think tanks (5 = average)
(In the spreadsheet, you can change the weights on the five indicators.)
The next two tables are like the first two except that they divide all performance indicators by annual spending: not total media mentions, for example, but media mentions per dollar of budget. These results seem more practically relevant because a donor of $100,000 is probably less interested in an institution's aggregate profile than in its ability to build a following per dollar spent. By that measure, the Cato Institute, the Pew Research Center, and the Peterson Institute have been most effective:
Indicators of profile per dollar spent for major US think tanks
Scores of profile per dollar spent for major US think tanks (5 = average)
Ranking by profile per dollar spent of major US think tanks
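The per-dollar adjustment behind these tables is simple division, but it is worth seeing how much it can reorder institutions of different sizes. The budgets and mention counts below are invented for illustration only.

```python
# Hypothetical annual budgets (dollars) and raw media mention counts.
budgets = {"Tank A": 20e6, "Tank B": 5e6}
media_mentions = {"Tank A": 1000, "Tank B": 600}

# Normalize by size: mentions per million dollars of budget.
per_dollar = {t: media_mentions[t] / (budgets[t] / 1e6) for t in budgets}
# Tank A: 50 mentions per $1 million; Tank B: 120.
```

In aggregate terms the larger tank dominates (1,000 mentions to 600), but once spending is controlled for, the smaller tank looks more than twice as effective, which is why the per-dollar rankings can diverge sharply from the aggregate ones.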
These standings differ from those of the Global Go To Think Tank index. Cato is third in aggregate profile (second table above) and first in profile per dollar (fourth table and graph above) but only 14th on the 2011 GGTTT. Last year's "Think Tank of the Year," Brookings, scores just 5.5 (5 being average), placing it 7th out of 18.
One could defend the GGTTT against this apparent contradiction with real data by noting that its index covers more than public profile: it encompasses any performance attribute the raters have in mind, such as the timeliness or creativity of policy proposals. To allow a more like-for-like comparison, we therefore plot our results against the corresponding specialty rankings in the GGTTT report.
This next chart compares the average of our first three indicators (per-dollar social media fans, web traffic, and incoming links) with an institution's rank on the GGTTT list of "Think Tanks with the Best Use of the Internet or Social Media to Engage the Public." (It drops tanks that don't make the GGTTT list.) If the two approaches agreed, the dots would cluster along a diagonal, with low scores from us paired with high GGTTT rank numbers at one end and high scores paired with low rank numbers (such as #1) at the other. Some do that, but others are far from the diagonal line of good agreement:
Internet/Social media: GGTTT rank vs. CGD index
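The kind of agreement check the chart makes visually can also be sketched numerically: convert our scores into ranks and compare them to the GGTTT's. The three tanks and their scores and ranks below are hypothetical, not taken from either dataset.

```python
# Hypothetical per-dollar scores from our index and ranks from the GGTTT
# specialty list (1 = best).
cgd_score = {"Tank A": 8.0, "Tank B": 5.0, "Tank C": 2.0}
ggttt_rank = {"Tank A": 3, "Tank B": 1, "Tank C": 10}

# Rank tanks by our score (1 = highest score) ...
by_score = sorted(cgd_score, key=cgd_score.get, reverse=True)
cgd_rank = {t: i + 1 for i, t in enumerate(by_score)}

# ... and measure how far each tank's two ranks diverge. Perfect agreement
# would make every difference zero.
disagreement = {t: abs(cgd_rank[t] - ggttt_rank[t]) for t in cgd_score}
```

Large differences correspond to dots far from the diagonal in the chart; a formal version of this comparison would be a rank correlation such as Spearman's.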
This is a similar plot for print and electronic media; it compares our media-mention scores per dollar to the GGTTT "Think Tanks with the Best Use of the Media (Print or Electronic) to Communicate Programs and Research." Here the correlation is even weaker:
Print/electronic media: GGTTT rank vs. CGD index
Our small exercise falls short of an objective ranking of think tanks. Still, we can't help but believe that expert and popular understanding of think tanks would improve substantially if some of the energy currently put into cajoling hundreds of experts to rate thousands of institutions each year were redirected into collecting and analyzing objective, empirical measurements of think tank performance.
A spreadsheet with all results shown above, plus the same for the GGTTT's "international development" think tanks, is here. Perhaps some think tank funders will want to pursue this work. If so, your comments below could contribute to their own thinking about think tanks.