I’ve been hesitant to write about this, since every time I try I just end up with phrases like “AN EXCEL ERROR ROFL!!!!” on the page, and I don’t feel like that entirely embodies what I would like to convey regarding the Reinhart-Rogoff situation. So I gave it the old college try on About.com instead. Some highlights:
- Harvard economists Carmen Reinhart and Ken Rogoff wrote a short piece in the (non-peer-reviewed) American Economic Review Papers and Proceedings entitled “Growth in a Time of Debt.”
- Said piece asserted that debt-to-GDP ratios of over 90 percent are associated with negative average economic growth, whereas lower levels are all associated with positive average growth. (Interestingly, however, the 90 percent debt ratio is correlated with positive median economic growth, which suggests the negative result is being driven by outliers to some degree.) In the piece, the authors were careful to give the “correlation does not imply causation” disclaimer: in this case, that it’s possible that slow economic growth causes high debt ratios rather than the other way around. In other words, their data analysis looks like this:
(Sidenote: I know hindsight is 20/20 and all, but wouldn’t you at least be a little suspicious about the presence of a calculation error when you get a result that is that different from the others?)
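The mean-versus-median point above is easy to see with a toy example (these numbers are invented for illustration, not Reinhart and Rogoff’s actual data): a single crisis-year outlier can drag the average below zero while the median stays comfortably positive.

```python
# Hypothetical growth rates (percent) for a high-debt bucket.
# These numbers are made up; one severe outlier does the damage.
growth = [2.1, 1.8, 2.5, 1.4, -14.0]

mean = sum(growth) / len(growth)
median = sorted(growth)[len(growth) // 2]  # odd-length list: middle value

print(f"mean:   {mean:.2f}")    # -1.24, dragged negative by the outlier
print(f"median: {median:.2f}")  # 1.80, still positive
```

The same pattern in the actual data would be exactly the kind of mean/median divergence that hints the headline negative-growth number rests on a handful of extreme observations.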
- Advocates of economic austerity apparently understand neither the correlation versus causation issue nor the concept of peer review and basically nutted themselves over this finding.
- Said austerity supporters started inviting Carmen and Ken (though mostly Ken, as I understand it) to fancy policy discussions. In addition, Carmen and Ken wrote a few op-eds where they were much more prescriptive and less careful about highlighting the lack of established causality.
- UMass Amherst graduate student Thomas Herndon became suspicious when he couldn’t replicate Reinhart and Rogoff’s result using data they provided to complement their 2009 book. (Fun fact: I suggested that my senior seminar students use this data to write their research papers and they all ignored me.) Herndon then got Reinhart and Rogoff to share their calculations and found that their most noteworthy finding was the result of a sloppy Excel formula. (Sidenote: ROFL!!!! Not even Matlab? Stata? It’s not the most relevant issue, but the JV nature of the analysis is hilarious to me.)
- Herndon and two professors published a rebuttal to the Reinhart-Rogoff paper in which they pointed out the Excel error and also challenged some of the other methodology in the original study. (Sidenote: I find it funny that a 6-page non-peer-reviewed paper got a 26-page presumably non-peer-reviewed rebuttal.)
Hold up…I’ll pause here because Stephen Colbert tells it so much better:
Colbert even invited Herndon on the show, which not only made me incredibly jealous but also caused me to send a “neener-neener” email to my students for not taking me up on my paper advice:
Okay, so let’s continue…
- Reinhart and Rogoff issued what I will loosely call an apology via NYT op-ed. The general gist of the piece is “yeah, we made a calculation error, but it’s not that important for the end result. Besides, we warned you against interpreting the findings as causal anyway.”
Brad Plumer at the Washington Post was kind enough to put the disagreement into handy graph form:
To be fair, the difference between the numbers is the result of all of Herndon et al.’s criticisms, not just the spreadsheet error. That said, the spreadsheet error is the difference between negative and positive growth for the 90 percent and over group.
So what do we take from this? I guess it’s a matter of opinion whether the amended result (together with the causality issues) is an argument for austerity. Therefore, I feel that the takeaways from this debacle are a bit more general:
- Basing major policy decisions on short, non-peer-reviewed studies might not be the best idea anyone’s ever had. In related news, it’s important to be aware that confirmation bias, the tendency to pay attention only to evidence that supports one’s existing hypothesis, is a powerful force.
- It’s just as important to be a good consumer of research as it is to be a good researcher. Betsey Stevenson and Justin Wolfers have some good advice on the subject. I would add to this that, if you are in a position to actually use a set of findings, it’s worth doing a little digging to see how well-vetted the findings are.
- As a corollary to the above, it would probably be helpful for academic publications to make it more obvious when work is and is not peer reviewed. I’m not sure exactly why I know that the American Economic Review’s “Papers and Proceedings” are not peer reviewed, so I wouldn’t expect that to be common knowledge.
- ALWAYS, ALWAYS check your calculations. Better yet, use a real statistical/math program like Stata or Matlab where you can execute calculations via scripts so that your work is far more auditable.
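To make that last point concrete, here’s a minimal sketch of the kind of scripted, auditable calculation the bullet recommends (the data rows and bucket cutoffs are invented for illustration). Because every observation flows through the same code path, the Excel-style mistake of a formula that silently skips rows can’t happen without being visible in the script and its sanity checks.

```python
from statistics import mean, median

# (debt-to-GDP ratio in percent, real GDP growth in percent) -- made-up rows
observations = [
    (30, 3.5), (45, 2.8), (60, 3.1), (75, 2.2),
    (85, 2.6), (95, 1.9), (110, -0.4), (120, 2.3),
]

def bucket(debt_ratio):
    """Assign an observation to a debt-ratio category."""
    if debt_ratio < 30:
        return "0-30"
    if debt_ratio < 60:
        return "30-60"
    if debt_ratio < 90:
        return "60-90"
    return "90+"

# Group growth rates by debt bucket.
groups = {}
for debt, growth in observations:
    groups.setdefault(bucket(debt), []).append(growth)

# Report both mean and median, plus the count actually used.
for name in sorted(groups):
    rates = groups[name]
    print(f"{name}: n={len(rates)} mean={mean(rates):.2f} median={median(rates):.2f}")

# Sanity check: every observation was counted exactly once.
assert sum(len(v) for v in groups.values()) == len(observations)
```

The point isn’t the language (Stata do-files or Matlab scripts serve the same purpose); it’s that the entire calculation, including which rows were included, is written down where a replicator like Herndon can read and rerun it.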
On the upside, no one really thinks that Reinhart and Rogoff had any sort of nefarious intent, so at least they aren’t having as bad a time as this guy.
Update: Reinhart and Rogoff have put out an official errata report for their paper. Fun fact: it’s almost twice as long as the original paper.