March 08, 2010
Revenge of the Nerds!
Much to my amusement, this post on the utility of quantitative analysis caused quite a stir in the international relations blogosphere. I don't know if folks in security studies just don't have a sense of humor or if it's true what Kissinger said about how university politics are vicious precisely because the stakes are so small. But what I think happened is that Stephen Walt read my post, chuckled, and his chuckling did two things: 1) it brought a lot of people to this site who were not aware that the posts on this blog are meant to be light and irreverent, and 2) it opened up an old fault line in security studies between traditionalists like Walt who aren't so impressed by quantitative analysis and the Young Turks and political economists who have pushed to make it ascendant in political science departments across the United States. I have about as much interest in getting involved in these scholarly disputes as I do in catching the Ebola virus. But I did find some of the reaction pretty amusing. Like the fact that Hein Goemans, a brilliant scholar at the University of Rochester, was writing comments on my blog at 5:17 on a Friday afternoon. (Hein, buddy, it's happy hour. Put down the TI-89, get off the internets and go drink a beer.) Or the fact that Cranky Dan Drezner was left in a cursing, sputtering rage over at his Foreign Policy blog. (I was particularly hurt that Drezner didn't see the humor in my post, as I have always found his willingness to hold forth on the peoples and politics of the Arabic-speaking world and Iran without any time spent in the region or training in its languages to be hilarious.)
In the end, though, I commissioned one of this blog's regular readers, "Scott Wedman", to write a response to what I had written. What follows is good stuff. I am sorry that folks got their proverbial panties in a twist about a post that was meant to be funny, so hopefully this will make up for things. (Though curses to you all for making me publish something serious-minded.)
Plenty of people have already weighed in on AM’s “Quantitative Manifesto”, including Drezner, Walt, and Farrell, as well as others (Drew Conway, Justin Logan, and Kindred Winecoff among them).
Since others have covered many of the specifics in depth, I’ll limit myself to four broad points that I think anyone even vaguely interested in these issues should consider (and feel free to disagree with). Just so you know where I’m coming from, I’m an assistant professor of political science at a research university who primarily publishes on international conflict and security issues. I use both qualitative and quantitative research methods. I have also done some work that is better described as policy relevant, or even policy analysis.
First, good research is good research, regardless of method. Criticizing one method or another out of hand is short-sighted; what matters is encouraging good research overall. That sounds trite, but it’s true. Good work asks an interesting question, uses new evidence or methods to answer it, and is appropriately modest in its conclusions. Good work can be qualitative, quantitative, or game theoretic. Frankly, lots of research isn’t good work, but there’s quite a bit of good stuff out there. And much good work follows a lot of AM’s manifesto, though not all of it. What’s important is that people from across the methodological spectrum be open to sources of evidence and argumentation that fall outside what they may use in their own research, but that may shed light on a topic of interest. Of course, that’s easier said than done.
Second, there is a difference between empirical social science research and policy analysis. Social science research, which lays out theories and hypotheses and then (mostly) uses evidence to test them, is potentially a useful input for those interested in specific policy recommendations. Good social science research suggests what is most likely to happen in a given situation, based on what has happened before in similar situations. But that’s not a substitute for specific, in-depth information on the question of the day, whether it’s the consequences of implementing new sanctions on Iran, whether Obama’s surge in Afghanistan is likely to succeed, or something else. Social science research is one tool in the policy maker’s toolkit. And perhaps, as Drezner and others have argued, it should be used more often. But it’s not, and it shouldn’t be, the only tool.
Third, there are important benefits to using quantitative methods in international relations/security studies. The simplest is just that there are often competing theories or arguments drawn from qualitative studies on topics like the effectiveness of economic sanctions or the link between different types of political regimes and success on the battlefield. Quantitative analysis helps scholars systematically evaluate those competing claims by seeing how they fare when tested on dozens or hundreds or thousands of cases instead of just a handful or fewer (quantitative scholars will argue among themselves as well, but you get the drift). Of course that doesn’t mean a political science professor knows more about how to take a hill or how to secure a village than someone in, you know, the military, but it does mean those scholars are (hopefully) producing valuable knowledge that is based on more than their (or any one person’s) personal experience.
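(For readers who want a concrete, if stylized, picture of what that kind of systematic test looks like, the short sketch below fits a simple logistic regression across many sanctions episodes. The dataset, file name, and variable names are all hypothetical and purely illustrative; this is a sketch of the general approach, not anyone's actual analysis.)

    # Minimal sketch (hypothetical data and variable names throughout):
    # see how competing claims about sanctions effectiveness fare when
    # tested across many episodes at once, rather than a handful.
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("sanctions_episodes.csv")           # one row per hypothetical episode
    X = sm.add_constant(df[["trade_dependence",          # claim 1: economic leverage matters
                            "target_democracy",          # claim 2: regime type matters
                            "multilateral"]])            # claim 3: broad coalitions matter
    y = df["sanctions_success"]                          # binary outcome for each episode

    result = sm.Logit(y, X).fit()
    print(result.summary())   # coefficients show how each claim holds up across all cases

The point is not the particular model; it is that each claim gets evaluated against every case in the data, not just the handful any one researcher happens to know best.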
Fourth, international relations and security studies were traditionally very hostile to quantitative methods and formal models (the old security studies = realist = qualitative idea used to rule the day), but most of the best scholars these days use multiple methods, usually meaning qualitative analysis and either quantitative analysis or formal models. Sometimes they use all three. However, the move to multiple methods is generally not a crass ploy to get published or get tenure (and I don’t think AM meant to imply that it was). It’s a genuine recognition on the part of many scholars, and especially younger scholars, that the more tools you have in the box, or clubs in the bag, or whatever the analogy, the more evidence you can bring to bear to answer a question. And there’s no reason to exclude a type of evidence when it can help give you a new perspective on a question. There are also some questions better answered through qualitative analysis, some through quantitative analysis, and some through formal models. So the more methods you know and can use, the more interesting and varied questions you can answer. That’s just smart.