Yeah, I don’t really know either. I always hear the expression ‘SSCI’ thrown around as the gold standard for social science work. Administrators seem to love it, but where it comes from and how it gets compiled I don’t really understand. Given that we all seem to use this language and worry about impact factor all the time, I thought I would simply post the list of journals for IR ranked by impact factor (after the break).
I don’t think I ever actually saw this list before all laid out completely. In grad school, I just had a vague idea that I was supposed to send my stuff to the same journals whose articles I was reading in class. But given that I haven’t found this list posted on the internet anywhere, here it is. I don’t know if that means it is gated or something, or if my school has a subscription, or whatever. Anyway, I thought posting the whole IR list would be helpful for the Duck readership.
But I have a few questions. First, why does Thomson-Reuters create this? Why don’t we do it? Does anyone actually know what they do that qualifies them for this? And don’t say ‘consulting’ or ‘knowledge services’ or that sort of MBA-speak. The picture above includes some modernist, high-tech skyscraper, presumably to suggest that lots of brilliant, high-tech theorists are in there crunching away at big numbers (but the flower tells you they have a soft side too – ahh), but I don’t buy it. Are these guys former academics who know what we read? Who are they? Does anyone know? The T-R website tells you nothing beyond buzzwords like ‘the knowledge effect’ and ‘synergy.’ I am genuinely curious how T-R got this gig and why we listen to them. Why don’t we make our own list?
Anyway, I don’t really know, so I just thought I’d throw it out there. Check the IR rankings below.
More questions:
I am not sure whether the SSCI and T-R’s Journal Citation Reports are the same product or two different ones. Click here to see the SSCI list; and here is the JCR link, which is probably gated, but ask your administration; they probably have access. There are 3,038 journals on the whole SSCI list (!), 107 listed under political science, and 82 under IR. There is some overlap between the last two, but the PS list does not completely subsume the IR list, as I think most of us would think it should. For example, IS is listed only under IR, not political science, but ISQ is listed under both, even though I think most people would say IS is a better journal than ISQ. Also, there is no identifiable list for the other three subfields of political science. I find that very unhelpful. More generally, I would like to know how T-R chooses which journals make the SSCI and which don’t. It doesn’t take much effort to see that they’re almost all published in English…
Next, I thought the SSCI included only peer-reviewed journals, but Foreign Affairs and the Washington Quarterly (which I understand to be solicited, not actually peer-reviewed – correct me if I am wrong) are listed on the IR list, and even Commentary and the Nation magazine are on the PS list. Wow – your neocon ideological ravings can actually count as scholarship. Obviously FA should be ranked for impact factor; it’s hugely influential. But does it belong on the SSCI? Note also that ISR is listed on the IR roster, as is its old incarnation, the Mershon ISR. Hasn’t that been gone now for more than a decade? Also, when you access the impact factors (below), T-R provides an IR list with its ‘Journal Citation Reports’ that has only 78 journals listed for IR, not 82. So the SSCI for IR (82) does not quite equal the JCR for IR (78). Is that just a clerical error? If so, does that mean the super-geniuses in the futuristic skyscraper are spending too much time looking out the windows at the flowers? I guess if you double-count M/ISR, you get 79, which is pretty close to 82, but given how definitive this list is supposed to be, it seems like there are problems and confusions.
2010 is the most recent year for which T-R provides a ranking, so I used that, plus the rolling 5-year impact factor. The ranking on the left follows the 5-year impact factor, not the 2010 one.
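(A quick gloss for readers who, like me, never looked under the hood: as far as I understand the published JCR definition, each of these numbers is just a ratio of recent citations to recent citable items. Here is a minimal sketch of that arithmetic in Python; the function name and every count below are mine, invented purely for illustration, since the real data sits in T-R’s proprietary database.)

```python
# Sketch of the JCR impact-factor arithmetic as I understand the published
# definition. All counts are hypothetical, for illustration only.

def impact_factor(cites_this_year, citable_items_in_window):
    """Citations received this year to items published in the window,
    divided by the number of citable items published in that window."""
    return cites_this_year / citable_items_in_window

# 2010 two-year IF: citations in 2010 to articles published in 2008-2009.
if_2010 = impact_factor(cites_this_year=420, citable_items_in_window=150)   # 2.8

# Rolling 5-year IF: same ratio, but the publication window is 2005-2009.
if_5yr = impact_factor(cites_this_year=1200, citable_items_in_window=400)   # 3.0
```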
A few things leap out to me:
1. How did International Studies Perspectives rocket up so high in less than 15 years, higher than EJIR, RIPE, and Foreign Affairs? Wow. I guess I should read it more.
2. What is Marine Policy (no. 11) and how did it get so very high also?
3. Security Studies at 27 doesn’t sound right to me. We read that all the time in grad school.
4. A lot of the newest ones, at the bottom without a 5-year ranking, come from Asia. That isn’t surprising, as Asian countries are throwing more and more money at universities. That’s probably healthy for the range of the field, moving us beyond just Western-published journals.
5. Why haven’t I ever even heard of something like half of these journals? I guess we really are a hermeneutic circle – reading just the same journals again and again – APSR, IO, IS, ISQ, EJIR. That’s pretty scholastic, given that this IR SSCI list shows a rather interesting diversity I never have time to read. A shame actually…
Rank | Title | 2010 Impact Factor | 5-Year Impact Factor
1. It’s balloons, not a flower.
2. IS is not > ISQ.
3. We *do* do this “ourselves,” via Google Scholar.
1. Are you sure? Isn’t that a flower stem on the left? And aren’t balloons an even more bizarre add-on than the flowers?
2. Han > Lando.
3. Sure, but the impact factor, which journals obsess about so much that they place it front-and-center on their homepages, comes from JCR.
Those are balloons floating upwards, suggesting success.
One of my PhD students (Helen Turton) is looking into all this. It’s pretty much a ‘dark art’, how you get on the Thomson list. There are also pretty easy ways to manipulate the citation rate if you are really that bothered, and there’s a good reason why certain journals move up quickly (and down just as quickly).
I am glad you say that too, because impact factor seems like such a big deal (lots of journals post their JCR score front-and-center on their homepages), but I have no good idea of how it gets done. I also find it depressing to hear that citation rate can be gimmicked to move up. I’ve heard that before but have no real sense of how that is done or how it can be policed. I would be very curious to hear, please. Thx.
Easiest way to improve impact factor would be for editors to insist that new submissions cite previously published articles in that journal. The raw impact factor does not discount same-journal citations. T-R has developed other metrics that respond to this, but impact factor has brand recognition.
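(To make that concrete, here is a minimal sketch, with hypothetical counts, of how same-journal citations pad the headline number, next to the self-citation-excluded variant T-R also publishes. The function names and every figure are mine, not anything taken from JCR itself.)

```python
# Hypothetical illustration of self-citation inflation. None of these
# numbers come from JCR; they are made up to show the mechanism.

def raw_impact_factor(external_cites, self_cites, citable_items):
    # The headline figure counts every citation, including citations
    # from articles in the very same journal.
    return (external_cites + self_cites) / citable_items

def self_cite_excluded_if(external_cites, citable_items):
    # The alternative calculation drops same-journal citations.
    return external_cites / citable_items

ext, self_, items = 300, 120, 150             # hypothetical counts
print(raw_impact_factor(ext, self_, items))   # 2.8 <- the homepage number
print(self_cite_excluded_if(ext, items))      # 2.0 <- the one nobody quotes
```

So a journal whose editors nudge authors toward in-house citations can move the headline number quite a bit without a single extra outside reader.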
Yeah. In fact, I have even heard that citing a journal in your footnotes raises the likelihood that that very journal will then accept your submission. (In fact, I think I read that factoid here on the Duck.)
Which is precisely why these “stats” are problematic — they skew our research and the discipline into a purely careerist venture.
That seems like sloppy reasoning to me. Careerism exists independent of, and prior to, the construction of these measures. Not trying to split hairs. If this really is a problem, then it’s worth getting the diagnosis right…
Fair – careerism isn’t wholly born of journal rankings, that much is obvious. Let’s instead say that these statistics are used as tools and thereby enable and encourage publishing in certain journals over others, a decision pegged to the journal’s ranking, all in the grand project of achieving tenure.
This is exactly why I posted this list and asked about all this. I have the strong impression that these JCR rankings occupy a lot of our time and influence our decisions, sorta like the U.S. News & World Report college rankings have become so influential that colleges ask their best entrants to re-take the SAT just to raise the school’s average SAT score and thereby its US News ranking. Even IO apparently engages in this sort of inflation (QS’ comment below), which I find really depressing to hear…
Possibly. But I still think the order is off. There was pressure to publish in APSR/AJPS/JOP as well as IO/JCR before people became obsessed with impact factors. To the extent that other outlets win the impact factor game, might it not actually broaden the range of outlets considered to be “worthy” by leading research departments?
That’s actually a really good insight. I noted something like that too when I first looked at the list – I didn’t recognize half of those journal titles. There’s probably some pretty cool stuff in there, but I’d never know, because I read the ones TRIP says I should all the time. Not sure what the alternative is though, because we can’t read everything.
Getting authors to cite pieces in the journal is one way. But soliciting articles from ‘names’ is another, or publishing very topical stuff, or review or survey articles, special issues, etc. The Wikipedia (did I really say that in public?) page on impact factor has a good discussion about manipulation.
Is there any way to police this? Does T-R (or APSA, perhaps more importantly) know about this sorta thing? I read the wiki write-up. Funny how the one journal purposefully cited itself en masse in order to protest the system. I wonder if this means Google Scholar is the future…
T-R has a calculation that excludes journal self-citations, but people don’t pay that much attention to it. The problem is that influential journals should expect self-citations in the first place. If you compare the impact factors from the two calculations, the gap can sometimes be fairly dramatic, but it doesn’t necessarily change a great deal.
The problem is that editors have an interest in promoting self-citation, e.g., IO editors asking authors to ‘IO-ify’ their footnotes.
I’m not sure I agree IS is better than ISQ. Also, as to “There is some overlap between the last two, but the PS list does not completely subsume the IR list, as I think most of us would think it should”: I disagree actually. I think IR has paid a price for being a ‘subfield’ of PoliSci, so it is actually a bit refreshing to see some journals on their IR-own.
That’s an interesting insight: “I think IR has paid a price for being a ‘subfield’ of PoliSci, so it is actually a bit refreshing to see some journals on their IR-own.” What costs were you thinking of? Americanists dominating departments? I would agree with that. But I actually think PS helps ground IR and keep it from becoming like econ or soc.
This is an argument that people who do feminist IR engage in all the time. Most of the key journals in that field are not on that list (Millennium is the only journal I can think of that publishes lots of feminist stuff and still made that list). I don’t see any of the IPE journals on this list either, and I suspect that if mostly what you do is environmental IR stuff, then you’re not on that list either. And yet there are certainly people who are big names in each of these fields and who have a great deal of influence, even if that influence can’t be calculated using this standardized rubric. (Thinking a lot about this as I prepare my tenure file and wondering if those who will be looking at it who are not IR people will know how to interpret this stuff. Somewhat worried that they won’t.)
I just rechecked the last article on TRIP in ISQ: https://onlinelibrary.wiley.com/doi/10.1111/j.1468-2478.2011.00653.x/abstract.
(It’s also worth noting that to get that link I passed through ISQ’s home page, which lists, right in the middle of the page, its JCR rank and impact factor.)
Here are the journals they list as the “most influential,” but they aren’t ranked: American Political Science Review (APSR), American Journal of Political Science (AJPS), British Journal of Political Science (BJPS), European Journal of International Relations (EJIR), International Organization (IO), International Security (IS), International Studies Quarterly (ISQ), Journal of Conflict Resolution (JCR), Journal of Peace Research (JPR), Journal of Politics (JOP), Security Studies (SS), and World Politics (WP).
The APSA should take this on – journal ranking – but it won’t. It should also accredit graduate programs, but won’t.