There’s a fascinating post making the rounds. In it, Stephen Curry discusses the history and abuse of journal impact-factor data. Curry ties alternatives to the impact factor to the rise of open-access publishing and to new-media discussion of research findings.
Twenty years on from Seglen’s analysis, a new paper by Jerome Vanclay from Southern Cross University in Australia has reiterated the statistical ineptitude of using arithmetic means to rank journals and highlighted other problems with the impact factor calculation. Vanclay points out that it fails to take proper account of data entry errors in the titles or dates of papers, or of the deficient and opaque sampling methods used by Thomson Reuters in its calculation. Nor, he observes, does the two-year time limit placed on the impact factor calculation accommodate variations in the temporal citation patterns between different fields and journals (peak citations to Nature papers occur 2-3 years following publication, whereas citations of papers in Ecology take much more time to accrue and are maximal only after 7-8 years). Whichever way you look, the impact factor is a mis-measure.
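Curry’s point about arithmetic means is easy to see in miniature: citation counts within a journal are heavily skewed, so a few blockbuster papers drag the mean far above what a typical paper receives. A minimal sketch, using made-up citation counts rather than any real journal’s data:

```python
# Illustration only: hypothetical citation counts for the papers a journal
# published over the previous two years (invented numbers, not real data).
from statistics import mean, median

citations = [0, 0, 1, 1, 1, 2, 2, 3, 3, 4, 5, 120]  # one blockbuster paper

# An impact-factor-style figure is just the arithmetic mean of these counts.
print(f"mean (impact-factor style): {mean(citations):.3f}")   # ~11.833
print(f"median (the typical paper): {median(citations):.1f}") # 2.0
```

A single outlier yields a “journal-level” figure of nearly 12 even though most papers in the sample are cited two or three times, which is also why quoting such a mean to three decimal places is precision without meaning.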
His conclusion is harsh, but in my view, fair.
But every little helps, so, taking my cue from society’s assault on another disease-laden dependency, it is time to stigmatise impact factors the way that cigarettes have been. It is time to start a smear campaign so that nobody will look at them without thinking of their ill effects, so that nobody will mention them uncritically without feeling a prick of shame.
So consider all that we know of impact factors and think on this:
- If you use impact factors you are statistically illiterate.
- If you include journal impact factors in the list of publications in your cv, you are statistically illiterate.
- If you are judging grant or promotion applications and find yourself scanning the applicant’s publications, checking off the impact factors, you are statistically illiterate.
- If you publish a journal that trumpets its impact factor in adverts or emails, you are statistically illiterate. (If you trumpet that impact factor to three decimal places, there is little hope for you.)
- If you see someone else using impact factors and make no attempt at correction, you connive at statistical illiteracy.
I’ll try to say more about this, and about its evil cousin, citation counting, in the near future.
Also, see Curry’s “Coda.”
Daniel H. Nexon is a Professor at Georgetown University, with a joint appointment in the Department of Government and the School of Foreign Service. His academic work focuses on international-relations theory, power politics, empires and hegemony, and international order. He has also written on the relationship between popular culture and world politics.
He has held fellowships at Stanford University's Center for International Security and Cooperation and at the Ohio State University's Mershon Center for International Studies. During 2009-2010 he worked in the U.S. Department of Defense as a Council on Foreign Relations International Affairs Fellow. He was the lead editor of International Studies Quarterly from 2014-2018.
He is the author of The Struggle for Power in Early Modern Europe: Religious Conflict, Dynastic Empires, and International Change (Princeton University Press, 2009), which won the International Security Studies Section (ISSS) Best Book Award for 2010, and co-author of Exit from Hegemony: The Unraveling of the American Global Order (Oxford University Press, 2020). His articles have appeared in a lot of places. He is the founder of The Duck of Minerva, and he also blogs at Lawyers, Guns and Money.