Measuring Journal (and Scholarly) Outcomes

Another day, another piece chronicling problems with the metrics scholars use to assess quality. Colin Wight sends George Lozano’s “The Demise of the Impact Factor”:

Using a huge dataset of over 29 million papers and 800 million citations, we showed that from 1902 to 1990 the relationship between IF and paper citations had been getting stronger, but as predicted, since 1991 the opposite is true: the variance of papers’ citation rates around their respective journals’ IF [impact factor] has been steadily increasing. Currently, the strength of the relationship between IF and paper citation rate is down to the levels last seen around 1970.

Furthermore, we found that until 1990, of all papers, the proportion of top (i.e., most cited) papers published in the top (i.e., highest IF) journals had been increasing. So, the top journals were becoming the exclusive depositories of the most cited research. However, since 1991 the pattern has been the exact opposite. Among top papers, the proportion NOT published in top journals was decreasing, but now it is increasing. Hence, the best (i.e., most cited) work now comes from increasingly diverse sources, irrespective of the journals’ IFs.

If the pattern continues, the usefulness of the IF will continue to decline, which will have profound implications for science and science publishing. For instance, in their effort to attract high-quality papers, journals might have to shift their attention away from their IFs and instead focus on other issues, such as increasing online availability, decreasing publication costs while improving post-acceptance production assistance, and ensuring a fast, fair and professional review process.
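
Lozano and colleagues are working from the full citation record, and their methods are more elaborate than anything I can reproduce here. But the basic quantity is easy to grasp: how well does a journal’s impact factor predict the citations of an individual paper published in it? A minimal sketch in Python, with made-up impact factors and simulated citation counts (none of this is Lozano’s data or code), shows why growing scatter around the journal mean is the same thing as a weakening IF–citation relationship:

    import random
    import statistics

    def pearson_r(xs, ys):
        # Plain Pearson correlation, stdlib only.
        mx, my = statistics.fmean(xs), statistics.fmean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    random.seed(1)
    JOURNAL_IFS = [1.0, 2.5, 5.0, 12.0, 30.0]  # hypothetical impact factors

    def if_citation_correlation(spread):
        # Simulate 200 papers per journal; each paper's citation count is
        # scattered around its journal's IF with the given relative spread.
        ifs, cites = [], []
        for jif in JOURNAL_IFS:
            for _ in range(200):
                ifs.append(jif)
                cites.append(max(0.0, random.gauss(jif, spread * jif)))
        return pearson_r(ifs, cites)

    # More variance around the journal mean means a weaker IF-citation link.
    for spread in (0.5, 1.5, 3.0):
        print(f"relative spread {spread}: r = {if_citation_correlation(spread):.2f}")

As the spread grows, the correlation falls; Lozano’s finding is that real citation data have been moving in exactly that direction since 1991.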

Against Journal Impact Factors

There’s a fascinating post making the rounds: Stephen Curry on the history and abuse of journal impact-factor data. Curry ties the search for alternatives to the rise of open-access publishing and to new-media discussion of research findings.

Twenty years on from Seglen’s analysis, a new paper by Jerome Vanclay from Southern Cross University in Australia has reiterated the statistical ineptitude of using arithmetic means to rank journals and highlighted other problems with the impact factor calculation. Vanclay points out that it fails to take proper account of data entry errors in the titles or dates of papers, or of the deficient and opaque sampling methods used by Thomson Reuters in its calculation. Nor, he observes, does the two-year time limit placed on the impact factor calculation accommodate variations in the temporal citation patterns between different fields and journals: peak citations to Nature papers occur 2-3 years following publication, whereas citations of papers in Ecology take much more time to accrue and are maximal only after 7-8 years. Whichever way you look, the impact factor is a mis-measure.
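
For anyone who has never seen the mechanics: a journal’s impact factor for year Y is just an arithmetic mean, the number of year-Y citations to its items from years Y-1 and Y-2 divided by the number of citable items in that window. A minimal sketch with invented citation counts (not real journal data) shows why Seglen and Vanclay object to ranking journals by such a mean:

    import statistics

    # Hypothetical two-year window of citable items for a single journal:
    # per-paper citation counts received in the census year.
    citations = [0, 0, 0, 1, 1, 2, 2, 3, 120]  # one blockbuster, many duds

    # The impact factor is the arithmetic mean over the window.
    impact_factor = sum(citations) / len(citations)

    print(f"impact factor (mean): {impact_factor:.1f}")             # 14.3
    print(f"median paper:         {statistics.median(citations)}")  # 1

    # The mean is dominated by a single outlier; the typical paper in this
    # journal earns one citation, not fourteen. Skewed citation distributions
    # make the mean a poor summary of a journal.

None of this touches the sampling and data-entry problems Vanclay documents; even a perfectly computed impact factor summarizes a skewed distribution with the wrong statistic.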

Curry’s conclusion is harsh but, in my view, fair.

But every little helps, so, taking my cue from society’s assault on another disease-laden dependency, it is time to stigmatise impact factors the way that cigarettes have been. It is time to start a smear campaign so that nobody will look at them without thinking of their ill effects, so that nobody will mention them uncritically without feeling a prick of shame.
So consider all that we know of impact factors and think on this:

  • If you use impact factors you are statistically illiterate. 
  • If you include journal impact factors in the list of publications in your cv, you are statistically illiterate.
  • If you are judging grant or promotion applications and find yourself scanning the applicant’s publications, checking off the impact factors, you are statistically illiterate.
  • If you publish a journal that trumpets its impact factor in adverts or emails, you are statistically illiterate. (If you trumpet that impact factor to three decimal places, there is little hope for you.)
  • If you see someone else using impact factors and make no attempt at correction, you connive at statistical illiteracy.

I’ll try to say more about this, and about its evil cousin, citation counting, in the near future.
Also, see Curry’s “Coda.”
