Human Ecology impact factors
The recent sting operation carried out by John Bohannon of Science focuses needed attention on the shadiest practices of academic publishing. As described in his great article, Bohannon submitted a fake cancer study, riddled with errors, to 314 open-access journals. The sad outcome is that 157 of the journals accepted the paper for publication.
I was curious to find out whether Journal Impact Factors, a commonly used quantitative measure of a journal’s influence (and thus prestige), would predict whether the fake article would be accepted for publication. Luckily for me and anyone interested in this topic, Bohannon and Science have made a summary of their data available for download.
I cross-referenced the list of journals that received the spoof paper against the Thomson Reuters Journal Citation Reports (JCR) Science Edition 2012 database. I found that the great majority of the journals that received the article are not even listed or tracked in the JCR Science database: only 44 (14%) of the journals had an entry in the JCR database. While 57% of the journals without a listing in JCR Science accepted the paper, only 7% (n=3) of those in the database did so. The two pie charts below illustrate the outcomes of submissions to journals listed and unlisted by the JCR Science database:
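If you want to reproduce the cross-referencing step yourself, the gist of it is a simple membership check followed by a grouped acceptance rate. The sketch below assumes hypothetical file names and column labels ("journal", "outcome"); the actual layout of the Science spreadsheet and the JCR export will differ.

```python
import pandas as pd

# Hypothetical inputs: one row per journal that received the spoof paper,
# and the list of journals tracked in JCR Science Edition 2012.
sting = pd.read_csv("bohannon_sting.csv")
jcr = pd.read_csv("jcr_science_2012.csv")

# Flag journals that have an entry in the JCR database
sting["in_jcr"] = sting["journal"].isin(jcr["journal"])

# Percentage of journals accepting the spoof paper, listed vs. unlisted
rates = (
    sting.groupby("in_jcr")["outcome"]
         .apply(lambda s: (s == "accepted").mean() * 100)
)
print(rates.round(1))
```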
Since only 3 of the journals listed in the JCR Science database accepted the paper, it is difficult to make generalizations about the characteristics of such journals. Bohannon includes in his article excerpts of emails with the editor of one of these journals, the Journal of International Medical Research, who takes “full responsibility” for the poor editing.
With the caveat that sample sizes are very small, I compared the 2012 impact factors of listed journals that accepted the paper (n=3) with those that rejected it (n=34). The median impact factor of the accepting journals was 0.994, versus 1.3075 for the rejecting journals (p
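For anyone who wants to rerun this comparison, here is a minimal sketch. The merged file name and column labels are again assumptions, and the choice of a Mann-Whitney (Wilcoxon rank-sum) test is mine; with only three accepting journals, any test has very little power.

```python
import pandas as pd
from scipy.stats import mannwhitneyu

# Hypothetical merged table: sting outcomes plus 2012 impact factors
sting = pd.read_csv("bohannon_sting_with_jcr.csv")

# Keep only journals with an impact factor (i.e., listed in JCR Science 2012)
listed = sting.dropna(subset=["impact_factor"])
accepted = listed.loc[listed["outcome"] == "accepted", "impact_factor"]
rejected = listed.loc[listed["outcome"] == "rejected", "impact_factor"]

print("median (accepted):", accepted.median())
print("median (rejected):", rejected.median())

# Nonparametric comparison of the two impact-factor distributions
stat, p = mannwhitneyu(accepted, rejected, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.3f}")
```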