Where to find journal impact factors stripped of self-citation?

Update from March 2016: it seems that ISI Web of Knowledge now offers citation counts with author-level self-citations removed. That feature should be treated as experimental, of course, and I don't know whether it can be applied at the journal level.


I don't think any of the existing systems take out self-citations. Publishers and journals are not interested in seeing their impact factors reduced, so few editorial boards, and fewer still commercial publishers, would want anybody producing such rankings. If a discipline has its own independent referencing and citation system, it might be interested in a more objective analysis: economists, for example, have CitEc (Citations in Economics), a part of RePEc (Research Papers in Economics), which does track self-citations (see the 2012 Nobel prize winners, Alvin Roth and Lloyd Shapley, as examples: the former has about as many self-citations as the latter has total citations). I would venture a guess that mathematicians have a similar system, but I doubt that the natural or social sciences do.

High-impact journals obviously matter for publishing, getting good academic jobs, and getting tenure. However, the impact factor tells only part of the story, and some disciplines have reputable journals that do not have the highest possible IFs. Let me again take economics, a field I am familiar with, as an example. In most US departments, you would get tenure with a paper in the American Economic Review, the Quarterly Journal of Economics, or Econometrica. (The QJE is often said to publish mostly MIT and Harvard folks.) Yet the AER is only 19th in this list of impact factors (which may be as good as any other list that is not password-protected by ISI), and I have never even heard of some of the journals ranked above it. Still, the AER's impact factor of 2.5 is not even funny for a biologist aiming at Science or Nature, whose impact factors are what, 30 or so?

I work in industry and tend to care little about journals outside my area (statistics); statistics journals generally have IFs between 0.25 and 4. The reality of my particular field (survey research) is that people just submit their papers to the proceedings of the annual conferences and move on to their paying projects, without the time to go back and forth with reviewers. People in academia who must publish or perish do produce typical academic papers with rather small contributions to knowledge, but their authors either have the time themselves or a captive labor force of grad students to write these up.

As a guiding rule for which journals to avoid, you can start with commercially published ones and give more respect to professional organizations that publish their journals themselves, without Wiley or Elsevier taking them over as a source of income... although I can imagine that coercive citation is something editorial boards insist on, which may or may not correlate strongly with who publishes a given journal. I have received requests to cite a given journal more (a problem in disciplines with an overproduction of journals fighting one another; again, in my industry there are probably four or five decent ones, and they don't need to fight), but I have tended to ignore them.


The SciVerse Citation Tracker service, available to anyone with a Scopus subscription, allows you to exclude self-citations. However, it is designed not to evaluate journals but to review citations to a particular author or field. Both the Journal Impact Factor (IF) computed by Thomson Reuters and the SCImago Journal Rank (SJR) computed in Scopus by Elsevier include self-citations. I am not aware of any other database that excludes them.
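For what it's worth, the adjustment itself is simple arithmetic. Here is a minimal Python sketch of a two-year impact factor computed with and without journal self-citations; all journal names and citation counts are made up for illustration, not taken from any real database:

```python
# Toy data: each tuple is (citing_journal, cited_journal) for one citation
# made in year Y to an article that "Journal A" published in Y-1 or Y-2.
citations = [
    ("Journal A", "Journal A"),  # self-citation
    ("Journal B", "Journal A"),
    ("Journal C", "Journal A"),
    ("Journal A", "Journal A"),  # self-citation
    ("Journal D", "Journal A"),
]
citable_items = 2  # articles Journal A published in years Y-1 and Y-2

def impact_factor(cites, n_items, journal, exclude_self=False):
    """Two-year IF: citations in year Y to the journal's Y-1/Y-2 articles,
    divided by the number of citable items, optionally dropping citations
    where the citing journal is the journal itself."""
    counted = [c for c in cites
               if c[1] == journal
               and not (exclude_self and c[0] == journal)]
    return len(counted) / n_items

print(impact_factor(citations, citable_items, "Journal A"))                     # 2.5
print(impact_factor(citations, citable_items, "Journal A", exclude_self=True))  # 1.5
```

Even in this toy example, stripping two self-citations drops the IF from 2.5 to 1.5, which is exactly why journals with heavy self-citation would rather such a column not exist.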

And yet, I wouldn't use any bibliometric indicator as a measure of a journal's reputation, but rather trust the advice of experienced colleagues.


Let me add a new answer to an old question for posterity's sake. Thomson Reuters' Journal Citation Reports recently (I believe within the past year or two) started making this information available. See the example below:

Thomson Reuters data for IJNSNS

Of course, making this data available does not address the "citation cartels" mentioned in Anonymous Mathematician's answer, or editors writing papers with rampant citations to their own journals, as in the IJNSNS case discussed here. The data above are for the IJNSNS case. (Note that the editorial board has changed, which I imagine coincides with the decline in impact factor.)