2008 Journal Citation Reports figures released

Last Friday Thomson Reuters released the 2008 edition of the Journal Citation Reports. This year it was announced by Thomson itself in a news release; that is a good move on their part. The number of journals covered in the Science edition has increased from 6417 to 6598 (181 more journals), and in the Social Sciences edition from 1865 to 1980 (an increase of 115 journals).

This is still not the increase I expected on the basis of the addition of some 750 new regional journals announced last year, a figure that is now even advertised as an expansion of 1228 journals, but it is nevertheless an expansion of roughly 300 journals. Reading Thomson's press releases on the 2008 JCR update, I still notice some juggling with numbers that don't really add up, or that don't make sense after a simple comparison of the 2007 and 2008 editions.

Now we have to figure out which journals were added and, more importantly, which were dropped. That is always interesting to find out, although it will take time; a simple comparison of the two title lists, sketched below, is the quickest way to do it.

The really major improvement Thomson should make is to abolish the rather odd division between the two parts of the database. I can't find any argument for keeping the demarcation line between the Science edition and the Social Sciences edition of the JCR. I really wonder how many customers subscribe to only one part of the JCR; it is fair to assume that by far most customers subscribe to both. For teaching it is simply a pain to have to explain to students that they must start their search by choosing a database part. That is far from intuitive.
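A minimal sketch of such a comparison, assuming the journal title lists of both editions have been exported to plain text files, one title per line (the file names are hypothetical):

```python
# A minimal sketch: diff the 2007 and 2008 JCR title lists to see which
# journals were added and which were dropped. File names are hypothetical.

def load_titles(path):
    """Read one journal title per line, normalised for comparison."""
    with open(path, encoding="utf-8") as fh:
        return {line.strip().lower() for line in fh if line.strip()}

jcr_2007 = load_titles("jcr_2007_titles.txt")
jcr_2008 = load_titles("jcr_2008_titles.txt")

added = sorted(jcr_2008 - jcr_2007)
dropped = sorted(jcr_2007 - jcr_2008)

print(f"{len(added)} journals added, {len(dropped)} journals dropped")
for title in dropped:
    print("dropped:", title)
```

One caveat: a journal that merely changed its title will show up as an add/drop pair, so the raw lists still need a manual check.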

Journal quality, an unexpected improvement of the JCR

It is odd to say, but for researchers the journal as an entity is disappearing. Scientists search for information in online databases and decide from the title and abstract whether an article suits their needs. The days when scientists visited the library and browsed the tables of contents of the most important journals to keep up with their field are long gone.

Still there is a lot of emotion around journal titles. Scientists want to publish their research in the best possible journal. Earlier this year the NOWT (2008) published a report on the performance of Dutch universities, which clearly showed that the field-normalized citation impact of each university correlated positively with the field-normalized journal quality.
[Figure: Journal quality versus citation impact]

Looking at this graph, it is clear that there is considerable reason for researchers to select the best journals in their field when publishing their results. However, until recently the only widely available journal quality indicator has been the journal impact factor. There has been a lot of criticism of the uses and abuses of impact factors, but they have stood the test of time. All scientists are at least aware of impact factors. For years ISI, now Thomson Reuters, was in fact the sole gatekeeper of journal quality rankings.

Over the last few years a number of products, free and fee-based, have tried to come up with new and competing journal ranking measures: SCImago Journal Rank (based on Scopus data), the Journal Analyzer from Scopus, Eigenfactor.org and, of course, the data from Thomson's own Essential Science Indicators.

This week Thomson Reuters announced that they will update the Journal Citation Reports once more: from the 1st of February we get an entirely new version of the JCR. From the press release (a sketch of the five-year impact factor calculation follows the list):

  • Five-Year Impact Factor – provides a broader range of citation activity for a more informative snapshot over time.
  • Journal “Self Citations” – An analysis of journal self citations and their contribution to the Journal Impact Factor calculation.
  • Graphic Displays of Impact Factor “Box Plots” – A graphic interpretation of how a journal ranks in different categories.
  • Rank-in-Category Tables for Journals Covering Multiple Disciplines – Allows a journal to be seen in the context of multiple categories at a glance rather than only a single one.
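The five-year impact factor follows the same logic as the classic two-year one: citations received in the JCR year to items published in the five preceding years, divided by the number of citable items published in those years. A minimal sketch, with invented figures:

```python
# Five-year impact factor, by analogy with the classic two-year formula.
# All figures below are invented for illustration.

def five_year_impact_factor(citations_by_year, citable_items_by_year):
    """Citations received in the JCR year to items published in the five
    preceding years, divided by the citable items from those years."""
    return sum(citations_by_year.values()) / sum(citable_items_by_year.values())

# Hypothetical journal, JCR year 2008: citations received in 2008
# to items published 2003-2007.
citations = {2003: 210, 2004: 340, 2005: 410, 2006: 380, 2007: 260}
items = {2003: 120, 2004: 130, 2005: 125, 2006: 140, 2007: 135}

print(round(five_year_impact_factor(citations, items), 3))  # 2.462
```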

It is highly unusual to see two updates per year for the JCR, but it is interesting to note how they are moving under the pressure of some competition.

Literature:
NOWT (2008). Wetenschaps- en Technologie- Indicatoren 2008. Maastricht, Nederlands Observatorium van Wetenschap en Technologie (NOWT). http://www.nowt.nl/docs/NOWT-WTI_2008.pdf (in Dutch)

Thomson Reuters issues a press release on the JCR 2007

It just popped up in my RSS feed on the second of July. The official press release from Thomson Reuters, dated July 1st, announces the new 2007 edition of the Journal Citation Reports. There is no further mention of which journals were included or excluded. There is a link to the official promotional website of the JCR, which still states:

  • Covers more than 7,500 of the world’s most highly cited, peer-reviewed journals in approximately 200 disciplines
  • The Science Edition covers over 5,900 leading international science journals from the Thomson Reuters database
  • The Social Sciences Edition covers over 1,700 leading international social sciences journals from the Thomson Reuters database

It actually struck me today that the journals included in the JCR are not listed in their Master Journal List.

Shall we call it progress that Thomson is confirming what I blogged about some two weeks ago?

On Impact Factors and article quality

I just found this quote:

Thus, while it is incorrect to say that the impact factor gives no information about individual papers in a journal, the information is surprisingly vague and can be dramatically misleading.

(Adler et al. 2008)

The report is a very critical discussion of the use and abuse of impact factors and the h-index.

Reference:
Adler, R., J. Ewing, et al. (2008). Citation Statistics : A report from the International Mathematical Union (IMU) in cooperation with the International Council of Industrial and Applied Mathematics (ICIAM) and the Institute of Mathematical Statistics (IMS), Joint Committee on Quantitative Assessment of Research. 26p. http://www.mathunion.org/fileadmin/IMU/Report/CitationStatistics.pdf

Hattip: Sidi

Article impact and journal impact factors

In the scientometric literature we are very often warned not to use journal impact factors to judge the performance of researchers or research groups. For this statement I always refer back to Seglen (1997). Seglen showed that in three chemistry journals 50% of the articles accounted for 90% of the citations to those journals, i.e. the other half of the articles contributed only 10% of the citation impact. It is one of those illustrations of the long tail in scientometrics; a small sketch of this kind of calculation follows.
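A minimal sketch of that calculation, on invented citation counts: rank a journal's articles by citations received and see what fraction of them is needed to accumulate 90% of the total.

```python
# A minimal sketch of Seglen's observation, using invented citation counts.

def share_of_articles_for(citations, target_share=0.9):
    """Fraction of articles (most-cited first) needed to accumulate
    target_share of all citations."""
    ranked = sorted(citations, reverse=True)
    total = sum(ranked)
    running = 0
    for n, c in enumerate(ranked, start=1):
        running += c
        if running >= target_share * total:
            return n / len(ranked)
    return 1.0

# A skewed toy distribution: a few highly cited papers, many rarely cited.
cites = [30, 25, 20, 15, 10, 3, 2, 1, 1, 0]
print(share_of_articles_for(cites))  # 0.5: half the articles carry 90%
```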

In my courses on citation analysis I always point to this fact, and elaborate on the use of journal impact factors in journal selection as part of a publication strategy. Submit your best work to the journals with the highest impact factors: that is simple advice.

In the latest analysis by the NOWT of research performance in the Netherlands, my university is placed in the second division of Dutch universities ranked by citation impact. One of the points the report made quite clear was that the field-corrected journal impact of its articles was far below the national average. In fact, my university was the second worst in this respect; only Tilburg University fared worse (NOWT 2008, table 4.5 on p. 40).

I think we need to pay more attention to this fact at our university. In an informal citation analysis for one of our chair groups I am going to elaborate on this point a bit further.

Relative impact versus Journal impact factors

If you plot the relative impact of their articles published in the period 1998-2005 against the journal impact factors, you get a large scatter diagram. Drawing a regression line seems fairly meaningless: the slope is only just positive, and the R² is a mere 0.0048. The problem, of course, is that the relative impacts of the articles are far from normally distributed. The average of the per-article relative impacts is 1.35, whereas the median is 0.92: most articles have a relative impact below the world average. If you instead calculate the aggregate impact for this group as the sum of citations divided by the sum of the baseline citations, the relative impact is 1.28.
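A minimal sketch of the three statistics mentioned above (mean of per-article ratios, median ratio, and ratio of sums), on invented article data; the point is simply that for skewed distributions they can differ substantially:

```python
# Three ways to summarise relative impact; the article data are invented.
# Each article: (citations received, field baseline / expected citations).
from statistics import mean, median

articles = [(2, 5.0), (1, 4.0), (0, 3.0), (40, 6.0), (3, 5.0), (12, 4.0)]

ratios = [c / b for c, b in articles]
mean_of_ratios = mean(ratios)    # cf. the 1.35 above
median_ratio = median(ratios)    # cf. the 0.92 above
ratio_of_sums = sum(c for c, _ in articles) / sum(b for _, b in articles)  # cf. 1.28

print(mean_of_ratios, median_ratio, ratio_of_sums)
# For a skewed distribution the mean sits well above the median, and the
# ratio of sums gives yet another answer.
```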

For me the picture became much clearer when I drew the lines for the median citation impact and the median journal impact factor. Of the articles below the median citation impact, most are concentrated in the lower journal impact factor quadrant: 36 versus 16. Of the higher impact articles, most are concentrated in the higher journal impact factor quadrant: 35 against 16. In fact, those 35 articles were published in only 14 different journals. (The median split is sketched after the figure below.)

[Figure: Relative impact versus journal impact factors, with the median lines drawn in]
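A minimal sketch of that median split, assuming you have each article's relative impact and the impact factor of its journal (the input data structure is hypothetical):

```python
# Classify articles into quadrants by the medians of relative impact (RI)
# and journal impact factor (JIF). The input data structure is hypothetical.
from collections import Counter
from statistics import median

def quadrant_counts(points):
    """points: list of (relative_impact, journal_impact_factor) tuples."""
    med_ri = median(ri for ri, _ in points)
    med_jif = median(jif for _, jif in points)
    return Counter(
        ("high RI" if ri > med_ri else "low RI",
         "high JIF" if jif > med_jif else "low JIF")
        for ri, jif in points
    )

# e.g. quadrant_counts(pts) -> Counter({("low RI", "low JIF"): 36, ...})
```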

Perhaps this research group should focus its publication output on those 14 journal titles, and stay away from the 21 journals associated with the lower left quadrant. I found this approach quite revealing.

References
NOWT (2008). Wetenschaps- en Technologie- Indicatoren 2008. Maastricht, Nederlands Observatorium van Wetenschap en Technologie (NOWT). http://www.nowt.nl/docs/NOWT-WTI_2008.pdf
Seglen, P. O. (1997). Why the impact factor of journals should not be used for evaluating research. BMJ 314(7079): 497-502. http://bmj.bmjjournals.com/cgi/content/full/314/7079/497