The Impact Factor of Open Access journals

In the world of Open Access publishing the golden road has received a great deal of attention. At least this is what our researchers seem to remember. Of course there are other roads to open access, but here I want to present the impact factors of the journals facilitating the golden road to open access. This blogpost lists all open access journals included in DOAJ that were assigned a Journal Impact Factor in the JCR 2009. The reason for this is that our researchers see publishing in open access journals as the simplest way of achieving open access to their work, while on the other hand they are required, for the assessment of their citation impact, to publish in journals covered by Web of Science and therefore by the Journal Citation Reports (JCR).

In the past there have been studies on the citation impact of open access journals that have actually received a Journal Impact Factor from Thomson Reuters Scientific (formerly ISI). The first was the study by McVeigh (2004), followed by Vanouplines and Beullens (2008) (in Dutch, and not openly accessible) and recently by Giglia (2010). These consecutive studies showed an increasing number of open access journals receiving a Journal Impact Factor from Thomson Reuters: McVeigh reported 239 OA journals for the JCR 2004, Vanouplines reported 295 OA journals for the JCR 2005 and Giglia reported 385 OA journals for the JCR 2008 (there are some methodological differences that make these figures not entirely comparable).

The pitfall of these studies is that, although they presented interesting figures and additional analyses, none of them actually published the list of open access journals that received an impact factor. The sole purpose of this blogpost is to publish that actual list. The probable reason the previous authors refrained from doing so is that the impact factors are proprietary information from Thomson Reuters; you are not allowed to publish these figures. On the other hand, most publishers use the impact factor in all their marketing for their journals, so the journal impact factor is virtually information in the public domain.

To avoid any intellectual property problems with Thomson Reuters I have included the ScimagoJR and Scopus SNIP indicators for the journals rather than the Journal Impact Factor. For this set of journals the correlation between SNIP and IF was 0.94 and between SJR and IF 0.96. In total 619 journals from DOAJ were present in the JCR 2009 report (the Science and Social Sciences & Humanities editions deduplicated). The growth in journal coverage is due to the growth in the number of OA journals and the significant expansion of journal coverage in 2008. On the other hand, looking at the list of Scopus indexed journals, I note that they include some 1365 open access journals which have a ScimagoJR or SNIP value.
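
For those who want to check such correlations against their own journal set, a minimal sketch in Python could look like the one below; the file name and column headers are assumptions for illustration, not part of the data behind this post.

```python
# Minimal sketch: Pearson correlations between the journal metrics.
# Assumes a hypothetical file "oa_journals_2009.csv" with columns
# "IF2009", "SNIP2009" and "SJR2009"; neither the file nor the
# column names come from the original post.
import pandas as pd

journals = pd.read_csv("oa_journals_2009.csv")

# Drop journals missing any of the three metrics before correlating.
complete = journals.dropna(subset=["IF2009", "SNIP2009", "SJR2009"])

print("SNIP vs IF:", round(complete["SNIP2009"].corr(complete["IF2009"]), 2))
print("SJR  vs IF:", round(complete["SJR2009"].corr(complete["IF2009"]), 2))
```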

For the current table I matched the journal list from DOAJ, downloaded on December 13th 2010, with the deduplicated list of journals indexed in the JCR 2009. This set of 619 journals was then matched against the journal list from journalmetrics.com to include the ScimagoJR 2009 and SNIP 2009 values as well. For each journal the subject categories indicated by DOAJ were included. The journals were sorted alphabetically on subject and by descending IF within a subject. Journals with multiple subject assignments in DOAJ were included in each of their categories, which expanded the list to 782 lines. Finally the column with impact factors was removed, showing only the ScimagoJR and SNIP for the journals. A few journals were not assigned a ScimagoJR or SNIP even though they were assigned a Journal Impact Factor. In some cases this was due to differences in journal coverage between Scopus and Web of Science, but in a few cases it also appears to be a problem of different ISSN assignments by the respective databases.
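
The matching and sorting steps described above amount to a fairly standard table join. A rough sketch of how they could be scripted follows; the file names, column names and the use of the ISSN as join key are assumptions for illustration, and in practice the ISSN discrepancies mentioned above may well require manual cleanup.

```python
# Sketch of the matching and sorting steps, using pandas.
# All file names and column names are hypothetical.
import pandas as pd

doaj = pd.read_csv("doaj_2010-12-13.csv")          # DOAJ export: ISSN, Title, Subjects
jcr = pd.read_csv("jcr2009_dedup.csv")             # deduplicated JCR 2009: ISSN, IF2009
metrics = pd.read_csv("journalmetrics_2009.csv")   # journalmetrics.com: ISSN, SJR2009, SNIP2009

# Keep only DOAJ journals that received an impact factor in the JCR 2009,
# then attach the Scopus-based indicators.
matched = doaj.merge(jcr, on="ISSN").merge(metrics, on="ISSN", how="left")

# One row per DOAJ subject assignment (journals with several subjects
# appear once per subject), sorted by subject and descending IF.
matched["Subject"] = matched["Subjects"].str.split(";")   # assumes ";"-separated subjects
expanded = matched.explode("Subject")
expanded = expanded.sort_values(["Subject", "IF2009"], ascending=[True, False])

# Finally drop the proprietary IF column before publishing the table.
expanded.drop(columns=["IF2009"]).to_csv("oa_journals_snip_sjr_2009.csv", index=False)
```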

Download: List of open access journals that were assigned an Impact Factor in the JCR 2009, showing their respective SNIP and ScimagoJR for 2009.

Have fun with this list!

References

Giglia, E. (2010). The Impact Factor of Open Access journals: data and trends. ELPUB 2010 International Conference on Electronic Publishing, Helsinki (Finland), 16-18 June 2010. http://dhanken.shh.fi/dspace/bitstream/10227/599/72/2giglia.pdf and http://hdl.handle.net/10760/14666.

McVeigh, M.E. (2004). Open Access Journals in the ISI Citation Databases: Analysis of Impact Factors and Citation Patterns. A citation study from Thomson Scientific. Thomson Scientific. http://science.thomsonreuters.com/m/pdfs/openaccesscitations2.pdf

Vanouplines, P. & R. Beullens (2008). De impact van open access tijdschriften [The impact of open access journals]. IK Intellectueel Kapitaal 7(5): 14-17. (In Dutch, not openly available)

Possibly related posts

2008 Journal Citation Reports figures released

Last Friday Thomson Reuters released the 2008 edition of the Journal Citation Reports. This year it was announced by Thomson itself in a news release, which is a good move from them. The number of journals reported in the Science edition of the JCR increased from 6417 to 6598 (181 more journals), and in the Social Sciences edition the number of journals covered increased from 1865 to 1980 (an increase of 115 journals).

It is still not the increase I expected on the basis of the addition of some 750 new regional journals announced last year, a figure now even advertised as an expansion of 1228 journals, but it is still an expansion of roughly 300 journals. Reading Thomson's press releases on the 2008 JCR update I still notice some juggling with numbers that don't really add up, or don't make sense after some simple investigation comparing the 2007 and 2008 issues.

Now we have to go and figure out which journals were added and, more importantly, which journals were dropped (a rough sketch of how to compare the two lists follows below). That is always interesting to find out, though it will take time.

The really major improvement Thomson should make is to abolish the rather odd division between the two parts of the database. Currently I can't find any arguments to stick to the demarcation lines between the Science edition and the Social Sciences edition of the JCR. I really wonder how many customers they have that subscribe to only one part of the JCR; I think it is fair to assume that by far most customers subscribe to both parts. For teaching it is just a pain to have to explain to students that they should start their search by choosing a database part. That is far from intuitive.
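
Working out which journals came and went between two JCR editions is essentially a set difference on the two title lists. A rough sketch, with hypothetical file names (in practice ISSNs are a safer key than titles, which get renamed between years):

```python
# Compare two JCR editions: which journals were added, which were dropped.
# The file names are hypothetical, one journal title per line assumed.
def load_titles(path):
    with open(path, encoding="utf-8") as fh:
        return {line.strip() for line in fh if line.strip()}

jcr_2007 = load_titles("jcr2007_titles.txt")
jcr_2008 = load_titles("jcr2008_titles.txt")

print("Added  :", len(jcr_2008 - jcr_2007))
print("Dropped:", len(jcr_2007 - jcr_2008))
```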

Journal quality, an unexpected improvement of the JCR

It is odd to say, but for researchers the journal as an entity is disappearing. Scientists search for information in online databases and decide from the title and abstract whether an article suits their needs. The days when scientists visited the library and browsed the tables of contents of the most important journals to keep up with their field are long gone.

Still, there is a lot of emotion around journal titles. Scientists want to publish their research in the best possible journal. Earlier this year the NOWT (2008) published a report on the performance of Dutch universities, which clearly showed that the field normalized citation impact of each university correlates positively with the field normalized journal quality.
Figure: Journal quality versus citation impact

Looking at this graph it is clear that there is considerable reason to select the best journals in your field to publish your results. However, until recently the only widely available journal quality indicator was the journal impact factor. There has been a lot of criticism of the uses and abuses of impact factors, but they have stood the test of time, and all scientists are at least aware of them. For years ISI, now Thomson Reuters, was in fact the sole gatekeeper of journal quality rankings.

Over the last few years a number of products, free and fee based, have tried to come up with new and competing journal ranking measures: ScimagoJR (based on Scopus data), the Journal Analyzer from Scopus, Eigenfactor.org and, of course, the data from Thomson's own Essential Science Indicators.

This week Thomson Reuters announced that they will update the Journal Citation Reports. From the 1st of February we get an entirely new Journal Citation Reports. From the press release:

  • Five-Year Impact Factor – provides a broader range of citation activity for a more informative snapshot over time.
  • Journal “Self Citations” – An analysis of journal self citations and their contribution to the Journal Impact Factor calculation.
  • Graphic Displays of Impact Factor “Box Plots” – A graphic interpretation of how a journal ranks in different categories.
  • Rank-in-Category Tables for Journals Covering Multiple Disciplines – Allows a journal to be seen in the context of multiple categories at a glance rather than only a single one.
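
For readers unfamiliar with the five-year variant mentioned in the first bullet: it is the same ratio as the classic two-year impact factor, only over a five-year window, i.e. citations received in the JCR year to items published in the five preceding years, divided by the citable items published in those years. A minimal sketch with made-up numbers:

```python
# Minimal sketch of the five-year impact factor calculation.
# The counts below are invented purely for illustration.
def five_year_impact_factor(citations_per_year, citable_items_per_year):
    """Both arguments map publication year -> count for the five prior years."""
    return sum(citations_per_year.values()) / sum(citable_items_per_year.values())

citations = {2004: 120, 2005: 150, 2006: 180, 2007: 210, 2008: 240}  # cites received in 2009
items = {2004: 90, 2005: 95, 2006: 100, 2007: 110, 2008: 105}        # citable items published

print(round(five_year_impact_factor(citations, items), 2))  # -> 1.8
```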

It is highly unusual to see two updates per year for the JCR, but it is interesting to note how they are moving under the pressure of some competition.

Literature:
NOWT (2008). Wetenschaps- en Technologie-Indicatoren 2008. Maastricht, Nederlands Observatorium van Wetenschap en Technologie (NOWT). http://www.nowt.nl/docs/NOWT-WTI_2008.pdf (in Dutch)

Thomson Reuters issues a press release on the JCR 2007

It just popped up in my RSS feed on the second of July: the official press release from Thomson Reuters, dated July 1st, announcing the new edition of the Journal Citation Reports 2007. There is no further mention of the journals newly included or excluded. There is a link to the official promotional website of the JCR, which still states:

  • Covers more than 7,500 of the world’s most highly cited, peer-reviewed journals in approximately 200 disciplines
  • The Science Edition covers over 5,900 leading international science journals from the Thomson Reuters database
  • The Social Sciences Edition covers over 1,700 leading international social sciences journals from the Thomson Reuters database

It actually struck me today that the journals included in JCR are not listed at their Master Journal List.

Shall we call it progress that Thomson is confirming what I blogged about some two weeks ago?

On Impact Factors and article quality

I just found this quote:

Thus, while it is incorrect to say that the impact factor gives no information about individual papers in a journal, the information is surprisingly vague and can be dramatically misleading.

(Adler et al. 2008)

The report is a very critical discussion about the use and abuse of impact factors and the h-index.

Reference:
Adler, R., J. Ewing, et al. (2008). Citation Statistics: A report from the International Mathematical Union (IMU) in cooperation with the International Council for Industrial and Applied Mathematics (ICIAM) and the Institute of Mathematical Statistics (IMS), Joint Committee on Quantitative Assessment of Research. 26p. http://www.mathunion.org/fileadmin/IMU/Report/CitationStatistics.pdf

Hattip: Sidi