Scimago rankings 2011 released

Today Félix de Moya Anegón announced on twitter that the Scimago Institutional Rankings (SIR) for 2011 were released. These rankings are not very well known or widely used. Yesterday, during a ranking masterclass from the Dutch Association for Institutional Research, the SIR was not even mentioned. Undeservedly so. Scimago lists just over 3000 institutions worldwide, which makes it one of the most comprehensive institutional rankings, if not the most comprehensive. It is also a very clear ranking: it measures only publication output and impact. It thus ranks institutions on research performance alone and is therefore very similar to the Leiden Ranking.

What I like about Scimago is the innovative indicators they come up with each year. Last year they introduced the %Q1 indicator: the ratio of publications that an institution publishes in the most influential scholarly journals of the world, i.e. journals ranked in the first quartile (25%) of their categories as ordered by the SCImago Journal Rank (SJR) indicator. This year they introduced the Excellence Rate, which indicates what percentage of an institution's scientific output is included in the set formed by the 10% most cited papers in their respective scientific fields. It is a measure of the high quality output of research institutions. The two indicators are very similar; the Excellence Rate is just a tougher version of %Q1.
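To make the two definitions concrete, here is a minimal sketch (my own illustration, not Scimago's actual code or data) of how %Q1 and the Excellence Rate could be computed from a list of an institution's publications:

```python
# Toy illustration of %Q1 and the Excellence Rate; the records and field names
# below are assumptions for the sake of the example, not Scimago data.

publications = [
    # each record: the SJR quartile of the journal and the paper's citation
    # percentile within its field (0 = most cited, 100 = least cited)
    {"journal_quartile": 1, "field_citation_percentile": 4},
    {"journal_quartile": 1, "field_citation_percentile": 35},
    {"journal_quartile": 2, "field_citation_percentile": 8},
    {"journal_quartile": 3, "field_citation_percentile": 60},
]

n = len(publications)

# %Q1: share of output published in first-quartile (SJR) journals
q1_rate = 100 * sum(p["journal_quartile"] == 1 for p in publications) / n

# Excellence Rate: share of output belonging to the 10% most cited papers
# of the respective fields
excellence_rate = 100 * sum(p["field_citation_percentile"] < 10 for p in publications) / n

print(f"%Q1 = {q1_rate:.0f}%, Excellence Rate = {excellence_rate:.0f}%")
```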

The other new indicator is the Specialization Index, which indicates the extent of thematic concentration or dispersion of an institution's scientific output. Values range from 0 to 1, indicating generalist versus specialized institutions respectively.

Their most important indicator of research performance is the Normalized Impact (NI), which is similar to the MNCS of CWTS and the RI we calculate in Wageningen. The values show the relationship between an institution's average scientific impact and the world average, which is set to 1; i.e. a score of 0.8 means the institution is cited 20% below the world average and 1.3 means it is cited 30% above the world average.
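As an illustration of the idea behind such field-normalized indicators (the exact Scimago methodology differs in its details, and the numbers below are made up), a minimal sketch:

```python
# Sketch of a field-normalized impact indicator in the spirit of NI / MNCS:
# each paper's citation count is divided by the world average for its field
# (and ideally publication year and document type), and the ratios are
# averaged. The toy numbers are assumptions for illustration only.

papers = [
    # (citations received, world average citations for the paper's field/year)
    (12, 10.0),
    (3, 6.0),
    (25, 18.0),
]

normalized_impact = sum(cites / world_avg for cites, world_avg in papers) / len(papers)
print(f"NI = {normalized_impact:.2f}")  # 1.03 -> cited about 3% above world average
```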

Last year the Scimago team already showed that there exists an exponential relationship between the ability of an institution to lead its scientific papers to better journals (%Q1) and the average impact achieved by its production in terms of Normalized Impact. It is a relationship I always show in classes on publication strategy (slides 15 and 16). When looking at the Dutch universities, I noted that the correlation between the new Excellence Rate and the Normalized Impact is even better than that with %Q1. So the pressure to publish in the absolute top journals of each research field will increase even further once this becomes general knowledge.
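For those who want to check such a relationship themselves, here is a rough sketch of how the correlation and an exponential fit could be computed; the university values below are invented for illustration, not the actual Scimago figures:

```python
import numpy as np

# Hypothetical indicator values for a handful of universities
q1 = np.array([45.0, 52.0, 58.0, 61.0, 67.0])   # %Q1 per university
ni = np.array([0.9, 1.1, 1.3, 1.4, 1.7])        # Normalized Impact per university

# Pearson correlation between the two indicators
r = np.corrcoef(q1, ni)[0, 1]

# Exponential relationship NI = a * exp(b * %Q1), fitted as a straight line on log(NI)
b, log_a = np.polyfit(q1, np.log(ni), 1)

print(f"r = {r:.2f}, NI ~ {np.exp(log_a):.2f} * exp({b:.3f} * %Q1)")
```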

What do we learn about the Dutch universities from the Scimago rankings? Rotterdam still maintains its top position for Normalized Impact, and it also scores best on %Q1 and the Excellence Rate. Directly after Rotterdam come Leiden, UvA, VU, Utrecht and Radboud with equal impact. Utrecht published the most articles during the period 2005-2009. Wageningen excels at international cooperation, and Tilburg and Wageningen are the most specialized universities in the Netherlands.

Making these international rankings is quite a daunting task. For the Netherlands I noticed that the output of Nijmegen was distributed over Radboud University and the Radboud University Nijmegen Medical Centre, which was not done for the other university hospitals. And for Wageningen the output was split between Wageningen University and Research Centre and Plant Research International (which is part of Wageningen UR). But for researchers from Spain these are difficult nuances to resolve 100% perfectly.

My only real complaint about the ranking is that, although they state it is not a league table, they present the institutions ranked on publication output. It would be much more obvious to present the list ranked on NI. Since they only publish the ranking as a PDF file, it took me a couple of hours to convert it into an Excel spreadsheet so I could rank the data any way I wish. With all the information at hand it is also possible to design your own indicators, such as a power rank in analogy to the Leiden rankings.

The message to my researchers: aim for the best journals in your field. We still have scope for improvement. We are still not in the neighbourhood of the 30 to 40% Excellence Rate we see for Rockefeller, Harvard and the like.

The Impact Factor of Open Access journals

In the world of Open Access publishing the golden road has received a great deal of attention. At least this is what our researchers seem to remember. Of course there are other roads to open access, but here I want to present the impact factors of the journals facilitating the golden road to open access. This blogpost lists all open access journals included in DOAJ that were assigned a Journal Impact Factor in the JCR 2009. The reason for this is that our researchers see publishing in open access journals as the simplest way of achieving open access to their work, while on the other hand they are required, for the assessment of their citation impact, to publish in journals covered by the Web of Science and therefore by the Journal Citation Reports (JCR).

In the past there have been studies on the citation impact of the open access journals that actually received a Journal Impact Factor from Thomson Reuters Scientific (formerly ISI). The first was by McVeigh (2004), followed by Vanouplines and Beullens (2008) (in Dutch, and not openly accessible) and recently by Giglia (2010). These consecutive studies showed an increasing number of open access journals receiving a Journal Impact Factor from Thomson Reuters: McVeigh reported 239 OA journals in the JCR 2004, Vanouplines reported 295 OA journals in the JCR 2005 and Giglia reported 385 OA journals in the JCR 2008 (there are some methodological issues that make these figures not entirely comparable).

The pitfall of these studies is that, although they presented interesting figures and additional analyses, none of them actually published the list of open access journals that received an impact factor. The sole purpose of this blogpost is to publish that list. The probable reason the previous authors did not do so is that the impact factors are proprietary information of Thomson Reuters: you are not allowed to publish these figures. On the other hand, most publishers use the impact factor in all their marketing for their journals, so the journal impact factor is virtually information in the public domain.

To avoid any intellectual property problems with Thomson Reuters I have included the SCImago Journal Rank (SJR) and Scopus SNIP indicators for the journals rather than the Journal Impact Factor. For this set of journals the correlation between SNIP and IF was 0.94 and between SJR and IF 0.96. In total 619 journals from DOAJ were present in the JCR 2009 report (the Science and the Social Science & Humanities editions deduplicated). The growth in journal coverage is due to the growth in the number of OA journals and the significant expansion of journal coverage in 2008. On the other hand, looking at the list of Scopus-indexed journals, I note that it includes some 1365 open access journals with an SJR or SNIP.

For the current table I matched the journal list from DOAJ, downloaded on December 13th 2010, with the deduplicated list of JCR 2009 indexed journals. This set of 619 journals was then matched against the journal list from journalmetrics.com to include the SJR 2009 and SNIP 2009 values as well. For each journal the subject categories assigned by DOAJ were included. The journals were sorted alphabetically on subject and by descending IF within each subject. Journals with multiple subject assignments in DOAJ were included in each of their categories, which expanded the list to 782 lines. Finally the column with impact factors was removed, showing only the SJR and SNIP for each journal. A few journals were not assigned an SJR or SNIP even though they did receive a Journal Impact Factor; in some cases this was due to differences in journal coverage between Scopus and Web of Science, but in a few cases it appears to be a problem of different ISSN assignments in the respective databases.
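For readers who want to reproduce a similar table, here is a rough sketch of the matching workflow in pandas; the file names and column names are hypothetical, as the actual matching was done on the downloaded DOAJ, JCR and journalmetrics.com lists:

```python
import pandas as pd

# Hypothetical input files: the DOAJ journal list, the deduplicated JCR 2009
# list, and the journalmetrics.com list with SJR 2009 and SNIP 2009 values.
doaj = pd.read_csv("doaj_journals_20101213.csv")      # ISSN, Title, Subjects
jcr = pd.read_csv("jcr2009_deduplicated.csv")         # ISSN, ImpactFactor
metrics = pd.read_csv("journalmetrics_2009.csv")      # ISSN, SJR2009, SNIP2009

# Keep only the DOAJ journals that received an Impact Factor in the JCR 2009
oa_with_if = doaj.merge(jcr, on="ISSN")

# Add SJR 2009 and SNIP 2009; journals missing from Scopus get empty values here
table = oa_with_if.merge(metrics, on="ISSN", how="left")

# One line per (journal, subject) pair: journals with several DOAJ subjects
# appear once in each of their categories
table["Subjects"] = table["Subjects"].str.split(";")
table = table.explode("Subjects")

# Sort alphabetically on subject, descending IF within each subject,
# then drop the proprietary Impact Factor column before publishing
table = table.sort_values(["Subjects", "ImpactFactor"], ascending=[True, False])
table = table.drop(columns=["ImpactFactor"])
table.to_excel("oa_journals_jcr2009_snip_sjr.xlsx", index=False)
```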

Download: List of open access journals that were assigned an Impact Factor in the JCR 2009, showing their respective SNIP and SJR values for 2009.

Have fun with this list

References

Giglia, E. (2010). The Impact Factor of Open Access journals: data and trends. ELPUB 2010 International Conference on Electronic Publishing, Helsinki (Finland), 16-18 June 2010. http://dhanken.shh.fi/dspace/bitstream/10227/599/72/2giglia.pdf and http://hdl.handle.net/10760/14666.

McVeigh, M.E. (2004). Open Access Journals in the ISI Citation Databases: Analysis of Impact Factors and Citation Patterns. A citation study from Thomson Scientific. Thomson Scientific. http://science.thomsonreuters.com/m/pdfs/openaccesscitations2.pdf

Vanouplines, P. & R. Beullens (2008). De impact van open access tijdschriften [The impact of open access journals]. IK Intellectueel Kapitaal 7(5): 14-17. (In Dutch, not openly available)
