SJR : Scimago Journal & Country Rank

Sometimes you find these real gems. Wow, fantastic.

This evening I had this exciting feeling when I saw SJR for the first time. Tipped off by Recherchen Blog, I stumbled upon Scimago, a database that provides a plethora of bibliometric indicators for journals and for research performance at the country level. They have developed their own PageRank-like indicator (after Google's algorithm) for journal ranking, called the SJR indicator. But they provide much more data than this indicator alone: articles, citations and citations per article are given as well.
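The PageRank analogy can be made concrete with a small sketch. What follows is a generic power iteration over a made-up three-journal citation matrix — not Scimago's actual SJR formula, which adds its own Scopus-specific weighting and normalization on top of this idea. The citation counts and the damping factor are illustrative assumptions.

```python
# Toy citation network: cites[i][j] = citations from journal i to journal j.
# Generic PageRank-style power iteration, NOT the actual SJR formula.
cites = [
    [0, 3, 1],
    [2, 0, 4],
    [1, 2, 0],
]

n = len(cites)
d = 0.85                      # damping factor, as in classic PageRank
rank = [1.0 / n] * n          # start from a uniform prestige vector

for _ in range(100):          # power iteration until (near) convergence
    new_rank = []
    for j in range(n):
        # Each journal i passes its prestige along its outgoing citations,
        # in proportion to how often it cites journal j.
        inflow = sum(rank[i] * cites[i][j] / sum(cites[i]) for i in range(n))
        new_rank.append((1 - d) / n + d * inflow)
    rank = new_rank

print([round(r, 3) for r in rank])   # prestige scores, summing to ~1
```

The intuition is the same as for web pages: a citation from a prestigious journal is worth more than one from an obscure journal, which is exactly what distinguishes this family of indicators from a plain citation count.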

This database is based on data provided by Scopus, which covers a much larger set of journals than the Journal Citation Reports or the Essential Science Indicators from Thomson Scientific. It is very interesting to observe that SJR is freely available on the Web. This is a new development in the competition taking place between the two publishing giants, Elsevier and Thomson.

The information contained in SJR is so overwhelming that it will take some time before I fully comprehend the possibilities of this database, understand the new indicators and make comparisons with the old established databases. The system provides really nice graphics for journal data as well. The makers of SJR are serious about their research: they recently published a study in Scientometrics presenting some of their analyses based on this database (on my pile of stuff to read).

I noted some mention of SJR at Sidi and DigitalKoans as well. In the Spanish blogosphere the rumour has been spreading for some time already.

This database will certainly be covered in more detail at a later date.

Moya-Anegón, F. d., Z. Chinchilla-Rodríguez, B. Vargas-Quesada, E. Corera-Álvarez, F. J. Muñoz-Fernández, A. González-Molina & V. Herrero-Solana (2007). Coverage analysis of Scopus: A journal metric approach. Scientometrics 73(1): 53-78.

Reprise: Impact factors calculated with Scopus compared to JCR

Just yesterday I reported on the first preprint that compared impact factors calculated with JCR and Scopus; later that day a second article covering the same subject was published on E-LIS. Gorraiz and Schloegl (2007) took the analysis a real step further than Pislyakov (2007). Not only did they include a larger set of journals in their sample (100 compared to 20), they also looked at the other bibliometric indicator, the immediacy index.

It is interesting to see the authors' determination to look for journals in the chosen subject area, pharmacology, that were not included in the JCR but should have been on the basis of their citations. In Thomson's journal selection process some other factors are taken into account, but in practice we expect all top journals in a certain category to be included in the JCR/WoS database. So it is interesting to learn that there are a number of journals that, on the basis of citation data, should have been included in Thomson's databases.

At the beginning of the article the authors state:

Since there are more journals included in Scopus than in WoS, a journal in Scopus has a higher chance to get cited in general. Therefore the values for the impact factor and the immediacy index should also be higher in Scopus.

This might sound plausible, but in actual fact the effect of a larger journal base is much smaller. Because Web of Science already covers virtually all top journals in a subject category, it also covers the journals where most citations take place; outside the top journals relatively little citation traffic occurs. This has been demonstrated by Ioannidis (2006) and is also indicated in the journal selection policy of Thomson, where they refer to some of their own research:

More recently, an analysis of 7,528 journals covered in the 2005 JCR® revealed that as few as 300 journals account for more than 50% of what is cited and more than 25% of what is published in them. A core of 3,000 of these journals accounts for about 75% of published articles and over 90% of cited articles.
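The skewness behind such figures is easy to reproduce with toy numbers. The counts below are invented — a simple Zipf-like distribution over 100 hypothetical journals — purely to illustrate how a small core can dominate the citation traffic:

```python
# Invented, Zipf-like citation counts for 100 hypothetical journals:
# the journal at rank r receives roughly 1000/r citations.
counts = [1000 // r for r in range(1, 101)]

total = sum(counts)
top10 = sum(counts[:10])          # the 10 most-cited journals
print(f"top 10% of journals hold {top10 / total:.0%} of all citations")
```

Even with this crude distribution, a tenth of the journals capture well over half of all citations, which is why adding many peripheral journals to a database moves the impact figures of the core journals far less than one might expect.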

What really is disturbing in both the articles of Gorraiz and Schloegl (2007) and Pislyakov (2007) is that neither database is one hundred percent reliable when it comes to the number of articles published in a given year. For Scopus we can expect some minor discrepancies, since we are dealing with a young database that still shows some fluctuations in content; Elsevier still has some work to do. For WoS it is sometimes just sloppiness in indexing, and that is unforgivable.

Gorraiz, J. & C. Schloegl (2007). A bibliometric analysis of pharmacology and pharmacy journals: Scopus versus Web of Science. Journal of Information Science 00(00): 00-00.
Ioannidis, J. P. A. (2006). Concentration of the Most-Cited Papers in the Scientific Literature: Analysis of Journal Ecosystems. PLoS ONE 1(1): e5.
Pislyakov, V. (2007). Comparing two “thermometers”: Impact factors of 20 leading economic journals according to Journal Citation Reports and Scopus. E-Lis.

Impact factors calculated with Scopus compared to JCR

You only had to wait for it. With the rich resource of citation data available in Scopus, somebody was going to use it to calculate impact factors. Quantitative journal evaluation was once the sole domain of Thomson Scientific (formerly ISI), but nowadays they face more and more competition. Elsevier, with Scopus, has so far hesitated to step into the arena of journal evaluation, but Vladimir Pislyakov (2007) has made a start for the 20 top journals in economics.
He compared the impact factor from the JCR with the impact factor he constructed for the same journals using citation data from Scopus. In his methodology he made a small mistake by not excluding the non-citable items, which is quite easy to do in Scopus, but this does not invalidate his results. As was to be expected, and confirming our experience of higher citation counts in Scopus compared to Web of Science, overall more citations per article were found in Scopus. This resulted in slightly higher impact factors as calculated with Scopus. More interesting is that the rankings of the journals based on Scopus data differed from the rankings based on the JCR impact factors. Overall they correlated well, but looking into the details, one journal dropped from rank 5 to 13 and another from 11 to 18. So there is merit in investigating this on a larger scale than these 20 journals in economics.
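The mechanics of such a comparison are simple. The sketch below uses the standard two-year impact factor definition with invented citation counts for three hypothetical journals, to show how the same journals can end up ranked differently depending on which database supplies the counts:

```python
def impact_factor(cites, citable_items):
    # Standard two-year IF for year Y: citations received in Y to items
    # published in Y-1 and Y-2, divided by the number of citable items
    # published in Y-1 and Y-2.
    return cites / citable_items

# Invented (citations, citable items) per journal: (JCR-style, Scopus-style).
# The Scopus-style counts are slightly higher, as both studies observed.
journals = {
    "Journal A": ((300, 100), (330, 100)),
    "Journal B": ((250, 100), (310, 100)),
    "Journal C": ((280, 100), (285, 100)),
}

rank_jcr = sorted(journals, key=lambda j: -impact_factor(*journals[j][0]))
rank_scopus = sorted(journals, key=lambda j: -impact_factor(*journals[j][1]))
print("JCR ranking:   ", rank_jcr)
print("Scopus ranking:", rank_scopus)
```

With these made-up numbers, Journal B overtakes Journal C once the extra Scopus citations are counted, even though every individual impact factor only rises modestly — the same pattern of stable correlation with occasional rank shifts that Pislyakov found.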
In the end the author makes a big mistake; he states:

“Since impact factor is considered to be one of the crucial citation indicators which is widely used in research assessment and science administration, it is important to examine it critically from various points of view and investigate the environment in which it is calculated.”

Those are practices we should stay away from. The IF as such is only of interest to scientists when they select a journal for publication; it should not be used for the evaluation of research or grant applications.

Pislyakov, V. (2007). Comparing two “thermometers”: Impact factors of 20 leading economic journals according to Journal Citation Reports and Scopus. E-Lis.

THES university rankings 2007

Next Friday the Times Higher Education Supplement will publish its famous rankings of world universities. This year they have changed the methodology quite a bit, perhaps to counter some of the criticism of these rankings as formulated on Wikipedia. They have made changes to the peer review, which accounts for 40% of the overall ranking, and removed the possibility of peers selecting their own universities. They have changed the database from which they retrieve the citation data, selecting Scopus from Elsevier over citation data from ISI (the Essential Science Indicators from Thomson Scientific, that is). They have reduced the citation window from ten to five years. They have attempted to distinguish between full-time equivalents and the number of faculty, and finally they have normalized the rankings.

There are two items I would like to pick out. They have selected Scopus over ESI: quite a change. This will be less disadvantageous for countries with a strong publication culture in their own language; think of France, Germany and all the Spanish-language countries, or perhaps China, Japan and Korea. The other aspect is the citation window. I favour a five-year period over a ten-year period, but they only look at "the most recent complete 5 year window", i.e. 2002-2006, whereas I would prefer the period 2001-2005 or even better 2000-2004, so that all publications would have received their fair share of citations.

Meanwhile we wait for Friday to see how all these changes will affect these popular rankings.


Sowter, B. (2007) THES – QS World University Rankings 2007. QS TopUniversities.