Some musings on the JCR

Last year some of our researchers asked me what had happened to the Impact Factor of the journal Water Science and Technology. In the 2005 edition it was still included in the JCR and showed an increasing trend in Impact Factor. Not the top of all journals, but a good player. After correspondence with ISI (Thomson Reuters Scientific) we found out that it was indeed excluded from the JCR because it lacked the desired quality. Later I understood from one of the editors that perhaps too many conference papers caused this problem. The editors changed their editorial policies and worked with ISI to upgrade the journal's standards. After these improvements the journal was set for inclusion again in 2007.

Indeed the journal has appeared again in the latest edition of the JCR, with a shining IF of 1.240, which is higher than ever. The IF for 2006 has been calculated and presented in the 2007 edition as well. A wee bit low, but it is important that there is a continuous set of data. What really amazes me, though, is that when you search in the 2006 edition of the JCR you still don't find this journal, while in the 2005 edition it is there. It strikes me as odd. They are still hanging on to the old idea of a paper edition.

Another pain point of mine with the JCR is the strange division between the Science edition and the Social Science edition of the JCR. Today I had to check the impact factors for a set of journals. Each time you have to guess whether a journal is included in the Science edition or the Social Science edition.

I can imagine there is a sales argument for selling either smaller set to smaller institutions. But when you subscribe to the complete set, I can't see any reason whatsoever why we have to live with this barrier in the database. It seems a relic from times long gone.

JCR 2007 releases new impact factors

June is always the time to look out for the newest update of the Journal Citation Reports. Yesterday I checked and they weren’t there yet. Today the JCR was updated and included the 2007 figures.

You could leave it at that. We subscribe to this database, and it has been updated. That's all.

For the Journal Citation Reports, which is updated only once a year, that simple message will not suffice in my opinion. But Thomson Reuters Scientific doesn't appear to share my view. JCR is an important database. On the release of the latest figures, armies of researchers want to consult the database to see whether the journals on whose editorial boards they serve have increased their Impact Factors. Or they use it to judge where to submit their next set of articles.

When the Essential Science Indicators are updated, once every two months, the event is accompanied by a slew of information from Thomson. When the even more popular JCR database is updated, we don't receive any information whatsoever.

We have to find out ourselves that the coverage of journals has been expanded, growing from 6166 in the 2006 JCR Science edition to 6417 in the 2007 edition. For the JCR Social Science edition the number of journals covered increased by 97, to a total of 1865. Which journals? We are left to guess for ourselves. The Spanish journals, at least, have been worked out by others.

Thomson Reuters Scientific knows, but they haven't told us (yet). Some journals have been dropped from the list; we are left to find out which ones ourselves. The increase in journals this year is only a prelude to the increase we might expect next year, since some 700 new regional journals have been included in Web of Science.

I might be mistaken, but at first sight there must be some interesting, newsworthy facts in the yearly update of JCR. Worthy of informing at least your subscribing librarians, who can in their turn inform their users. We want to inform our users of these events. We are more than willing to promote your products. Thomson, you could facilitate this work a whole lot, but then you should inform us a whole lot better than this.

Impact factors and SCImago JR compared

In December I promised to look in more detail at the newly launched SCImago Journal & Country Rank database. SCImago has attracted some attention in the blogosphere outside Spain since December and got some serious attention from Declan Butler as a news item in Nature (subscription required).

It is too early for thorough in-depth investigations of this new database, but the better blog reactions were at Information Research (and a second time again) and the Biomed Central Blog. Both had an element of self-interest in seeing where their journals were standing in this new database. We have to wait a bit longer for the reviews in the scholarly literature, I'm afraid.

Meanwhile I have looked into this database a bit more closely. In this blogpost I report some of my findings. My reason to look into this database more closely is mainly the fact that it allows us –librarians– to evaluate the rankings of a larger set of journals in a quantitative way. Impact factors have played a role in decisions on journal subscriptions and cancellations –albeit not as the sole criterion. How the SJR compares to the impact factor is my main question.

SJR is “an indicator that expresses the number of connections that a journal receives through the citation of its documents divided between the total of documents published in the year selected by the publication, weighted according to the amount of incoming and outgoing connections of the sources.” In essence the SJR is a PageRank-type indicator, in which citations from highly ranked journals increase the ranking of the journal.
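The exact computation is not spelled out in that definition, but the general idea of a PageRank-type indicator can be sketched in a few lines. This is a minimal illustration of the principle only, not SCImago's actual algorithm; the three journals and their citation counts are invented:

```python
# Minimal PageRank-style journal ranking: a citation coming from a
# highly ranked journal contributes more weight than one coming from
# a lowly ranked journal. Illustrative only -- not SCImago's algorithm.

# cites[i][j] = number of citations journal i gives to journal j (invented)
cites = [
    [0, 30, 10],
    [20, 0, 5],
    [5, 15, 0],
]
n = len(cites)
damping = 0.85
rank = [1.0 / n] * n  # start with equal rank for every journal

for _ in range(50):  # iterate until the ranks stabilise
    new = []
    for j in range(n):
        s = 0.0
        for i in range(n):
            out = sum(cites[i])  # total outgoing citations of journal i
            if out:
                # journal i passes its rank on, proportional to where it cites
                s += rank[i] * cites[i][j] / out
        new.append((1 - damping) / n + damping * s)
    rank = new

print([round(r, 3) for r in rank])
```

Journal 1 ends up on top here because it receives the bulk of the (weighted) citations; the raw citation counts alone would not tell you that, which is exactly the difference with a plain citations-per-article measure such as the IF.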

To gain more understanding of SJR, I have looked at the journals in the subject category 'Library and Information Science'. This category includes some 98 journals. It is important to note that SCImago JR has a much more refined subject categorization than Scopus itself. I speculate that this subject categorization is possibly present somewhere under the hood in Scopus as well. The corresponding category in JCR is 'Information Science & Library Science', which contains 53 journals.

It is really easy to transfer the data from SCImago JR to Excel, whereas in JCR it always takes a few more clicks (making a marked list and using the print export) to get the data into Excel. It is interesting to note that in the web environment SCImago uses a European number notation, with commas indicating the fraction and dots indicating the thousands. On transfer to Excel this is corrected automatically. A minor shortcoming of SCImago is that ISSN numbers are lacking from the exported data; in JCR the full journal titles are not exported.
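For anyone repeating this exercise without Excel's automatic correction, the notation swap is easy to do by hand. A small sketch (the sample values are invented):

```python
def european_to_float(s):
    """Convert European notation ('1.234,5': dot = thousands separator,
    comma = decimal separator) to a float."""
    return float(s.replace(".", "").replace(",", "."))

# invented examples in the notation SCImago shows on its web pages
print(european_to_float("0,038"))    # -> 0.038
print(european_to_float("1.234,5"))  # -> 1234.5
```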

The journals from JCR were matched manually against the journals from SCImago, since a shared field was missing. Only a few journals from JCR were not found directly in the downloaded journals from SCImago. The journals 'Journal of the American Medical Informatics Association', 'Information and Management' and 'Journal of Scholarly Publishing' were included in journal categories other than 'Library and Information Science'. Furthermore, it was noted that the journal 'International Journal of Geographical Information Science' was included twice in the list of Library and Information Science journals, at rank 5 and again at rank 33. In the processing the journal at rank 33 was dropped from the list. In the JCR the Journal of Government Information is still included, although it was merged into Government Information Quarterly as early as 2005 –the calculation of its IF in JCR 2006 is indeed based on only a single year of data. Two other journals included in JCR and in Scopus, Online and Econtent, were not to be found in SCImago. This is not really a great loss, since these are trade journals rather than peer-reviewed scholarly journals, but this applies to some other journals included in the table as well, e.g. The Scientist and Library Journal. In the end 50 journals from SCImago and JCR in the LIS field could be matched. The full list of journals included in this little study is linked as a Google Document.
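Lacking a shared key such as the ISSN, matching boils down to normalising the titles and comparing them; the manual step is only needed for the leftovers. A minimal sketch of that approach, with a handful of invented list entries:

```python
def normalize(title):
    # lowercase, drop punctuation, collapse whitespace
    cleaned = "".join(c for c in title.lower() if c.isalnum() or c.isspace())
    return " ".join(cleaned.split())

# tiny invented samples standing in for the two exported lists
jcr = ["Journal of Documentation", "Scientometrics", "Online"]
scimago = ["Scientometrics", "Journal of documentation", "Library Trends"]

sc_index = {normalize(t): t for t in scimago}
matched = {t: sc_index[normalize(t)] for t in jcr if normalize(t) in sc_index}
unmatched = [t for t in jcr if normalize(t) not in sc_index]

print(matched)    # titles found in both lists despite case differences
print(unmatched)  # left over for manual inspection
```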

Looking at the table it is apparent that the maximum value of the SJR is an order of magnitude smaller than that of the Impact Factors. At the lower end of the scale Impact Factors become zero, whereas the lowest value of the SJR in this set of journals is 0.038.
In Figure 1, I have plotted the IF against the SJR. There seems to be a strong relationship between SJR and IF, albeit with some outliers from an apparent linear relationship. Interestingly, these three outliers are LIS journals on medical librarianship: Journal of the American Medical Informatics Association (JAMIA), Journal of Health Communication and Journal of the Medical Library Association. MIS Quarterly is not regarded as an outlier, since it clearly lies on the trendline underlying the other datapoints.

Figure 1

I think the three outliers really illustrate the point that SJR is more a PageRank-type of indicator. The three medically oriented journals receive relatively many citations from highly ranked medical journals. Checking this for JAMIA in Scopus, we find citations from journals such as Pediatrics (SJR=0.528), Annals of Internal Medicine (SJR=1.127) or BMC Bioinformatics (SJR=0.957). The journals adhering to the trendline for LIS journals receive far fewer of these kinds of “external” citations.

Excluding the three medical journals, we get a very good regression between the two parameters, with an R² of 0.86. In Figure 2 the regression line is added, based on the remaining 47 journals.
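The regression itself is a plain least-squares fit, which is easy to reproduce once the two columns are exported. A sketch of the calculation; the five data points below are invented, not the actual 47 journals:

```python
def linear_regression(xs, ys):
    """Least-squares fit y = a*x + b; returns (a, b, r_squared)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sxy / sxx                      # slope
    b = my - a * mx                    # intercept
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1 - ss_res / ss_tot   # R² = explained fraction of variance

# invented SJR (x) and IF (y) pairs, roughly on the scales discussed above
sjr = [0.04, 0.06, 0.10, 0.15, 0.20]
impact = [0.3, 0.5, 0.9, 1.4, 1.8]
a, b, r2 = linear_regression(sjr, impact)
print(round(a, 2), round(b, 2), round(r2, 3))
```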

Figure 2

I think this is a really cool result, illustrating the difference between SJR and IF quite clearly. In a subsequent post I will look a bit more into the correlations between the various parameters.

Reprise: Impact factors calculated with Scopus compared to JCR

Just yesterday I reported on the first preprint article that compared Impact Factors calculated with JCR and Scopus; later that day a second article was published on E-LIS covering the same subject. Gorraiz and Schloegl (2007) took the analysis a real step further than Pislyakov (2007). Not only did they include a larger set of journals in their sample (100 compared to 20), they also looked at another bibliometric indicator, the immediacy index.

Interesting is the authors' determination to look for journals in the chosen subject area, pharmacology, that were not included in the JCR but should have been there on the basis of their citations. In the journal selection process of Thomson some other factors are taken into account, but in practice we expect all top journals in a certain category to be included in the JCR/WoS database. So it is interesting to learn that there are a number of journals that should have been included on the basis of the citation data in Thomson's own databases.

At the beginning of the article the authors state:

Since there are more journals included in Scopus than in WoS, a journal in Scopus has a higher chance to get cited in general. Therefore the values for the impact factor and the immediacy index should also be higher in Scopus.

This might sound plausible, but in actual fact the effect of a larger journal base is much smaller. Because Web of Science already covers virtually all top journals in the subject category, it also covers the journals where most citations take place. Outside the top journals relatively little citation traffic takes place. This has been demonstrated by Ioannidis (2006) and is also indicated in the journal selection policy of Thomson, where they refer to some of their own research:

More recently, an analysis of 7,528 journals covered in the 2005 JCR® revealed that as few as 300 journals account for more than 50% of what is cited and more than 25% of what is published in them. A core of 3,000 of these journals accounts for about 75% of published articles and over 90% of cited articles.

What is really disturbing in both the articles of Gorraiz and Schloegl (2007) and Pislyakov (2007) is that neither database is one hundred percent reliable when it comes to the number of articles published in a given year. For Scopus we can expect some minor discrepancies, since we are dealing with a young database that still shows some fluctuations in content; Elsevier still has some work to do. For WoS it is sometimes just sloppiness in indexing, and that is unforgivable.

Gorraiz, J. & C. Schloegl (2007). A bibliometric analysis of pharmacology and pharmacy journals: Scopus versus Web of Science. Journal of Information Science 00(00): 00-00.
Ioannidis, J. P. A. (2006). Concentration of the Most-Cited Papers in the Scientific Literature: Analysis of Journal Ecosystems. PLoS ONE 1(1): e5.
Pislyakov, V. (2007). Comparing two “thermometers”: Impact factors of 20 leading economic journals according to Journal Citation Reports and Scopus. E-Lis.

Impact factors calculated with Scopus compared to JCR

You only had to wait for it. With the rich resource of citation data available in Scopus, somebody was going to use it to calculate Impact Factors. Quantitative journal evaluation was once the sole domain of Thomson Scientific (formerly ISI), but nowadays they face more and more competition. Elsevier, with Scopus, has so far hesitated to step into the arena of journal evaluation, but Vladimir Pislyakov (2007) has made a start for the 20 top journals in economics.
He compared the Impact Factor from the JCR with the Impact Factor he constructed for the same journals with citation data from Scopus. In his methodology he made a small mistake by not excluding the non-citable items, which is quite easy to do in Scopus, but this does not invalidate his results. As was to be expected, confirming our experience of higher citation counts in Scopus compared to Web of Science, overall more citations per article were found in Scopus. This resulted in slightly higher IFs as calculated with Scopus. What is more interesting is that the ranking of the journals based on Scopus data differed from the ranking based on the JCR impact factors. Overall they correlated well, but looking into the details, one journal dropped from rank 5 to 13, another from 11 to 18. So there is merit in investigating this on a larger scale than those 20 journals in economics.
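The two-year Impact Factor itself is simple arithmetic: citations received in year Y to items published in years Y-1 and Y-2, divided by the number of citable items (articles and reviews, excluding editorials and the like) published in those two years. A sketch with invented numbers:

```python
def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Two-year IF: citations in year Y to items published in Y-1 and Y-2,
    divided by the citable items published in Y-1 and Y-2."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# invented example: 310 citations in 2007 to the 250 citable items
# a journal published in 2005-2006
print(round(impact_factor(310, 250), 3))  # -> 1.24
```

Pislyakov's methodological slip was in the denominator: counting all items instead of only the citable ones inflates the denominator and deflates the resulting IF.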
In the end the author makes a big mistake when he states:

“Since impact factor is considered to be one of the crucial citation indicators which is widely used in research assessment and science administration, it is important to examine it critically from various points of view and investigate the environment in which it is calculated.”

Those are practices we should stay away from. The IF as such is only of interest to scientists when they select a journal for publication. The IF should not be used for research evaluation or grant applications.

Pislyakov, V. (2007). Comparing two “thermometers”: Impact factors of 20 leading economic journals according to Journal Citation Reports and Scopus. E-Lis.