How Google Scholar Citations passes the competition left and right

Last Thursday Google Scholar Citations went public. It was to be expected: since August the product had been tested by a few (blogging) scientists, and we only had to wait patiently for it to be released to all scientists. Last Thursday the moment was there.

Was it worth the wait? Yes, it certainly was. Google Scholar Citations really excels at finding publications you completely forgot about. Even then, there are still obscure publications that even Google Scholar doesn't know about. You simply log in and deselect the few publications that don't belong to you, and you can run additional searches to find publications that Google has overlooked. You get a comprehensive publication list quite quickly. Well, when your name is not too common, that is. How it works for very common names (Korean scientists spring to mind, as well as John Smith) I don't know yet. So far nothing new: Anne-Wil Harzing's excellent Publish or Perish software already did this. What is new is that Google Scholar Citations keeps the citations and publications automatically up to date and allows you to publish your own publication list on the Web, with the citations and some crude citation metrics.

The two major competitors in this arena are Thomson Reuters with their ResearcherID and Elsevier's Scopus with its Scopus Author ID. With both services you can identify your own publications and assign them to a unique identifier; in this way you can create your unique publication list with citation metrics as well. Their main disadvantage compared to Google Scholar is their rather limited resource set. Thomson Reuters' Web of Science "only" covers some 10,000 scholarly journals, a set of selected proceedings and, since recently, some 30,000 books. Scopus covers nearly double the number of journals but stays behind in proceedings and covers hardly any books. Google Scholar certainly covers more, but we still don't understand what is included and what is not, and we sometimes have our doubts about how current Google Scholar is. Still, the larger resource base of Google Scholar, including books and book chapters, will make this service more attractive for social scientists and scholars in the arts and humanities.

On top of the smaller publication base on which these services are built, each of these two competitors has its own particular disadvantage as well. You have to maintain your publication list in Thomson Reuters' ResearcherID manually: each time you publish a new article, you have to add it to your profile yourself. Looking around, I see that most researchers are a bit sloppy in this respect. You can, however, make your publication list and its citation impact publicly available; see for example my meagre list. Scopus, on the other hand, maintains your publication list automatically (albeit with some serious mistakes in this area in the past, but they seem to have improved this service). But, and this is a big but, you can't publish your properly curated publication list with citations publicly on the Web. They used to have 2Collab for this, but since they discontinued 2Collab they haven't come up with an alternative mechanism to publish your publication list with citation impact on a public website. A real pity.

So Google Scholar easily beats ResearcherID, since it updates automatically, and Scopus ID, because you can make your list with citations publicly available. Making your publication list openly available is something I really recommend to all scientists; it helps your personal branding.

Certainly there are disadvantages to Google Scholar as well. The most serious at this moment are all kinds of ghost citations. If you look at the citations to our paper on bibliometric analysis on top of repositories, Google counts three citations. But checking the Leydesdorff citation, a reference to our article is nowhere to be found (of course it should have been there, but it isn't). 0xDE reported a spam account in the name of Peter Taylor, which collected various Taylors in a single profile boasting an h-index of 94. That Google Scholar can be fooled was already reported by Beel & Gipp (2010).
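
As an aside, for readers unfamiliar with the metric behind that number: the h-index is the largest h such that a profile has h publications with at least h citations each. A minimal sketch in Python of that textbook (Hirsch) definition, nothing more than an illustration (Google's own code is of course not public), shows why hoovering up namesakes into one profile inflates the number:

```python
def h_index(citations):
    """Largest h such that at least h papers have at least h citations each
    (the standard Hirsch definition)."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

one_taylor = [100] * 40            # one author, 40 well-cited papers
print(h_index(one_taylor))         # -> 40
merged = one_taylor + [100] * 40   # a second namesake's papers merged in
print(h_index(merged))             # -> 80: the spam profile doubles its h-index
```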

When I was interviewed for our university paper on Google Scholar Citations (in Dutch), I told them: Google Scholar is only about five years old. Give them another five years and they will have changed the market for abstracting and indexing databases totally. If only 20 percent of all scientists keep their publication lists correct (which includes editing references to repair the mistakes Google has made), even without making them publicly available, Google sits on a treasure trove of high quality metadata. Really interesting to see how this story will develop.

Reference:
Joeran Beel and Bela Gipp. Academic search engine spam and Google Scholar's resilience against it. Journal of Electronic Publishing, 13(3), December 2010.

Which master journal list do you prefer?

A very useful resource which I need to consult, say, twice a year is the Master Journal List of Thomson Reuters Scientific. This morning it was actually a colleague who needed this resource. He wanted to know the journals covered by Web of Science, so he needed a subset of the Master Journal List. I knew that existed, but where?

Using Google we ended up on this version of the Master Journal List. Not the one I really wanted, since it did not have the database-specific lists. Only a couple of hours later, by approaching the site from a different angle and navigating around a wee bit differently, I found the other version of the Master Journal List, the version we were actually looking for.

Looking carefully, I finally see that the first one is a more extensive journal search form of the Master Journal List. But that is something you can only find out after you have found the second website. You can navigate from the one to the other, but not the other way around. A little bit strange, not to say confusing.

In a similar vein: Thomson has a brand new product, InCites, whereas the old, totally different In-Cites website from the same company still exists.

2008 Journal Citation Reports figures released

Last Friday Thomson Reuters released the 2008 edition of the Journal Citation Reports. This year it was announced by Thomson itself in a news release; that's a good move on their part. The number of journals reported in the two editions of the JCR has increased: the Science edition grew from 6417 to 6598 journals (181 more, that is) and the Social Sciences edition from 1865 to 1980 (an increase of 115 journals).

It is still not the increase I expected on the basis of the addition of some 750 new regional journals announced last year, a figure that is now even advertised as an expansion of 1228 journals, but it is still an expansion of roughly 300 journals. Reading Thomson's press releases on the 2008 JCR update, I still notice some juggling with numbers that don't really add up, or don't make sense after some simple investigation comparing the 2007 and 2008 editions.

Now we have to go and figure out which journals were added and, more importantly, which journals were dropped. That's always interesting to find out. It will take time though.

The really major improvement Thomson should make is to abolish the rather odd division between the two parts of the database. Currently I can't find any argument to stick to the demarcation line between the Science edition and the Social Sciences edition of the JCR. I really wonder how many customers they have that subscribe to only one part of the JCR; I think it is fair to assume that by far most customers subscribe to both parts. For teaching it is just a pain to have to explain to students that they should start their search by choosing a database part. That is far from intuitive.
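
Just to make the number juggling explicit, here is a quick back-of-the-envelope check, using only the figures quoted above:

```python
# Back-of-the-envelope check of the figures quoted above.
science_added = 6598 - 6417          # journals added to the Science edition
social_added = 1980 - 1865           # journals added to the Social Sciences edition
print(science_added)                 # 181
print(social_added)                  # 115
print(science_added + social_added)  # 296 -- roughly 300, nowhere near the
                                     # 750 (let alone 1228) journals advertised
```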

What’s inside In-Cites?

The predecessor of Thomson Reuters Scientific has been responsible, for years already, for publishing the good old in-cites website. Today I was alerted to a new service by the same company. Incites?! A brand new product? Incites it is.

For me it is a bit confusing. Even today, when I go to the old incites site, I arrive here: In-Cites. Okay. It carries the warning that the site has effectively moved to ScienceWatch.com (in the barely noticeable red bar at the top of the page). Fair enough. But the sole reason for me to use that website, or to refer to in-cites, is the journal lists. Follow the trail to the methodology section in ScienceWatch and there you find a link to the journal list. With an additional click you end up here, where it is stated:

The current Journal List is located on the archived in-cites.com Web site.

So you end up back at in-cites.

What's new at in-cites? Or which marketeer thought up recycling an old name from the same company for a new product?

I am interested in the new product, but at the moment I find it all a bit confusing.

Journal quality, an unexpected improvement of the JCR

It is odd to say, but for researchers the journal as an entity is disappearing. Scientists search for information in online databases and decide from title and abstract whether an article suits their needs. The days that scientists visited the library and browsed the tables of contents of the most important journals to keep up with their field are long gone.

Still, there is a lot of emotion around journal titles. Scientists want to publish their research in the best possible journal. Earlier this year the NOWT (2008) published a report on the performance of Dutch universities, which clearly showed that the field-normalized citation impact of each university correlates positively with the field-normalized quality of the journals it publishes in.
Figure: Journal quality versus citation impact (NOWT, 2008).
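
For readers who want the definition behind "field normalized": reports of this kind typically use a CWTS-style indicator that divides actual by field-expected citations. I am assuming NOWT used something along these lines; the report itself has the exact details.

$$
\text{field-normalized impact} \;=\; \frac{\sum_{i} c_i}{\sum_{i} \mu_{f(i)}}
$$

where $c_i$ is the number of citations to publication $i$ and $\mu_{f(i)}$ is the world-average number of citations for publications of the same field, year and document type. A value above 1 means citation impact above world average.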

Looking at this graph, there is clearly considerable reason for scientists to select the best journals in their field to publish their results. However, until recently the only widely available journal quality indicator has been the journal impact factor. There has been a lot of criticism of the uses and abuses of impact factors, but they have stood the test of time, and all scientists are at least aware of them. For years ISI, now Thomson Reuters, was in fact the sole gatekeeper of journal quality rankings.
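
For reference, the classic impact factor uses a two-year citation window; for, say, 2008 it is defined as:

$$
\mathrm{JIF}_{2008} \;=\; \frac{\text{citations received in 2008 to items published in 2006 and 2007}}{\text{number of citable items published in 2006 and 2007}}
$$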

Over the last few years a number of products, free and fee-based, have tried to come up with new and competing journal ranking measures: SCImago Journal Rank (based on Scopus data), the Journal Analyzer from Scopus, Eigenfactor.org and, of course, the data from Thomson's own Essential Science Indicators.

This week Thomson Reuters announced that they will update the Journal Citation Reports. From the 1st of February we get an entirely new Journal Citation Reports. From the press release:

  • Five-Year Impact Factor – provides a broader range of citation activity for a more informative snapshot over time (a sketch of the calculation follows below this list).
  • Journal “Self Citations” – An analysis of journal self citations and their contribution to the Journal Impact Factor calculation.
  • Graphic Displays of Impact Factor “Box Plots” – A graphic interpretation of how a journal ranks in different categories.
  • Rank-in-Category Tables for Journals Covering Multiple Disciplines – Allows a journal to be seen in the context of multiple categories at a glance rather than only a single one.
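
As referenced in the first bullet, here is a minimal sketch of how the five-year variant generalizes the classic two-year calculation. The journal, its citation counts and its item counts are entirely made up for illustration; the real JCR figures come from Web of Science data.

```python
def impact_factor(cites_in_jcr_year, items_per_year, jcr_year, window):
    """Citations received in jcr_year to items published in the preceding
    `window` years, divided by the number of citable items in those years."""
    years = range(jcr_year - window, jcr_year)
    cites = sum(cites_in_jcr_year.get(y, 0) for y in years)
    items = sum(items_per_year.get(y, 0) for y in years)
    return cites / items

# Hypothetical journal: citations received in 2008, broken down by the
# publication year of the cited items, plus citable items per year.
cites_2008 = {2003: 40, 2004: 55, 2005: 80, 2006: 120, 2007: 90}
items = {2003: 100, 2004: 100, 2005: 110, 2006: 120, 2007: 130}

print(impact_factor(cites_2008, items, 2008, window=2))  # classic JIF: 210/250 = 0.84
print(impact_factor(cites_2008, items, 2008, window=5))  # five-year IF: 385/560 = 0.6875
```

Note how the wider window shifts the picture for this hypothetical journal; presumably journals in slower-citing fields will fare relatively better under the five-year measure.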

It is highly unusual to see two updates of the JCR in a single year. But it is interesting to note how they are moving under the pressure of some competition.

Literature:
NOWT (2008). Wetenschaps- en Technologie- Indicatoren 2008. Maastricht, Nederlands Observatorium van Wetenschap en Technologie (NOWT). http://www.nowt.nl/docs/NOWT-WTI_2008.pdf (in Dutch)