The mysterious ways of Web of Science

A while back, one of our researchers asked me how Steven Salzberg had arrived at the number of citations for the Arabidopsis genome paper in Nature. When the researcher checked Web of Science, it returned zero citations, which couldn't be true for such a breakthrough paper. Salzberg had found 2689 citations! How did he do that?

I first checked the paper in Web of Science myself and also found zero citations.

Zero citations from Web of Science for the Arabidopsis papers

I was not entirely surprised, since I realized it was one of those consortium papers, and I knew Thomson had had problems with a consortium paper in the past. But it was annoying nonetheless.

I first looked into the similar issue around the Human Genome Project and found it mentioned even in Thomson's own Science Watch. From that article, however, it appeared that Thomson had only improved the citation tracking for that particular Human Genome Project paper, and had not addressed the underlying issue. And even though the Arabidopsis paper is older still, its citations had not been corrected. Something in the way WoS searched for or tracked citations had gone wrong, but where was the error being made?

I made a few futile attempts in the cited reference search with Arabidopsis, or Arabidopsis*, as author. A cited reference search with Kaul as author (who is listed at the end of the original article as first author) resulted in only some 130 citations, not nearly enough to account for Steven Salzberg's figure. Nor did I want to use the cited reference search to wade through all the cited articles from Nature in 2000: that is a very large result set spread over innumerable pages, since you cannot refine these searches by volume or page number. (Wouldn't that be nice?)

To reassure my inquisitive researcher I pointed him to Scopus (sorry, Thomson), where he could see a reassuring 3000+ citations for himself. Meanwhile, I still did not have a quick fix for the Web of Science problem.
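These days a quick, independent cross-check of a citation count can also be obtained programmatically, for instance from the public Crossref REST API, which reports how often a DOI is referenced within Crossref's own data. The sketch below is merely illustrative: the DOI is my assumption for the Arabidopsis genome paper and should be verified, and Crossref counts are not directly comparable with Web of Science or Scopus figures.

    # Rough cross-check of a citation count via the public Crossref REST API.
    # The DOI below is assumed to be the Arabidopsis genome paper
    # (Nature 408, 796-815, 2000); verify it before relying on the result.
    # Crossref's "is-referenced-by-count" is not one-to-one comparable with
    # Web of Science or Scopus citation counts.
    import json
    import urllib.request

    DOI = "10.1038/35048692"  # assumed DOI; substitute the paper you want to check

    url = f"https://api.crossref.org/works/{DOI}"
    with urllib.request.urlopen(url) as response:
        record = json.loads(response.read().decode("utf-8"))

    message = record["message"]
    print((message.get("title") or ["(no title)"])[0])
    print("Citations recorded by Crossref:", message.get("is-referenced-by-count"))

Such a quick check would at least have told me immediately that a count of zero could not be right, even before diving into the cited reference search.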

It was only later, when I looked into the problem again, that I was somehow forwarded to the All Databases search rather than the Web of Science tab I normally use. To my utter amazement, the title search this time delivered two records. Both still showed zero citations, but more importantly, next to [Anon] they listed Ar Gen In as the author.

The All Databases search delivering two records, with [Anon] Ar Gen In listed as the author

Now the problem was simple: I had found the author. A cited reference search on Ar Gen In indeed yielded nearly the 2689 citations Steven Salzberg had reported.

The cited reference search on Ar Gen In, yielding nearly the 2689 citations

But even this figure is not entirely correct, since there are some 131 additional citations that reference Kaul as first author with the correct Nature volume and page numbers, bringing the total to roughly 2820.

Of course I requested a correction of the citation data from Web of Science, but forgot to include the Kaul citations in that request. Hopefully this will be repaired at a later date.

But what really makes me wonder is the slight, yet very important, difference in record presentation between the All Databases search and the Web of Science search on Web of Knowledge. For me, the standard entry point into Web of Knowledge is the Web of Science tab; in my normal working routine I would never go to the All Databases tab to look up a citation count. It was sheer luck that I found the right author name on this occasion. Surely that shouldn't have to become the standard way to perform these searches, should it?

Research management and research quality

Research performed at our universities is nowadays a heavily directed practice, top down in most cases. Research for the sake of research has become a rare phenomenon. Research evaluations, research management and research organization are weeding out the little pet projects on the side. Grant money and research funders demand concrete results and determine the objectives to be achieved in advance.

It is therefore rather odd that, in such a strongly organized and managed environment, the organization of research itself is so seldom a subject of academic discourse. I still remember my old professor, who once insisted that "we didn't need knowledge management since we produced knowledge". Yet after each completed PhD project, another successful candidate left the organization with his knowledge written down in a number of articles and very seldom made explicit within the organization. That did not seem to matter much to him.

The researchers, research groups and graduate schools at universities in the Netherlands are regularly evaluated by external peer reviews. Productivity, quality, relevance and vitality of the research are the main criteria on which groups are judged. It is odd, however, that very little study has been made of the factors that explain why some groups are more successful than others. I was therefore pleasantly surprised by an article by van der Weijden et al. (2008), who looked into the effects of managerial control in research groups on their research performance.

An important shortcoming of their study is that the only bibliometric parameter they looked at was the number of papers published in journals covered by Web of Science. It would have been really useful if they had also included normalized citation impact as one of their variables (a sketch of that measure follows below). Apart from this simple bibliometric measure of peer-reviewed output, they also looked at the groups' success in obtaining research grants and the like.
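For readers who are not bibliometricians: one common way to define such a field-normalized impact measure (a sketch in the spirit of the CWTS crown indicator of that period; the exact normalization differs per provider) is

    \[
      \mathrm{CPP/FCSm} \;=\; \frac{\frac{1}{n}\sum_{i=1}^{n} c_i}{\frac{1}{n}\sum_{i=1}^{n} e_i}
    \]

where c_i is the number of citations received by paper i of the group, e_i is the world-average number of citations for papers of the same field, publication year and document type, and n is the number of papers. A value above 1 means the group is cited above the world average in its fields.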

Their most important finding was that:

“One internal research management activity was found to have a positive relationship with (bio)medical research performance in general. Offering special commendations to (bio)medical (both preclinical and clinical) research staff members, including non-financial prizes, in order to motivate them is positively related to all performance measures used in this study.”

In other words, positive attention from senior managers for what their researchers were up to paid off really well.

Among the more detailed conclusions, another one struck me as very interesting as well:

“Different types of internally organized research evaluation practices have (linear) positive relationships with performance measures concerning external research funding. In preclinical groups pre-evaluations of research proposals have a positive relationship with these performance measures. Interestingly, in clinical groups, positive relationships are found with research output evaluations.”

In practice, external peer reviews are most often met with some degree of resistance, or criticism at least. Yet it seems these kinds of evaluation exercises are well worth the effort that all participants invest in them.

That is good to realize now that our library is involved in preparing the peer review of six different graduate schools, which together involve about 1000 permanent staff and some 3000 researchers.

Reference:

van der Weijden, I., D. de Gilder, P. Groenewegen & E. Klasen (2008). Implications of managerial control on performance of Dutch academic (bio)medical and health research groups. Research Policy 37(9): 1616-1629. http://dx.doi.org/10.1016/j.respol.2008.06.007 (subscription required).

Trends in the science and information world

Tomorrow I have to teach a class on better searching for scientific information on the web. In the introduction I try to highlight the major trends in research and in the information landscape, and I came up with the following two bullet lists.

Trends in science and research

  • Increased multidisciplinarity
  • Increasing cooperation between scientists
  • Internationalization of research
  • Need for primary data
  • More competition for the same grant money

Trends in the information world

  • Increased importance of free web resources
  • From information scarcity to overload
  • After A&I databases and journals, now the digitization of books
  • From bibliographic control to fulltext search
  • Open Access & Open Source
  • Multiformity of resources
  • User in control

I wonder whether anybody has additional suggestions for either of these lists.