Mapping the influence of the humanities

David Budtz Pedersen presented a new research project undertaken in Denmark, Mapping the Public Influence of the Humanities, which aims to map the influence, meaning and value of the humanities in Denmark. His blog post about this project on the Impact Blog has already generated a lot of attention, even in the holiday season.

What struck me, however, is that the project starts with collecting data from three different sources:

  1. collecting the names and affiliations of active scientific experts in Denmark;
  2. documenting the educational background and research profile of this population;
  3. downloading and collecting the full corpus of reports, whitepapers, communications, policy briefs, press releases and other written results of Danish advisory groups from 2005 to 2015.

It was the third objective of Budtz Pedersen’s project that grabbed my attention: collecting the full corpus of reports, whitepapers, communications, policy briefs, press releases and other written results of Danish advisory groups from 2005 to 2015. And that in the country where Atira, the makers of Pure, resides (currently wholly owned by Elsevier, I know). It struck a chord with me, since this is exactly what should have been done already. Assuming the influence of the humanities is a scholarly debate, all universities contributing to this debate should have an amply filled current research information system (CRIS) containing exactly those reports, whitepapers, communications, policy briefs, press releases et cetera.

In this post I want to concentrate on the collection side, assuming that all material collected in the CRIS is available online and free for the public at large to inspect, query and preferably, but not necessarily, free to download. Most CRIS offer all kinds of coupling options with the major (scholarly) bibliographic databases: Web of Science, Scopus, PubMed, WorldCat, CrossRef et cetera. However, those reports, whitepapers, communications, policy briefs, press releases and other written results are not normally contained in these bibliographic databases. This is the so-called grey literature: not formally published, not formally indexed, not easily discovered, not easily found, not easily collected. To collect these materials we have to ask, even beg, researchers to dutifully add them manually to the university CRIS.

That is exactly why universities have bought into CRIS systems, and why libraries are the ideal candidates to maintain them. The CRIS takes away the burden of keeping track of the formal publications through its couplings with the formal bibliographic databases. Librarians have the knowledge of all these couplings and the search profiles required to make life easy for researchers. That should leave researchers some time to devote a little of their valuable attention to those other, more esoteric materials, especially in the humanities, where we apparently have more of this grey literature. A well maintained CRIS should have plenty of these materials registered. So I was slightly taken aback that this project in Denmark, the cradle of a major CRIS supplier, needs to collect these materials from scratch. They should have been registered a long time ago. That is where the value of a comprehensive, all-output-inclusive CRIS kicks in, resulting in a website with a comprehensive institutional bibliography.
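To make the contrast concrete, here is roughly what such a coupling amounts to on the formal side: a minimal sketch against the public CrossRef REST API, with a made-up author name as placeholder. There is simply no comparable API from which to harvest grey literature, which is why manual registration remains unavoidable.

```python
# Minimal sketch of a CRIS-style "coupling": harvesting formally published
# works from the public CrossRef REST API. "J. Doe" is a made-up placeholder,
# not a real search profile.
import requests

def fetch_crossref_works(author_query, rows=20):
    """Return works matching an author name via the CrossRef REST API."""
    resp = requests.get(
        "https://api.crossref.org/works",
        params={"query.author": author_query, "rows": rows},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["message"]["items"]

for item in fetch_crossref_works("J. Doe"):
    title = item.get("title") or ["(untitled)"]
    print(item.get("DOI"), "-", title[0])
```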

Just a second thought. It is odd to see that two of the major providers of CRIS systems, Thomson Reuters with Converis and Elsevier with Pure, are both providers of major news information sources as well. Yet neither of these CRIS products has a coupling with the proprietary news databases, Reuters and LexisNexis respectively, for press clippings and mentions in the media. From a CRIS manager’s point of view this is a strange observation, since we are dealing with the same companies. But the internal company structures seem to hinder this kind of seemingly logical coupling of services.


Which master journal list do you prefer?

A very useful resource which I need to consult, say, twice a year is the Master Journal List of Thomson Reuters Scientific. This morning it was actually a colleague who needed this resource. He wanted to know the journals covered by Web of Science, so he needed a subset of the Master Journal List. I knew that it existed, but where?

Using Google we ended up on this version of the Master Journal List. Not the one I really wanted, since it did not have the database-specific lists. I knew it existed, but where? Only a couple of hours later, by approaching the site from a different angle and navigating around a wee bit differently, did I find the version of the Master Journal List we were actually looking for.

Looking carefully, I finally see that the first one is a more extensive journal search form of the Master Journal List. But you can only find that out after you have found the second website. You can navigate from the one to the other, but not the other way around. A little strange, and confusing to boot.

In a similar vein: Thomson has a brand new product, InCites, whereas the old and totally different In-Cites website from the same company still exists.

2008 Journal Citation Reports figures released

Last Friday Thomson Reuters released the 2008 edition of the Journal Citation Reports. This year it was announced by Thomson itself in a news release; that’s a good move on their part. The number of journals covered in the two editions of the JCR has increased: from 6417 to 6598 in the Science edition (181 more journals) and from 1865 to 1980 in the Social Sciences edition (an increase of 115 journals).

It is still not the increase I expected on the basis of the addition of some 750 new regional journals announced last year, a figure that is now even advertised as an expansion of 1228 journals, but it is still an expansion of nearly 300 journals. Reading Thomson’s press releases on the 2008 JCR update, I still notice some juggling with numbers that don’t really add up, or don’t make sense after some simple checks comparing the 2007 and 2008 issues.

Now we have to go and figure out which journals were added and, more importantly, which journals were dropped. That is always interesting to find out (a quick sketch of how follows below). It will take time though.

The really major improvement Thomson should make is to abolish the rather odd division between the two parts of the database. Currently I can’t find any argument to stick to the demarcation line between the Science edition and the Social Sciences edition of the JCR. I really wonder how many customers they have that subscribe to only one part of the JCR; I think it is fair to assume that by far most customers subscribe to both parts. For teaching it is just a pain to have to explain to students that they should start their search by choosing a database part. That is far from intuitive.
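Comparing the two editions is mostly legwork: export the journal title lists from both years and take set differences. A rough sketch, assuming plain-text exports with one journal title per line (the file names are hypothetical):

```python
# Compare two JCR journal lists to find additions and removals.
# The file names are hypothetical; export the title lists from the
# JCR interface first, one journal title per line.
def load_titles(path):
    with open(path, encoding="utf-8") as f:
        return {line.strip().upper() for line in f if line.strip()}

jcr_2007 = load_titles("jcr_science_2007.txt")
jcr_2008 = load_titles("jcr_science_2008.txt")

print("Added:", sorted(jcr_2008 - jcr_2007))
print("Dropped:", sorted(jcr_2007 - jcr_2008))
```

Title strings are not entirely stable between editions (renames, changed abbreviations), so the raw differences will still need a manual check.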

What’s inside In-Cites?

The predecessor of Thomson Reuters Scientific has been responsible, for years already, for publishing the good old in-cites website. Today I was alerted to a new service by the same company. InCites?! A brand new product? InCites it is.

For me this is a bit confusing. Even today, when I go to the old in-cites site, I arrive here at In-Cites. Okay. It carries the warning that the site has effectively moved to ScienceWatch.com (in the barely noticeable red bar at the top of the page). Fair enough. But the sole reason for me to use that website, or to refer to in-cites, is the journal lists. Follow the trail to the methodology section in ScienceWatch and there you find a link to the journal list. With an additional click you end up here, where it is stated:

The current Journal List is located on the archived in-cites.com Web site.

So you end up at in-cites.

So what is new at in-cites? Or which marketeer thought it a good idea to reuse an old name from the same company for a new product?

I am interested in the new product, but at the moment I find it all a bit confusing.

Journal quality, an unexpected improvement of the JCR

It is odd to say, but for researchers the journal as an entity is disappearing. Scientists search for information in online databases and decide from title and abstract whether an article suits their needs. The days that scientists visited the library and browsed the tables of contents of the most important journals to keep up with their field are long gone.

Still, there is a lot of emotion around journal titles. Scientists want to publish their research in the best possible journal. Earlier this year the NOWT (2008) published a report on the performance of Dutch universities, which clearly showed that the field-normalized citation impact of each university correlates positively with field-normalized journal quality.
[Figure: journal quality versus citation impact]

Looking at this graph, it is clear that there is considerable reason to select the best journals in your field to publish your results. However, until recently the only widely available journal quality indicator was the journal impact factor. There has been a lot of criticism of the uses and abuses of impact factors, but they have stood the test of time; all scientists are at least aware of them. For years ISI, later Thomson Reuters, was in fact the sole gatekeeper of journal quality rankings.
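For reference, the classical impact factor uses a two-year window: the impact factor of a journal for year $y$ is

$$\mathrm{IF}_y = \frac{C_y(y-1) + C_y(y-2)}{P(y-1) + P(y-2)},$$

where $C_y(x)$ is the number of citations received in year $y$ by items the journal published in year $x$, and $P(x)$ is the number of citable items it published in year $x$.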

Over the last few years a number of products, free and fee-based, have tried to come up with new and competing journal ranking measures: SCImago Journal Rank (based on Scopus data), the Journal Analyzer from Scopus, Eigenfactor.org and, of course, the data from Thomson’s own Essential Science Indicators.

This week Thomson Reuters announced that they will update the Journal Citation Reports. From the 1st of February we get an entirely new JCR. From the press release:

  • Five-Year Impact Factor – provides a broader range of citation activity for a more informative snapshot over time.
  • Journal “Self Citations” – An analysis of journal self citations and their contribution to the Journal Impact Factor calculation.
  • Graphic Displays of Impact Factor “Box Plots” – A graphic interpretation of how a journal ranks in different categories.
  • Rank-in-Category Tables for Journals Covering Multiple Disciplines – Allows a journal to be seen in the context of multiple categories at a glance rather than only a single one.
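The five-year impact factor mentioned in the first bullet simply widens both windows of the classical formula from two years to five:

$$\mathrm{IF}^{(5)}_y = \frac{\sum_{k=1}^{5} C_y(y-k)}{\sum_{k=1}^{5} P(y-k)},$$

with $C_y$ and $P$ as defined earlier. For fields where citations accrue slowly, the humanities and social sciences in particular, this should give a fairer snapshot than the two-year window.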

It is highly unusual to see two updates per year for the JCR. But it is interesting to note how they are moving under the pressure of some competition.

Literature:
NOWT (2008). Wetenschaps- en Technologie- Indicatoren 2008. Maastricht, Nederlands Observatorium van Wetenschap en Technologie (NOWT). http://www.nowt.nl/docs/NOWT-WTI_2008.pdf (in Dutch)