Are there winners and losers in the VSNU-Elsevier Open Access deal?


This week it was finally announced that the Association of Dutch Universities (VSNU) and the publisher Elsevier had reached an agreement on a continuation of the Big Deal for access to all Elsevier journals combined with a transition to Open Access publishing for researchers at Dutch Universities.

In actual fact there is no deal yet. There is an "agreement in principle" and "details of this 3-year agreement, which is to start in 2016, will be finalized in the near future". That is the reason why so few details are discussed in the accompanying Q&A. We only know that by 2018 researchers at Dutch universities will be able to publish some 30% of their articles in Open Access. How this 30% will be reached is not explained, nor which journals are involved in this part of the deal. Which subject areas? Hybrid journals only? Or does it include Elsevier's Gold Open Access journals as well? Just a few of the questions that need clarification. I am not going to speculate on the answers to these questions.

I want to turn the attention to a quick internet poll held by @MsPhelps, asking who won in this deal: Elsevier or the VSNU?

In total some 59 people cast their vote, and the large majority (69%) voted in favour of Elsevier, the remainder for the Dutch universities. A major problem with this poll is that we don't have the details yet, as @HugoBesemer indicated, so how can we judge who the winners or losers are? My idea is that the rest of the world won. The Dutch universities have shown that it is possible to strike a deal with Elsevier that takes major steps towards Open Access publishing in toll access journals. Similar to the Springer deal in the Netherlands, which was followed by comparable deals in the UK, at the Max Planck Gesellschaft and in Austria, it is highly likely that this deal with Elsevier will be followed in other consortium negotiations as well. The Dutch universities and Elsevier have shown the way to go with the Big Deal: flip from a subscription-based model to one based on Article Processing Charges.

The university rankings season has commenced

It seems odd to talk about a university rankings season, but the season of new rankings normally starts in August, when the Shanghai ranking is published. There was actually no real news in the 2015 Shanghai ranking. This year VU University gained two places in the overall ranking, from 100 to 98. Small steps, but that is not strange. A major criticism of the university rankings in their early days was the wild fluctuation of universities between years. Rankings have improved over the years and become more stable.

The ranking season will continue with the release of the QS ranking, or Top Universities, on September 15th, followed by the THE World University Ranking on September 30th. It will be interesting to see if the US News Best Global Universities ranking will be released towards the end of October again. So four major global university rankings will be released within a time span of about three months. This year the THE ranking deserves special attention, since they have changed data providers, switching from Thomson Reuters Web of Science data to Elsevier's Scopus, and are adapting their methodology more than the competition. The second one to watch is the US News ranking, simply because it will only be published for the second time, and it is interesting to judge the stability of this ranking.

Wasn’t there any interesting news in the Shanghai ranking?
For the careful watchers: the Netherlands now has 6 universities in the top 100 of the so-called Alternative Ranking, and VU University is the leading Dutch university at place 57.

Google Scholar profiles of Dutch Universities

In early July I checked the Google Scholar uptake at Dutch universities. Whilst collecting the data I also noted the total number of citations for every tenth scientist with a Google Scholar profile per university. To make sense of the results I plotted the citation counts on a logarithmic scale against the rank number of the researcher, ordered by descending total number of citations.

Total number of citations per researcher according to Google Scholar
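For readers who want to reproduce this kind of graph, a minimal sketch in Python follows. It assumes the sampled profiles were exported to a CSV file with columns "university" and "citations" (my assumption; the original data were collected by hand from the Google Scholar profiles):

```python
# Sketch: plot total citations (log scale) against researcher rank per university.
# Assumes a hypothetical CSV "gs_profiles.csv" with columns "university" and
# "citations", one row per sampled researcher.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("gs_profiles.csv")

fig, ax = plt.subplots(figsize=(8, 5))
for university, group in df.groupby("university"):
    # Order researchers by descending total citations and assign rank numbers.
    citations = group["citations"].sort_values(ascending=False).reset_index(drop=True)
    ranks = citations.index + 1
    ax.plot(ranks, citations, label=university)

ax.set_yscale("log")  # citation counts span several orders of magnitude
ax.set_xlabel("Rank of researcher (descending total citations)")
ax.set_ylabel("Total citations (Google Scholar)")
ax.legend(fontsize="small")
plt.tight_layout()
plt.show()
```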

Due to the uneven uptake of Google Scholar profiles per university it is too early to draw any firm conclusions from these profiles. But at face value three groups can be distinguished: Delft, Utrecht, Wageningen, UvA, RUG and VU show very similar profiles; a middle group consists of Twente, Radboud Nijmegen, Eindhoven, Erasmus Rotterdam and Leiden University; and Maastricht, Tilburg and the Open University close the ranks.

Looking carefully at the leading group: Utrecht has the most researchers with a high number of total citations, Wageningen takes over from around 500 total citations per researcher, and Delft from around 150 total citations per researcher.

I can imagine that, when Google Scholar profiles become more established, this kind of graph could be used for yet another university ranking. The total number of citations under the graphs gives some indication of the publication success of the university staff at those universities.
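As a rough illustration of how such an indicator could be computed, continuing the hypothetical CSV from the plotting sketch above (the factor of 10 reflects that only every tenth profile was sampled, so this is a crude approximation at best):

```python
# Sketch: a crude "total citations" indicator per university from the sampled profiles.
import pandas as pd

df = pd.read_csv("gs_profiles.csv")

indicator = (
    df.groupby("university")["citations"]
      .sum()        # citations summed over the sampled profiles
      .mul(10)      # every tenth profile was sampled, so scale up roughly
      .sort_values(ascending=False)
)
print(indicator)
```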

Full data set is available at:

Mapping the influence of humanities

David Budtz Pedersen presented a new research proposal undertaken in Denmark, Mapping the Public Influence of the Humanities, which aims to map the influence, meaning and value of the humanities in Denmark. His blog post about this project on the Impact Blog has already generated a lot of attention, even in the holiday season.

What struck me, however, is that the project starts with collecting data from three different sources:

  1. collecting the names and affiliations of active scientific experts in Denmark;
  2. documenting the educational background and research profile of this population;
  3. downloading and collecting the full corpus of reports, whitepapers, communications, policy briefs, press releases and other written results of Danish advisory groups from 2005 to 2015.

It was the third objective of Budtz Pedersen's project that grabbed my attention: collecting the full corpus of reports, whitepapers, communications, policy briefs, press releases and other written results of Danish advisory groups from 2005 to 2015. And that in the country where Atira, the makers of Pure, reside (nowadays wholly owned by Elsevier, I know). It struck a chord with me since this is exactly what should have been done already. Assuming the influence of the humanities is a scholarly debate, all universities contributing to this debate should have an amply filled current research information system (CRIS) containing exactly those reports, whitepapers, communications, policy briefs, press releases et cetera.

In this post I want to concentrate on the collection side, assuming that all material collected in the CRIS is available online and free for the public at large to inspect, query and preferably, but not necessarily, free to download. Let's look at the collection side for a moment. Most CRIS offer all kinds of coupling possibilities with the major (scholarly) bibliographic databases: Web of Science, Scopus, PubMed, WorldCat, CrossRef et cetera. However, those reports, whitepapers, communications, policy briefs, press releases and other written results are not normally contained in these bibliographic databases. This is the so-called grey literature. Not formally published. Not formally indexed. Not easily discovered. Not easily found. Not easily collected. To collect these materials we have to ask and beg researchers to dutifully add them manually to the university CRIS.

That is exactly why universities have bought into CRIS systems, and why libraries are the ideal candidates to maintain them. The CRIS takes away the burden of keeping track of the formal publications through coupling with the formal bibliographic databases. Librarians have knowledge of all these couplings and the search profiles required to make life easy for the researchers. That should leave researchers some time to devote a little of their valuable effort to those other, more esoteric materials. Especially in the humanities, where we apparently have more of this grey literature. A well-maintained CRIS should have plenty of these materials registered. So I was slightly taken aback that this project in Denmark, the cradle of a major CRIS supplier, needs to collect these materials from scratch. They should have been registered a long time ago. That is where the value of a comprehensive, all-output-inclusive CRIS kicks in, resulting in a website with a comprehensive institutional bibliography.
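To make that "coupling" concrete: below is a minimal sketch of the kind of automated harvesting a CRIS can do for formal publications, using the public CrossRef REST API. The affiliation string and date filter are just illustrative assumptions; a real CRIS coupling adds persistent identifiers and curation on top of this.

```python
# Sketch: harvest formal publication metadata from the public CrossRef REST API,
# the kind of coupling a CRIS uses to relieve researchers of manual registration.
# The affiliation string and date filter below are illustrative assumptions.
import requests

params = {
    "query.affiliation": "Wageningen University",
    "filter": "from-pub-date:2015-01-01",
    "rows": 5,
}
response = requests.get("https://api.crossref.org/works", params=params, timeout=30)
response.raise_for_status()

for item in response.json()["message"]["items"]:
    # Some records have no title; fall back to a placeholder.
    title = (item.get("title") or ["(no title)"])[0]
    print(item["DOI"], "-", title)
```

Grey literature has no equivalent registry to query, which is precisely why it only ends up in the CRIS if someone adds it by hand.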

Just a second thought: it is odd to see that two of the major providers of CRIS systems, Thomson Reuters with Converis and Elsevier with Pure, are both part of companies that provide major news information sources. Yet neither of these CRIS products has a coupling with the proprietary news databases, Reuters and LexisNexis respectively, for press clippings and mentions in the media. From a CRIS manager's point of view this is a strange observation to have to make, since we are dealing with the same companies. But the internal company structures seem to hinder this kind of seemingly logical coupling of services.