THES rankings: manipulation or optimization?

From the university newspaper of Groningen we get some interesting insights into the way Groningen University has optimized its data for submission to the THES rankings. Although he deemed the rankings not to be important, the rector nevertheless wanted Groningen University to score better in the THES-QS rankings. For the rector, the first appearance in the top 200 of the THES rankings, at 173 to be exact, was a good reason to celebrate with his subordinates.

What did they do? They concentrated on the question of the most favourable number of students. The number of PhD students was a number they could play with: in the Netherlands PhD students are most often employed as staff, although by international standards they are students as well. They also considered the position of the researchers in the university hospital. Counting those researchers would increase the number of staff considerably and thus improve the student/staff ratio, but it would also have an important effect on the number of citations per member of research staff: an increase in staff numbers lowers the citations per staff, which is detrimental to the overall performance. Unless, that is, they could guarantee that citations to hospital staff were included in the citation counts as well.

So in Groningen they worked through several scenarios of student numbers, staff numbers, student/staff ratio and citations/staff ratio to arrive at the combination that would best enhance their performance. I really do wonder whether the contacts between Groningen and QS, the consultants who compile the rankings, also led to an improved search for citations, one that includes the university hospital in the university's results. It is known from research by CWTS that retrieving papers from all parts of a university is notoriously difficult, especially the papers produced by staff of the teaching hospitals. In Groningen they have the feeling that their contacts with QS helped. Well, at least it resulted in a nice picture on their university profile page.
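To make the kind of scenario exercise described above concrete, here is a toy calculation. All numbers and the scoring formula are invented for illustration only; this is not the THES-QS formula, merely a sketch of the trade-off between the student/staff ratio and the citations/staff ratio when hospital staff are or are not counted.

```python
# Hypothetical illustration of the trade-off described above: counting extra
# staff (e.g. university hospital researchers) improves the student/staff
# ratio but dilutes citations per staff member. All numbers are invented.

def toy_score(students, staff, citations):
    """Toy score: reward a low student/staff ratio and a high
    citations/staff ratio (this is NOT the actual THES-QS formula)."""
    student_staff = students / staff
    citations_per_staff = citations / staff
    return citations_per_staff - student_staff

scenarios = [
    # (label, students, staff, citations attributed to the university)
    ("exclude hospital staff",               20000, 2500, 60000),
    ("include hospital staff, no citations", 20000, 4000, 60000),
    ("include hospital staff + citations",   20000, 4000, 110000),
]

for label, students, staff, citations in scenarios:
    print(f"{label:40s} students/staff = {students / staff:5.1f}  "
          f"citations/staff = {citations / staff:5.1f}  "
          f"toy score = {toy_score(students, staff, citations):6.1f}")
```

The middle scenario shows why simply adding hospital staff can backfire: the student/staff ratio improves, but citations per staff drop. Only when the hospital's citations are attributed to the university as well do both indicators move in the right direction.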

Optimization or manipulation? There is only a thin line between them. If only you could make sure that all staff of your university used the proper name of the institution in their author affiliations, the university would gain a lot.

Vrije Universiteit in the top 400

In my previous post on the THES university rankings 2007, I wrote that I suspected a mix-up of the names of the University of Amsterdam and the Free University of Amsterdam. However, this appears not to be the case. In the recently released top 400, the Vrije Universiteit Amsterdam is ranked 304 in the list of top universities. We can only guess where Tilburg University is listed.

hattip: University Ranking Watch

Dutch universities in the THES university rankings 2007

The THES university rankings 2007 have now been officially released. The ranking of Dutch universities is as follows, with the overall rank in the THES top 200 and the previous ranking in brackets:

  1. University of Amsterdam 48 (69)
  2. Delft 63 (86)
  3. Leiden 84 (90)
  4. Utrecht 89 (95)
  5. Maastricht 111 (172)
  6. Eindhoven 130 (67)
  7. Wageningen 148 (97)
  8. Erasmus 163 (92)
  9. Groningen 173 (232)
  10. Twente 185 (115)
  11. Radboud 195 (137)

All in all, 11 out of 13 regular Dutch universities are listed in the top 200. In the case of the missing Free University, I really wonder to what extent there might be a mix-up of the two universities based in Amsterdam. For Tilburg it is rather unfortunate that it didn't make the list, but since Tilburg is mainly a humanities and social sciences university, this can be explained. As soon as I get more details, I will post them.

Dutch universities fall in the THES university rankings 2007

The first results of the THES university rankings 2007 have already been published on the Web, despite the embargo until Friday. The change in methodology has quite a dramatic effect on the Dutch universities. Whereas we had an impressive seven universities in the top 100 last year, in the 2007 edition only four remain in the top 100. The pole position among Dutch universities is taken by UvA at 48 (up from 69), followed by Delft at 63 (up from 86), Leiden at 84 (up from 90) and Utrecht at 89 (up from 95). The universities of Eindhoven, Rotterdam and Wageningen dropped out of the top 100 of this league table.

It is too early to say what the exact cause of all these changes is; for that we would have to look at all the parameters underlying the ranking, so we have to await the official publication.

hattip: University Ranking Watch, which has three stories on the new rankings; English and Canadian universities are doing exceptionally well according to URW.

THES university rankings 2007

Next Friday the Times Higher Education Supplement will publish its famous rankings of world universities. This year they have changed the methodology quite a bit, perhaps to counter some of the criticism of these rankings as formulated on Wikipedia. They have made changes to the peer review, which counts for 40% of the overall ranking, and closed off the possibility for peers to select their own universities. They have changed the database from which they retrieve the citation data, selecting Scopus from Elsevier over citation data from ISI (the Essential Science Indicators from Thomson Scientific, that is). They have reduced the citation window from ten to five years. They have attempted to distinguish between full-time equivalents and headcounts of faculty, and finally they have normalized the scores.
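To make that normalization step concrete, here is a minimal sketch of how a z-score-normalized, weighted composite score could be computed. Only the 40% weight for the peer review is taken from the post above; the other indicators, their weights and all the data are invented placeholders, not the actual THES-QS methodology.

```python
# Sketch of a z-score-normalized, weighted ranking. The 40% peer-review
# weight comes from the post; all other indicators, weights and data are
# hypothetical placeholders.
from statistics import mean, pstdev

universities = {
    # hypothetical raw indicator values per university
    "Uni A": {"peer_review": 80, "citations_per_staff": 12, "student_staff": 15},
    "Uni B": {"peer_review": 60, "citations_per_staff": 20, "student_staff": 10},
    "Uni C": {"peer_review": 70, "citations_per_staff": 15, "student_staff": 12},
}

# weights: peer review 40% (as stated in the post), the rest assumed
weights = {"peer_review": 0.4, "citations_per_staff": 0.3, "student_staff": 0.3}

def zscores(values):
    """Standardize a list of raw values to mean 0 and standard deviation 1."""
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]

names = list(universities)
totals = {name: 0.0 for name in names}
for indicator, weight in weights.items():
    raw = [universities[n][indicator] for n in names]
    # a lower student/staff ratio is better, so flip its sign
    sign = -1 if indicator == "student_staff" else 1
    for name, z in zip(names, zscores(raw)):
        totals[name] += weight * sign * z

for name, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {total:+.2f}")
```

The point of normalizing each indicator before weighting is that no single indicator with a wide raw range can dominate the composite score, which is one plausible reason for the large shifts in this year's list.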

There are two items I would like to pick out. First, they have selected Scopus over ESI: quite a change. This will be less disadvantageous for countries with a strong publication culture in their own language; think of France, Germany and the Spanish-speaking countries, or perhaps China, Japan or Korea. The other aspect is the citation window. I favour a five-year period over a ten-year period, but they only look at "the most recent complete 5 year window", i.e. 2002-2006, whereas I would prefer 2001-2005 or, even better, 2000-2004, so that all publications will have received their fair share of citations.

Meanwhile we wait for Friday to see how all these changes will affect these popular rankings.

Reference:

Sowter, B. (2007) THES – QS World University Rankings 2007. QS TopUniversities. http://www.topuniversities.com/news/article/thes_qs_world_university_rankings_2007_basic_explanation_of_key_enhancements_in_methodology_for_2/