The role of university rankings in university marketing

So far I have not come across any proper research on the role of university rankings in relation to university marketing. Of course, I am aware of many instances in which the importance of university rankings has been mentioned in this respect, but evidence to substantiate these claims is rare.

I was therefore pleasantly surprised by the research of Liang-Hsuan Chen (2008), which only crossed my screen today. She found that for Asian graduate students attending Canadian universities, rankings played an important role in university selection:

Graduate students enrolled in professional programs ranked factors such as the ranking of the program and affordability of tuition with high importance in choosing a Canadian graduate school. The fact that the ranking of program was ranked with the highest importance by this group of students was in part due to the availability of program ranking information and marketing efforts (e.g., the MBA Tour) undertaken by the programs.

My impression from this piece of research is that, whether you like it or not, rankings do play a role in the perception and choices of international students when selecting a university for their graduate education. Rankings serve different purposes, as Chen explains:

Reputational ranking became a proxy for the quality of education. Although much criticized by academics for its lack of both validity and reliability, reputational ranking serves three purposes: first, it is a promotional tool for higher education institutions to recruit students; second, it is an assessing tool for international students to screen out competitive choices; and third, it is a marketing and signaling tool for students themselves after they graduate.

So it is not only important to be present in the various university rankings; you had better make sure you rank well!

References
Chen, Liang-Hsuan (2008). Internationalization or International Marketing? Two Frameworks for Understanding International Students' Choice of Canadian Universities. Journal of Marketing for Higher Education, 18(1): 1-33. http://dx.doi.org/10.1080/08841240802100113 (subscription required)

New webometrics ranking of world universities released

Of all the university rankings available, the Webometrics Ranking of World Universities occupies an odd place: it looks only at the web performance of universities. Their rankings were updated earlier this month.

I have mixed feelings about their approach, but it is a prelude to newer rankings beyond those based solely on scholarly output and impact. However, I think their approach needs more time and better tools than are currently available. The leading researchers in this field are in the group of Mike Thelwall. Their measurements are based on their own crawlers and tools to explore, measure and investigate the academic Web, so they can understand and interpret their results completely. The Cybermetrics Lab (CINDOC), which produces the Webometrics rankings, uses publicly available tools such as Yahoo!, Google and Exalead, over which they have no control. More importantly, they don't know how these results come about. Another problem with Google, for example, is that the reported numbers of search results are notoriously unreliable: they depend, among other things, on the time of day, web traffic, server load at Google and the data centre that happens to answer the query.
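To see how shaky such figures are, here is a minimal sketch of the kind of repeated sampling one could do. The API endpoint and its response format are purely hypothetical placeholders (none of the real engines exposes such a simple interface); the point is only that the same query, repeated over a day, tends to return different estimated totals.

```python
import time
import urllib.parse
import urllib.request

# Hypothetical endpoint that returns an estimated hit count as a bare integer.
# Real search engines (Google, Yahoo!, Exalead) have their own, often
# restricted, APIs, and none of them guarantees stable estimates.
API = "https://api.example-search.test/count?q="  # placeholder, not a real service
QUERY = "site:wur.nl"

def reported_hits(query: str) -> int:
    """Fetch the estimated number of results for a query from the placeholder API."""
    with urllib.request.urlopen(API + urllib.parse.quote(query)) as resp:
        return int(resp.read())

if __name__ == "__main__":
    samples = []
    for _ in range(6):
        samples.append(reported_hits(QUERY))
        time.sleep(4 * 3600)  # sample every four hours over one day
    print("reported hit counts:", samples)
    print("spread between highest and lowest:", max(samples) - min(samples))
```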

So for the moment we have to take these results with a spoonful of salt rather than a pinch. It is also a question what exactly is being measured. Take for instance the size of university websites. In Utrecht all staff and students appear to have personal webpages on the university website. These are all included in the count, whether they actually contain any useful information or not. At our university the mainstay of the indexed webpages consists of catalogue records from the library. I really wonder whether you want to compare these apples and pears.

As for the rich files measure, I really wonder whether they have been able to harvest all the material deposited in our repository. Looking at statistics such as those provided by OAIster on OA harvestable documents, Wageningen University has one of the larger content-rich repositories in the Netherlands. In the Webometrics ranking, however, we are the bottom fish on this measure in the Netherlands. The fact that we use proprietary software while still adhering to the OAI-PMH protocol, or that the repository is hosted as a directory (http://library.wur.nl/way), should not affect the rankings the way it apparently does at the moment.
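For what it is worth, checking what a harvester actually sees is straightforward with the OAI-PMH protocol itself. The sketch below counts the record headers exposed by an OAI-PMH endpoint; the base URL is an assumption on my part, not the verified address of our repository's interface.

```python
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

# Count the records a harvester such as OAIster would see at an OAI-PMH endpoint.
# The base URL below is an assumption; the repository's actual OAI-PMH interface
# may live elsewhere under library.wur.nl.
BASE_URL = "http://library.wur.nl/oai"  # assumed endpoint, not verified
OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"

def count_records(base_url: str, metadata_prefix: str = "oai_dc") -> int:
    """Walk ListIdentifiers pages via resumption tokens and count record headers."""
    params = {"verb": "ListIdentifiers", "metadataPrefix": metadata_prefix}
    total = 0
    while True:
        url = base_url + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as resp:
            tree = ET.parse(resp)
        total += len(tree.findall(f".//{OAI_NS}header"))
        token = tree.find(f".//{OAI_NS}resumptionToken")
        # An absent or empty resumptionToken marks the last page of the list.
        if token is None or not (token.text or "").strip():
            return total
        params = {"verb": "ListIdentifiers", "resumptionToken": token.text.strip()}

if __name__ == "__main__":
    print("harvestable records:", count_records(BASE_URL))
```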

On other measures they are completely vague about what exactly is measured. Take for instance the Google Scholar measure. They state: “Google Scholar provides the number of papers and citations for each academic domain. These results from the Scholar database represent papers, reports and other academic items.” How do they combine publications and citations into a single measure? It is not explained. Google never returns more than the first 1000 results, so how do they arrive at all citations for an institute? How did they search for the name of an institute? Did they include medical teaching hospitals with the university?
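For comparison, this is how a rank-based combination could work in principle. The weights below follow the published Webometrics methodology as I understand it (visibility 50%, size 20%, rich files 15%, Scholar 15%), but both the weights and the institutional numbers are assumptions for illustration only; how papers and citations are merged into the single Scholar figure remains exactly the open question.

```python
# Sketch of a weighted rank-based composite, as the Webometrics ranking
# appears to use. All numbers are hypothetical; lower composite = better.

def rank(values, higher_is_better=True):
    """Return 1-based ranks for a dict of {institution: value}."""
    ordered = sorted(values, key=values.get, reverse=higher_is_better)
    return {inst: i + 1 for i, inst in enumerate(ordered)}

def composite(size, visibility, rich_files, scholar, weights=(0.2, 0.5, 0.15, 0.15)):
    """Weighted sum of the four indicator ranks (size, visibility, rich files, Scholar)."""
    r_size, r_vis = rank(size), rank(visibility)
    r_rich, r_sch = rank(rich_files), rank(scholar)
    w_s, w_v, w_r, w_g = weights
    return {
        inst: w_s * r_size[inst] + w_v * r_vis[inst]
              + w_r * r_rich[inst] + w_g * r_sch[inst]
        for inst in size
    }

# Hypothetical figures for three institutions, just to show the mechanics.
size = {"A": 900_000, "B": 400_000, "C": 650_000}
visibility = {"A": 120_000, "B": 150_000, "C": 80_000}
rich_files = {"A": 30_000, "B": 45_000, "C": 12_000}
scholar = {"A": 25_000, "B": 18_000, "C": 22_000}

print(sorted(composite(size, visibility, rich_files, scholar).items(), key=lambda x: x[1]))
```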

I do use these rankings for one purpose though: to push for the improvement of our university and library website wherever possible, and in some respects that is badly needed. But I would really like to take these rankings more seriously, and for the moment I can't. That they have been updated again should be the message of this post, since their blog has been defunct for quite some time already. A pity.

THES rankings, manipulation or optimization?

From the university newspaper of Groningen we get some interesting insights into the way Groningen University has optimized its data for submission to the THES rankings. Although he deemed the rankings unimportant, the rector nevertheless wanted Groningen University to score better in the THES-QS rankings. For the rector, the first appearance in the top 200 of the THES rankings, at 173 to be exact, was a good reason to celebrate with his staff.

What did they do? They concentrated on reporting the most favourable number of students. The number of PhD students was a figure they could play with: in the Netherlands PhD students are usually employed as faculty, although by international standards they are students as well. They also considered the position of the researchers in the university hospital. Counting them would increase the number of staff considerably and thus lower the student/faculty ratio, but it would also affect the number of citations per research staff: an increase in staff numbers lowers the citations per staff, which is detrimental to the overall performance. Unless, that is, they could guarantee that citations to hospital staff were included in the citation counts as well.
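A back-of-the-envelope calculation makes the trade-off clear. All numbers below are hypothetical, chosen only to show how adding hospital researchers moves the two ratios in opposite directions unless their citations come along.

```python
# Hypothetical scenario exercise: student/staff ratio vs. citations/staff.

def ratios(students, staff, citations):
    """Return (students per staff member, citations per staff member)."""
    return students / staff, citations / staff

students = 24_000
core_staff, hospital_staff = 3_000, 1_500
core_citations, hospital_citations = 60_000, 40_000

scenarios = {
    "core staff only": ratios(students, core_staff, core_citations),
    "with hospital staff, core citations only": ratios(
        students, core_staff + hospital_staff, core_citations),
    "with hospital staff and their citations": ratios(
        students, core_staff + hospital_staff, core_citations + hospital_citations),
}

for name, (ssr, cps) in scenarios.items():
    print(f"{name}: students/staff = {ssr:.1f}, citations/staff = {cps:.1f}")
```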

So in Groningen they worked through some scenarios of student numbers, staff numbers, student/staff ratio and citations/staff ratio to arrive at the combination that best enhanced their performance. I really do wonder whether the contact between Groningen and QS, the consultants who compile the rankings, also led to an improved search for citations by including the university hospital in the university's results. It is known from research by CWTS that searching for papers from all parts of a university is notoriously difficult, especially when it comes to papers produced by staff of the teaching hospitals. In Groningen they have the feeling that their contacts with QS helped. Well, at least it resulted in a nice picture on their university profile page.

Optimization or manipulation? It is a thin line. If only you could make sure that all staff of your university used the proper name of the institution in their author affiliations, the university would gain a lot.

Vrije Universiteit in the top 400

In my previous post on the THES university rankings 2007, I wrote that I suspected a mix-up of the names of the University of Amsterdam and the Free University of Amsterdam. However, this appears not to be the case. In the recently released top 400 the Vrije Universiteit Amsterdam is ranked 304th in the list of top universities. We can only guess where Tilburg University is listed.

Hat tip: University Ranking Watch

Dutch universities in the THES university rankings 2007

The THES university rankings 2007 are now officially released. The ranking of Dutch universities is as follows, with the overall rank in the THES top 200 and the previous year's ranking in brackets:

  1. University of Amsterdam 48 (69)
  2. Delft 63 (86)
  3. Leiden 84 (90)
  4. Utrecht 89 (95)
  5. Maastricht 111 (172)
  6. Eindhoven 130 (67)
  7. Wageningen 148 (97)
  8. Erasmus 163 (92)
  9. Groningen 173 (232)
  10. Twente 185 (115)
  11. Radboud 195 (137)

All in all, 11 out of the 13 regular Dutch universities are listed in the top 200. In the case of the missing Free University I really wonder to what extent there might be a mix-up of the two universities based in Amsterdam. For Tilburg it is rather unfortunate that they didn't make the list, but since Tilburg is mainly a humanities and social sciences university, this can be explained. As soon as I get more details, I will post more.