Of all the university rankings available, the Webometrics Ranking of World Universities occupies an odd place. It looks only at the web presence of a university. The rankings were updated earlier this month.
I have mixed feelings about their approach, but it is a prelude to newer rankings beyond those based solely on scholarly output and impact. I do think, however, that their approach needs more time and better tools than are available at the moment. The leading researchers in this field are in the group of Mike Thelwall. Their measurements are based on their own crawlers and tools to explore, measure and investigate the academic Web, so they can understand and interpret their results completely. The Cybermetrics Lab (CINDOC), which produces the Webometrics rankings, uses publicly available tools such as Yahoo!, Google and Exalead, over which they have no control. More importantly, they don't know how these results come about at all. Another problem with Google, for example, is that the reported numbers of search results are notoriously unreliable: they depend, amongst other things, on the time of day, web traffic, the server load at Google and the data centre that happens to serve the query.
So for the moment we have to take these results with a spoonful of salt rather than a pinch. It is also questionable what exactly is being measured. Take for instance the size of university websites. In Utrecht, all staff and students appear to have personal webpages on the university website. These are all included in the count, whether they actually contain any useful information or not. At our university, the mainstay of the indexed webpages consists of catalogue records from the library. I really wonder whether you want to compare these apples and pears.
As for the rich-files measure, I really wonder whether they have been able to harvest all the material deposited in our repository. Looking at the statistics provided by OAIster on OA harvestable documents, Wageningen University has one of the larger content-rich repositories in the Netherlands. Yet in the Webometrics ranking we are at the bottom for this measure in the Netherlands. The fact that we use proprietary software while still adhering to the OAI-PMH protocol, or that the repository is hosted as a directory (http://library.wur.nl/way), should not affect the rankings the way it currently does.
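The point of OAI-PMH is precisely that a harvester depends only on the protocol's XML responses, not on where or how the repository is hosted. A minimal sketch of parsing a ListIdentifiers response (the sample response and its record identifiers are made up for illustration, not taken from any real repository):

```python
# Sketch: extracting record identifiers from an OAI-PMH
# ListIdentifiers response. A compliant harvester needs only this
# XML, regardless of whether the endpoint lives at a domain root
# or in a sub-directory such as /way.
import xml.etree.ElementTree as ET

OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"

# Hypothetical response, shaped as the OAI-PMH spec prescribes.
SAMPLE_RESPONSE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListIdentifiers>
    <header><identifier>oai:example.org:1001</identifier></header>
    <header><identifier>oai:example.org:1002</identifier></header>
  </ListIdentifiers>
</OAI-PMH>"""

def list_identifiers(xml_text):
    """Return all record identifiers in a ListIdentifiers response."""
    root = ET.fromstring(xml_text)
    return [el.text for el in root.iter(f"{OAI_NS}identifier")]

print(list_identifiers(SAMPLE_RESPONSE))
```

If a crawler relies on this protocol, the hosting path of the repository should be irrelevant to what it can harvest.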
For other measures they are completely vague about what exactly is measured. Take for instance the Google Scholar measure. They state: “Google Scholar provides the number of papers and citations for each academic domain. These results from the Scholar database represent papers, reports and other academic items.” How do they combine publications and citations into a single measure? It is not explained. Google never returns more than the first 1000 results, so how do they arrive at all citations for an institute? How did they search for the name of an institute? Did they include medical teaching hospitals with the university?
I do use these rankings for one purpose, though: to push for the improvement of our university and library websites wherever possible. In some respects that is badly needed. But I would really like to take these rankings more seriously; for the moment I can't. The rankings have been updated again, and that should be the message of this post, since their blog has been defunct for quite some time already. A pity.