Scoring universities is flawed yet still highly influential
Rankings always feel a bit like Whose Line Is It Anyway? – everything is made up and the points don’t matter. Or do they? Last month, the QS World University Rankings, arguably the go-to source for university league tables, came out. And, if you’re at the Massachusetts Institute of Technology, Berkeley, Cambridge, Harvard or Oxford, good news – you’re at one of the best universities in the world for chemistry.
This is hardly a revelation, and the exact placing is unlikely to affect a career. No one will complain: ‘Oh, your degree is only from Harvard? Well, that’s no good, it’s only the fourth best in the world.’ Nor do these rankings matter when it comes to survival. Berkeley was under threat of closure only last year, a decision based on costs, not its supposed position as the second best chemistry department in the world.
Any scientist can pick holes in the QS system. Its scores are based on four factors: academic reputation and employer reputation are subjective, while citations per paper and h-index citations are measures likely to drive pressure to publish. The system also pigeonholes the different courses on offer: Cambridge students study chemistry as part of a natural sciences degree, a distinction not reflected by QS. Other factors not taken into consideration are course length, syllabus content, fees, or whether the campus has a half-decent bar. Yet the very nature of a generic ranking means it’s impossible to encapsulate the multitude of variables that go into deciding which university is worth your time. Reputation and publication data are imperfect measures, but they’re probably the best indicators available.
There is also the uncomfortable truth that rankings matter. We read them, we talk about them and universities use them extensively to recruit students and staff. In 2014 the European University Association asked 171 higher education institutions in 29 European countries if rankings mattered. It found that roughly two in three universities had changed their institutional policies because of rankings, with most considering the results a benchmark against which to compare themselves with rivals. These institutional policy changes were almost always focused on one thing: international students. Indeed, 91% of universities said they paid special attention to them in strategic planning – more than to student satisfaction, research income, publications or employment after graduation. Moreover, the report found that governments are particularly interested in boosting their national prestige by having top universities – a juicy funding carrot for those who do well.
The flaws are easy to spot, but university rankings are going to drive policy for some time to come. Everything may be made up, but it seems the points certainly do matter.