Predictions unveiled for how UK universities will fare in the latest assessment exercise


Can metrics assess research quality more accurately than an in-depth peer review exercise?

In terms of funding and reputation, the UK’s Research Excellence Framework (REF) exercise is a vital event in the academic calendar. Now a team of researchers has made predictions about the results of the latest assessment using citation-based metrics, rather than peer review. They will know how well their predictions compare to the real thing when the latest REF results are published on Thursday.

The REF, which replaced the Research Assessment Exercise (RAE), uses peer review to assess the quality of research at UK higher education institutions, usually by examining a selection of academic papers. Such assessments take place every four to seven years; the last, the RAE, was held in 2008, making this year's REF the first under the new framework. The government then uses the results to allocate funding and help determine research rankings.

The REF is an expensive, time-consuming and disruptive exercise, says Ralph Kenna of Coventry University, who was part of an international team that carried out the work. Research managers and policymakers have suggested replacing it with a simple system based on metrics such as citations. ‘[They] are in favour because metrics are cheap, fast and easy to (mis)understand. But academics are strongly against this because metrics are crude and only give a certain angle on things.’

The researchers studied the correlations between two departmental impact metrics and the scores from RAE 2008 for biology, chemistry, physics and sociology. The first was a version of the Hirsch index known as the departmental h-index, which attempts to measure the productivity and citation impact of a department: a department scores h if h of its papers have each been cited at least h times. The second was the normalised citation impact (NCI), an indicator of a department's academic impact within a given discipline.
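To make the measure concrete, here is a minimal sketch of how a departmental h-index could be computed, assuming all you have is a list of citation counts, one per paper authored by the department's staff. The function name and the figures are illustrative, not drawn from the study itself.

def departmental_h_index(citations):
    # A department scores h if h of its papers have each been cited
    # at least h times. Sort counts in descending order and find the
    # last rank at which the citation count still meets or exceeds it.
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Example: papers cited 25, 8, 5, 3 and 3 times give h = 3,
# since three papers have at least three citations each.
print(departmental_h_index([25, 8, 5, 3, 3]))  # prints 3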

Of the two, the researchers found that the h-index tracked past peer review results more closely, and the correlation between h-indices and RAE 2008 scores was strongest in chemistry. The team then calculated departmental h-indices for different institutions from their outputs in these subjects in the run-up to REF 2014, and ranked them. On this basis, they predict that the 2014 REF will rank Imperial College London joint top chemistry department alongside the University of Cambridge, followed by the universities of Oxford, Manchester and Liverpool.
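As a rough illustration of the kind of check involved, one could compare a metric against peer review scores using a rank correlation; Spearman's rho is a natural choice, though the team's exact statistical approach may differ and the figures below are invented for the example.

from scipy.stats import spearmanr

# Hypothetical departmental h-indices and RAE 2008 quality scores
h_indices = [32, 28, 27, 21, 18, 15]
rae_scores = [2.85, 2.90, 2.60, 2.40, 2.45, 2.10]

rho, p_value = spearmanr(h_indices, rae_scores)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")

A rho close to 1 would mean the metric reproduces the peer review ordering almost exactly; the weak correlations the team reports sit well below that.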

Poor predictors

‘We have shown that metrics do not correlate sufficiently well with RAE; there is some positive correlation but it is weak,’ says Kenna. ‘If the correlations are poor, then this is a nail in the coffin for any attempt to substitute peer review with metrics. If correlations are good, one can perhaps investigate further and try to further improve metrics.’

Kenna and his colleagues are against a purely metric system. ‘If citation-based indicators are introduced, managers will force researchers to try to maximise them,’ says Kenna. ‘Then researchers will chase metrics by doing fashionable research only. This will curtail academic freedom and undermine curiosity driven research.’

‘The REF is vitally important to us, both tangibly as a funding formula and intangibly as an indicator of status,’ says Tom Welton, head of chemistry at Imperial College London, UK. He is a big fan of metrics for the analysis of chemistry. However, he recognises that they don’t necessarily work for all subjects, particularly the arts. ‘The obvious solution is to recognise that there is no good reason to judge all subjects using precisely the same tools. What matters is that there is an equivalence of rigour in all of the methods applied. If I were designing the next REF, I would generate a suite of tools that could be used from which individual subject panels could select a set number. So if chemistry wanted to use h-index, income, number of PhD students and impact then it can, but history of art could pick a different selection.’

Rudolf Allemann, head of the department of chemistry at Cardiff University, UK, does not like the idea of an assessment based solely on bibliometrics. ‘Some of the most important achievements in the past would have been missed or not properly assessed [if decisions were made purely on metrics],’ he says. ‘H-indices over-stress certain fields, while other smaller but central aspects of the discipline do worse.’