A snapshot of the REF results on 18 December 2014
By Nicholas Allen and Oliver Heath
The results of the 2014 Research Excellence Framework (REF) are finally out. Like most members of the Political Studies Association (PSA) up and down the country, we had been waiting apprehensively for the outcome. Would our discipline have performed better than in 2008? Would our own department have improved its position in the rankings? The REF process has its critics, of course; indeed, there are many PSA members who have voiced their misgivings, some publicly, many privately. Nevertheless, the REF matters. When it comes to authoritative evaluations of research quality, it is the only show in town.
For anyone who has not worked in a higher education institution over the last five years, the REF is the official audit of research across the sector. It is the latest incarnation of what used to be called Research Assessment Exercises (RAEs). For REF2014, potentially every academic department in the UK was entered into a corresponding Unit of Assessment. For most PSA members, the relevant Unit of Assessment was sub-panel 21, ‘Politics and International Studies’. Each department, or more accurately each submission, was then evaluated on the basis of academic outputs, research environment and impact.
Academic outputs are the monographs, journal articles, book chapters and so on that we routinely produce in the course of research. Academics selected to participate in REF2014 by their institution generally submitted four pieces of work, and each piece was rated as 4*, 3*, 2*, 1* or unclassified. These individual evaluations were then aggregated for each submission and presented in the form of percentages: the higher the percentage of outputs rated 4*, the stronger the department. In addition to this individual-level evaluation, departments were also evaluated collectively on the ‘vitality and sustainability’ of their research environment and on the ‘reach’ and ‘significance’ of their research impact. Once again, results were presented in the form of percentages, with a certain share of both the environment and the impact being assessed as 4*, 3*, 2*, 1* or unclassified. To ascertain each department’s overall research quality, evaluations of its outputs (weighted at 65 percent) were added to evaluations of its impact (weighted at 20 percent) and research environment (weighted at 15 percent).
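The weighting scheme can be sketched in a few lines of Python. The profiles below are made-up numbers for illustration only, not data from any real submission:

```python
# Sketch of combining the three REF2014 quality profiles into one overall
# profile, using the published weights: outputs 65%, impact 20%,
# environment 15%. All percentages here are illustrative.

WEIGHTS = {"outputs": 0.65, "impact": 0.20, "environment": 0.15}
STARS = ["4*", "3*", "2*", "1*", "u/c"]

def overall_profile(profiles):
    """Combine per-component star profiles (percentages summing to 100)
    into a single weighted overall profile."""
    return {
        star: round(sum(WEIGHTS[c] * profiles[c][star] for c in WEIGHTS), 1)
        for star in STARS
    }

# Illustrative (invented) profiles for one submission:
example = {
    "outputs":     {"4*": 40, "3*": 40, "2*": 15, "1*": 5, "u/c": 0},
    "impact":      {"4*": 50, "3*": 30, "2*": 20, "1*": 0, "u/c": 0},
    "environment": {"4*": 60, "3*": 30, "2*": 10, "1*": 0, "u/c": 0},
}

print(overall_profile(example))
```

With these invented numbers, 45 percent of overall quality comes out at 4*, because the outputs column dominates through its 65 percent weight.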
The unambiguous good news for all PSA members is that submissions to the Politics and International Studies sub-panel performed very well. More than 68 percent of the overall research quality in our discipline was ranked as either 4* or 3*, that is, as either ‘world-leading’ or ‘internationally excellent’. When the various percentages are combined to produce a grade-point average (GPA), the overall score of 2.90 reflects a marked increase on the comparable GPA of 2.34 in the 2008 RAE. Moreover, nearly all politics departments witnessed substantial improvements on their 2008 scores, with those at Leeds, Strathclyde, Southampton, Westminster and York enjoying the biggest increases. And, on the basis of the discipline’s GPA score of 3.22 in the specific area of ‘impact’, political scientists have demonstrated that their research has real-world meaning and relevance.
For most PSA members, however, the results of REF2014 are more important for how their own department fared. After all, the evaluations will be used to determine the distribution of future research funding. The evaluations also provide a means for ranking departments, and these rankings will almost certainly influence staff and student recruitment and future research applications, not to mention staff morale.
Over the coming days and weeks, universities and departments will crunch REF numbers in various permutations in a bid to present themselves in the most flattering light. Perhaps the most straightforward way of ranking departments is either by the percentage of overall research quality deemed 4*, or by the overall GPA, which combines the percentages of 4*, 3*, 2*, 1* and unclassified research quality in a single measure. Both are reported in the Times Higher Education’s league table, with the latter being used to determine the rankings.
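The GPA itself is simply the star-weighted average of those percentages: each grade contributes its numeric value multiplied by the share of quality at that level. A minimal sketch, again with illustrative figures:

```python
# Sketch of the grade-point-average calculation: each star level (4, 3,
# 2, 1, with unclassified counting as 0) is weighted by the percentage
# of research quality awarded at that level. Numbers are illustrative.

def gpa(profile):
    """profile maps numeric grades (4, 3, 2, 1, 0) to percentages
    summing to 100; returns the grade-point average."""
    return sum(grade * pct for grade, pct in profile.items()) / 100

# An invented overall profile: 45% at 4*, 36.5% at 3*, and so on.
example = {4: 45.0, 3: 36.5, 2: 15.0, 1: 3.5, 0: 0.0}
print(round(gpa(example), 2))
```

A department with everything at 4* would score the maximum GPA of 4.00; the discipline-wide figure of 2.90 sits between ‘internationally excellent’ and ‘world-leading’ on this scale.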
The following table reports the top-ten politics departments on the basis of their GPAs. The clear winner is the University of Essex’s Department of Government, which not only recorded the highest GPA score of 3.54 but also had an amazing 68 percent of its outputs graded as 4*. This is a tremendous achievement and further cements Essex’s reputation as the leading political science department in the country. It is also worth highlighting the general stability at the top of the rankings. In addition to Essex, which came top in 2008, seven other top-ten departments in 2014 were also top-ten departments in 2008. The big mover was York’s politics department, which climbed 18 places from 26th in the last RAE.
Table 1: Rankings by Overall GPA and % 4* research
The Times Higher Education is also notable for providing a further measure for ranking departments, the so-called ‘research power’ score. This score is the product of the GPA and the number of FTE (full-time equivalent) researchers submitted by each institution. The effect, as shown in the second column of the next table, is to reward the largest departments, with King’s College London jumping to the top of the rankings thanks to the 98 FTE researchers it submitted to the Politics and International Studies sub-panel.
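Because research power is a straight product of quality and size, a large department with a middling GPA can outrank a small department with an excellent one. A one-line sketch (the GPA value paired with the 98 FTE figure is invented for illustration):

```python
# 'Research power' as used in the THE table: GPA multiplied by the
# number of FTE researchers submitted. The GPA below is illustrative.

def research_power(gpa, fte_submitted):
    """Product of quality (GPA) and scale (FTE researchers submitted)."""
    return gpa * fte_submitted

# A hypothetical large submission: GPA 3.2 across 98 FTE researchers.
print(round(research_power(3.2, 98), 1))
```

On these invented numbers the score is 313.6, comfortably ahead of a small department scoring, say, 3.5 across 20 FTE (70.0), which is precisely why the measure rewards size.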
Table 2: ‘Research power’ and ‘research intensity’ scores
Weighting departments’ scores by the number of researchers submitted brings us to one final issue surrounding REF2014 and one other way of ranking departments. There has been a great deal of ‘measurement controversy’ over how outputs were graded by the REF sub-panel; but there has also been a great deal of ‘sampling controversy’ over who was selected by departments to form part of the submission, and on what basis. As noted, institutions decided which of their staff were to participate in REF2014, and the pressure to perform well led some departments to adopt a ruthless approach to selection, including only those members of staff who were fairly certain to deliver 4* or 3* outputs. The upshot is that some institutions were highly selective, with some large departments submitting a surprisingly small proportion of their research-active staff.
Another way of ranking institutions that controls for the proportion of eligible staff who were entered has been dubbed the ‘research intensity’ score, as reported in the last column of Table 2. This score takes into account not only how many staff were submitted, but also how many staff were eligible. It therefore arguably provides a more rounded indicator of a department’s research output, since it is not biased by the selective exclusion of some members of staff. According to this measure, Essex’s Department of Government once again comes out on top, since it submitted the vast majority of its staff. However, those departments that were more selective about whom they submitted drop back a bit.
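One common way of constructing such an intensity-adjusted score is to scale the GPA by the proportion of eligible staff actually submitted. The sketch below assumes that construction; the exact formula behind the published column, and all of the numbers used here, are assumptions for illustration:

```python
# Sketch of an 'intensity-adjusted' GPA: the GPA scaled by the share of
# eligible staff who were actually submitted. Whether this matches the
# published 'research intensity' column exactly is an assumption; the
# figures below are invented.

def research_intensity(gpa, fte_submitted, staff_eligible):
    """GPA multiplied by the proportion of eligible staff submitted."""
    return gpa * (fte_submitted / staff_eligible)

# An inclusive department: GPA 3.2, submitting 45 of 50 eligible staff.
print(round(research_intensity(3.2, 45, 50), 2))
# A selective department: higher GPA of 3.5, but only 25 of 50 submitted.
print(round(research_intensity(3.5, 25, 50), 2))
```

On these invented figures the inclusive department (2.88) overtakes the selective one (1.75) despite its lower raw GPA, which is exactly the corrective effect the measure is designed to have.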
Ultimately, there is no watertight and completely unbiased way of ranking politics departments. In general, the ‘big five’ departments at Essex, the LSE, Oxford, UCL and Warwick come out top from REF2014, whatever ranking system is used. Nevertheless, the problem of sampling cannot and should not be overlooked ahead of preparations for REF2020. REF2014 was not a census of research quality by any means. Some departments sought to include as many eligible staff as possible; others, for whatever reason, did not. All of this means that not all departments were being compared equally. Policy makers and colleagues alike should consider carefully whether they want REF2020 to be based on a genuine census of research—or a select sample of outputs.