Res Judicata

Today is the day that people working in British universities have awaited, with a mixture of hope and apprehension, for several years. The results of the 2008 Research Assessment Exercise (RAE) were published at 00:01 GMT today (18th December).

I had a look just after midnight and found that the webserver had crashed, but only for a few minutes; I soon got back in and found the bad news. The relevant table for me as an astrophysicist is that for Unit of Assessment 19, which is Physics & Astronomy. Results are given as a list of numbers: the number of staff entered (not necessarily an integer, for accounting reasons), followed by the percentage of work judged by the panel to lie in each of four quality categories, explained in the following excerpt from the RAE website:

The quality profiles displayed on this website are the results of the 2008 Research Assessment Exercise (RAE2008), the sixth assessment in this current format of the quality of research conducted in UK Higher Education Institutions (HEIs). The UK funding bodies for England, Northern Ireland, Scotland and Wales will use the RAE2008 results to distribute funding for research from 2009-10.

The results follow an expert review process conducted by assessment panels throughout 2008. Research in all subjects was assessed against agreed quality standards within a common framework that recognised appropriate variations between subjects in terms of both the research submitted and the assessment criteria.

Submissions were made in a standard form that included both quantitative and descriptive elements. Full details of the contents of, and arrangements for making, submissions were published in ‘Guidance on submissions’ (RAE 03/2005).

The RAE quality profiles present in blocks of 5% the proportion of each submission judged by the panels to have met each of the quality levels defined below. Work that fell below national quality or was not recognised as research was unclassified.

4* Quality that is world-leading in terms of originality, significance and rigour.
3* Quality that is internationally excellent in terms of originality, significance and rigour but which nonetheless falls short of the highest standards of excellence.
2* Quality that is recognised internationally in terms of originality, significance and rigour.
1* Quality that is recognised nationally in terms of originality, significance and rigour.
Unclassified Quality that falls below the standard of nationally recognised work. Or work which does not meet the published definition of research for the purposes of this assessment.

The ‘international’ criterion equates to a level of excellence that it was reasonable to expect for the UOA, even though there may be no current examples of such a level in the UK or elsewhere. It should be noted that ‘national’ and ‘international’ refer to standards, not to the nature or geographical scope of particular subjects.

For my own department, the School of Physics & Astronomy at Cardiff University, I found the following

Cardiff University (32.30) 5 45 30 20

which means that we entered 32.30 people, but only 5% of the work was judged to be at the top level (4*), with 45% at 3*, 30% at 2* and 20% at 1*. On their own these figures don’t mean very much, but a quick comparison with the rest of the table shows that for us this is an enormous disappointment: we have a much lower fraction of 4* than the majority of departments, and a significantly higher fraction of 1*. These findings are very worrying.

If I were working in an English university I would be very concerned about the financial implications of these results, but it’s a bit more complicated for us here in Wales. The numbers given in the table are translated into money by the funding councils, and Wales has its own (HEFCW, as distinct from the English HEFCE); a rough sketch of the arithmetic involved is given below. There are far fewer physics departments in Wales, so we’re not competing with the bigger English ones for funding. We don’t yet know how much our research funds will be cut. It might not be as bad as if we were in England, but it’s clearly not good, and we won’t know how much dosh is involved until March 2009. It’s not just a matter of funding, either: there’s also the national and international perception of the department in the physics community.
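To make the mechanics a bit more concrete, here is a minimal sketch (in Python) of the sort of “volume times quality” arithmetic the funding councils use. The 7:3:1:0 weighting for 4* down to 1*, and the absence of any overall cash scaling, are purely illustrative assumptions on my part, not HEFCW’s published formula.

    # Illustrative sketch only: the weights (7, 3, 1, 0) for 4* down to 1*
    # are an assumption for this example, not HEFCW's actual formula.
    def quality_volume(staff, profile, weights=(7, 3, 1, 0)):
        # profile gives the (4*, 3*, 2*, 1*) percentages; the result is a
        # relative "volume x quality" number, not a cash amount.
        return staff * sum(w * p for w, p in zip(weights, profile)) / 100.0

    # Using the figures quoted in this post:
    print(quality_volume(32.30, (5, 45, 30, 20)))  # Cardiff: 64.6
    print(quality_volume(44.45, (25, 40, 30, 5)))  # Nottingham: ~144.5

Whatever the exact weights turn out to be, a profile that is bottom-heavy in 1* and 2* work clearly translates into a much smaller share.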

I can see there will be a post mortem to find out what went wrong, as most of us were confident of a much better outcome. Perhaps the format of the RAE (focussing on research papers as the measure of output) is not favourable to a department with so many instrument builders in it?

But with the economy in deep recession making further cuts in research funding likely, and our major external funder (STFC) already struggling to make ends meet, this poor showing in the RAE has cast a gloomy shadow over Christmas.

Of course many places did much better, including my old department at Nottingham, which has

University of Nottingham (44.45) 25 40 30 5

which bears an interesting comparison with Cambridge, who have

University of Cambridge (141.25) 25 40 30 5

You can see that, apart from the different numbers of staff, the profile is exactly the same. I’m sure their publicity machine will pick up on this, so I won’t be the last to mention it! Well done, Nottingham!

It will be interesting to see what the newspapers make of the new RAE results. They are significantly more complicated than previous versions, which gave just a single number for each department. The scope for flexibility in generating league tables is clearly greatly enhanced by this complexity, so we can bet the hacks will have a field day. I thought I’d get a head start by doing a straightforward ranking of my own: a simple weighted average of each quality profile (counting 4* as 4, 3* as 3, and so on), with departments sorted by the average thus obtained.
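For anyone who wants to check or vary the arithmetic, here is a minimal sketch in Python. The three profiles included are just the ones quoted above; reproducing the full table would require the complete set of figures from the RAE website.

    # Quality profiles as (4*, 3*, 2*, 1*) percentages, from the figures
    # quoted above; the full ranking needs all 42 submissions.
    profiles = {
        "Cardiff University": (5, 45, 30, 20),
        "University of Nottingham": (25, 40, 30, 5),
        "University of Cambridge": (25, 40, 30, 5),
    }

    def weighted_average(profile):
        # Weight 4* work by 4, 3* by 3, and so on down to 1* by 1;
        # unclassified work would carry weight 0 and so drops out.
        return sum(w * p for w, p in zip((4, 3, 2, 1), profile)) / 100.0

    # Sort departments by descending weighted average and print the ranking.
    ranked = sorted(profiles, key=lambda d: weighted_average(profiles[d]),
                    reverse=True)
    for place, dept in enumerate(ranked, start=1):
        print(f"{place}. {dept} {weighted_average(profiles[dept]):.2f}")

Applying the same recipe to the whole table produces the following ranking: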

1. Lancaster University 2.9
2. University of Bath 2.85
3. University of Cambridge 2.85
4. University of Nottingham 2.85
5. University of St Andrews 2.85
6. University of Edinburgh 2.8
7. University of Durham 2.75
8. Imperial College London 2.75
9. University of Sheffield 2.75
10. University College London 2.75
11. University of Glasgow 2.75
12. University of Birmingham 2.7
13. University of Exeter 2.7
14. University of Sussex 2.7
15. University of Bristol 2.65
16. University of Liverpool 2.65
17. University of Oxford 2.65
18. University of Southampton 2.65
19. Heriot-Watt University 2.65
20. University of Hertfordshire 2.6
21. University of Manchester 2.6
22. University of Warwick 2.6
23. University of York 2.6
24. King’s College London 2.55
25. University of Leeds 2.55
26. University of Leicester 2.55
27. Royal Holloway, University of London 2.55
28. University of Surrey 2.55
29. Swansea University 2.55
30. Queen Mary, University of London 2.5
31. Queen’s University Belfast 2.5
32. Loughborough University 2.45
33. Liverpool John Moores University 2.4
34. University of Strathclyde 2.35
35. Cardiff University 2.35
36. University of Brighton 2.3
37. University of Central Lancashire 2.3
38. Keele University 2.25
39. Armagh Observatory 2.25
40. University of Kent 2.2
41. Aberystwyth University 1.95
42. University of the West of Scotland 1.8

So you can see we are languishing at 35th place out of 42.

This is supposed to be the last RAE, and we don’t know what is going to replace it. I don’t at all object to the principle that research funding should be peer-assessed, but this particular exercise was enormously expensive in terms of the effort universities spent preparing for it, not to mention the ridiculous burden placed on the panels of having to read all those papers.

Responses to “Res Judicata”

  1. Michael Merrifield Says:

    Of course many places did much better, including my old department at Nottingham

    We did? Gosh! No-one around here has even mentioned it…

  2. Yes, it’s interesting. All the places I used to work have gone rocketing up the table and Cardiff has gone down. The inference is obvious.

