Archive for League Tables

More Worthless University Rankings

Posted in Bad Statistics, Education on September 6, 2017 by telescoper

The Times Higher World University Rankings were released this week. The main table can be found here and the methodology used to concoct them here.

Here I wish to reiterate the objection I made last year and the year before that to the way these tables are manipulated year on year to create an artificial “churn” that renders them unreliable and impossible to interpret in any objective way. In other words, they’re worthless. This year the narrative text includes:

This year’s list of the best universities in the world is led by two UK universities for the first time. The University of Oxford has held on to the number one spot for the second year in a row, while the University of Cambridge has jumped from fourth to second place.

Overall, European institutions occupy half of the top 200 places, with the Netherlands and Germany joining the UK as the most-represented countries. Italy, Spain and the Netherlands each have new number ones.

Another notable trend is the continued rise of China. The Asian giant is now home to two universities in the top 30: Peking and Tsinghua. The Beijing duo now outrank several prestigious institutions in Europe and the US. Meanwhile, almost all Chinese universities have improved, signalling that the country’s commitment to investment has bolstered results year-on-year.

In contrast, two-fifths of the US institutions in the top 200 (29 out of 62) have dropped places. In total, 77 countries feature in the table.

These comments are all predicated on the assumption that any changes since the last tables represent changes in data (which in turn are assumed to be relevant to how good a university is) rather than changes in the methodology used to analyse that data. Unfortunately, every single year the Times Higher changes its methodology. This time we are told:

This year, we have made a slight improvement to how we handle our papers per academic staff calculation, and expanded the number of broad subject areas that we use.

What has been the effect of these changes? We are not told. The question that must be asked is how we can be sure that any change in league table position for an institution from year to year represents a change in “performance” rather than a change in the way the metrics are constructed and/or combined. Would you trust the outcome of a medical trial in which the response of two groups of patients (e.g. one given medication and the other placebo) were assessed with two different measurement techniques?

There is an obvious and easy way to test for the size of this effect, which is to construct a parallel set of league tables, with this year’s input data but last year’s methodology, which would make it easy to isolate changes in methodology from changes in the performance indicators. The Times Higher – along with other purveyors of similar statistical twaddle – refuses to do this. No scientifically literate person would accept the result of this kind of study unless the systematic effects can be shown to be under control. There is a very easy way for the Times Higher to address this question: all they need to do is publish a set of league tables using, say, the 2016/17 methodology and the 2017/18 data, for comparison with those constructed using this year’s methodology on the 2017/18 data. Any differences between these two tables will give a clear indication of the reliability (or otherwise) of the rankings.
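For what it’s worth, the comparison is trivial to carry out. Here is a minimal sketch, in Python, of the kind of check I have in mind; the metric names, weights and data are entirely made up for illustration and are not the Times Higher’s actual indicators or weightings. The idea is simply to rank the same institutions under two different weighting schemes and count how much “churn” is generated by the change of recipe alone.

```python
# Sketch of the "parallel tables" test: identical input data, two different
# weighting schemes. All metric names, weights and data are invented for
# illustration -- they are NOT the Times Higher's real methodology.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 200  # hypothetical number of institutions

# The same underlying data feed both tables (already normalised to 0-100).
data = pd.DataFrame({
    "teaching":  rng.uniform(40, 100, n),
    "research":  rng.uniform(40, 100, n),
    "citations": rng.uniform(40, 100, n),
}, index=[f"University {i}" for i in range(n)])

weights_old = {"teaching": 0.30, "research": 0.30, "citations": 0.40}
weights_new = {"teaching": 0.25, "research": 0.30, "citations": 0.45}

def rank_table(df, weights):
    """Composite score = weighted sum of metrics; rank 1 = highest score."""
    score = sum(w * df[m] for m, w in weights.items())
    return score.rank(ascending=False).astype(int)

rank_old = rank_table(data, weights_old)   # "last year's" recipe
rank_new = rank_table(data, weights_new)   # "this year's" recipe

moved = int((rank_old != rank_new).sum())
print(f"{moved} of {n} institutions change rank on identical data,")
print("purely because the weights were changed.")
```

Even this toy version makes the point: with identical inputs, tweaking the weights alone is enough to move institutions up and down the table, so a published ranking change tells you nothing unless the methodology is held fixed.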

I challenged the Times Higher to do this last year, and they refused. You can draw your own conclusions about why.

P.S. For the record, Cardiff University is 162nd in this year’s table, a rise of 20 places on last year. My former institution, the University of Sussex, is up two places to joint 147th. Whether these changes are anything other than artifacts of the data analysis I very much doubt.

Why Universities should ignore League Tables

Posted in Bad Statistics, Education on January 12, 2017 by telescoper

Very busy day today but I couldn’t resist a quick post to draw attention to a new report by an independent think tank called the Higher Education Policy Institute (PDF available here; high-level summary there). It says a lot of things that I’ve discussed on this blog already and I agree strongly with most of the conclusions. The report is focused on the international league tables, but much of what it says (in terms of methodological criticism) also applies to the national tables. Unfortunately, I doubt if this will make much difference to the behaviour of the bean-counters who have now taken control of higher education, for whom strategies intended to ‘game’ position in these largely bogus tables seem to be the main focus of their policy, rather than the pursuit of teaching and scholarship, which is what universities should actually be for.

Here is the introduction to the high-level summary:

Rankings of global universities, such as the THE World University Rankings, the QS World University Rankings and the Academic Ranking of World Universities claim to identify the ‘best’ universities in the world and then list them in rank order. They are enormously influential, as universities and even governments alter their policies to improve their position.

The new research shows the league tables are based almost exclusively on research-related criteria and the data they use are unreliable and sometimes worse. As a result, it is unwise and undesirable to give the league tables so much weight.

Later on we find some recommendations:

The report considers the inputs for the various international league tables and discusses their overall weaknesses before considering some improvements that could be made. These include:

  • ranking bodies should audit and validate data provided by universities;
  • league table criteria should move beyond research-related measures;
  • surveys of reputation should be dropped, given their methodological flaws;
  • league table results should be published in more complex ways than simple numerical rankings; and
  • universities and governments should not exaggerate the importance of rankings when determining priorities.

No doubt the purveyors of these rankings – I’ll refrain from calling them “rankers” – will mount a spirited defence of their business, but I agree with the view expressed in this report that, as they stand, these league tables are at best meaningless and at worst damaging.

The Worthless University Rankings

Posted in Bad Statistics, Education on September 23, 2016 by telescoper

The Times Higher World University Rankings were released this week. The main table can be found here and the methodology used to concoct them here.

Here I wish to reiterate the objection I made last year to the way these tables are manipulated year on year to create an artificial “churn” that renders them unreliable and impossible to interpret in an objective way. In other words, they’re worthless. This year, editor Phil Baty has written an article entitled Standing still is not an option in which he makes a statement that “the overall rankings methodology is the same as last year”. Actually it isn’t. In the page on methodology you will find this:

In 2015-16, we excluded papers with more than 1,000 authors because they were having a disproportionate impact on the citation scores of a small number of universities. This year, we have designed a method for reincorporating these papers. Working with Elsevier, we have developed a new fractional counting approach that ensures that all universities where academics are authors of these papers will receive at least 5 per cent of the value of the paper, and where those that provide the most contributors to the paper receive a proportionately larger contribution.

So the methodology just isn’t “the same as last year”. In fact every year that I’ve seen these rankings there’s been some change in methodology. The change above at least attempts to improve on the absurd decision taken last year to eliminate from the citation count any papers arising from large collaborations. In my view, membership of large world-wide collaborations is in itself an indicator of international research excellence, and such papers should if anything be given greater not lesser weight. But whether you agree with the motivation for the change or not is beside the point.
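For concreteness, here is a short sketch of one possible reading of the fractional counting scheme described in the quote above. The function and the example numbers are my own illustrative assumptions, not the actual Elsevier/Times Higher implementation: each institution’s share of a huge paper is taken to be proportional to its number of authors, subject to a floor of 5 per cent of the paper’s value.

```python
# One plausible reading of the quoted "fractional counting" scheme -- an
# assumption for illustration, NOT the actual Elsevier/Times Higher method.
def fractional_shares(authors_per_institution, floor=0.05):
    """Share of one paper's citation credit per institution: proportional
    to author count, but never below the stated 5 per cent floor."""
    total = sum(authors_per_institution.values())
    return {inst: max(n / total, floor)
            for inst, n in authors_per_institution.items()}

# A hypothetical 1,000-author paper spread over four institutions.
paper = {"CERN": 600, "Institution A": 300, "Institution B": 95, "Institution C": 5}
shares = fractional_shares(paper)
for inst, share in shares.items():
    print(f"{inst}: {share:.1%} of the paper's value")

# Note that the floored shares sum to more than 100% (here 104.5%), so the
# precise normalisation matters -- exactly the sort of detail the published
# description leaves unspecified.
print(f"Total credit assigned: {sum(shares.values()):.1%}")
```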

The real question is how we can be sure that any change in league table position for an institution from year to year is caused by a change in “performance” rather than by these methodological tweaks, i.e. by changes in the metrics themselves rather than by changes in the way they are combined. Would you trust the outcome of a medical trial in which the response of two groups of patients (e.g. one given medication and the other placebo) were assessed with two different measurement techniques?

There is an obvious and easy way to test for the size of this effect, which is to construct a parallel set of league tables, with this year’s input data but last year’s methodology, which would make it easy to isolate changes in methodology from changes in the performance indicators. The Times Higher – along with other purveyors of similar statistical twaddle – refuses to do this. No scientifically literate person would accept the result of this kind of study unless the systematic effects can be shown to be under control. There is a very easy way for the Times Higher to address this question: all they need to do is publish a set of league tables using, say, the 2015/16 methodology and the 2016/17 data, for comparison with those constructed using this year’s methodology on the 2016/17 data. Any differences between these two tables will give a clear indication of the reliability (or otherwise) of the rankings.

I challenged the Times Higher to do this last year, and they refused. You can draw your own conclusions about why.

Rank Nonsense

Posted in Bad Statistics, Education, Politics on September 8, 2016 by telescoper

It’s that time of year when international league tables (also known as “World Rankings”)  appear. We’ve already had the QS World University Rankings and the Shanghai (ARWU) World University Rankings. These will soon be joined by the Times Higher World Rankings, due out on 21st September.

A lot of people who should know a lot better give these league tables far too much attention. As far as I’m concerned they are all constructed using extremely suspect methodologies whose main function is to amplify small statistical variations into something that looks significant enough to justify constructing  a narrative about it. The resulting press coverage usually better reflects a preconceived idea in a journalist’s head than any sensible reading of the tables themselves.

A particularly egregious example of this kind of nonsense can be found in this week’s Guardian. The offending article is entitled “UK universities tumble in world rankings amid Brexit concerns”. Now I make no secret of the fact that I voted “Remain” and that I do think BrExit (if it actually happens) will damage UK universities (as well as everything else in the UK). However, linking the changes in the QS rankings to BrExit is evidently ridiculous: all the data were collected before the referendum on 23rd June anyway! In my opinion there are enough good arguments against BrExit without trying to concoct daft ones.

In any case these tables do not come with any estimate of the likely statistical variation from year to year in the metrics used to construct them, which makes changes impossible to interpret. If only the compilers of these tables would put error bars on the results! Interestingly, my former employer, the University of Sussex, has held its place exactly in the QS rankings between 2015 and 2016: it was ranked 187th in the world in both years. However, the actual score corresponding to these two years was 55.6 in 2015 and 48.4 in 2016. Moreover, Cambridge University fell from 3rd to 4th place this year but its score only changed from 98.6 to 97.2. I very much doubt that is significant at all, but it’s mentioned prominently in the subheading of the Guardian piece:

Uncertainty over research funding and immigration rules blamed for decline, as Cambridge slips out of top three for first time.

Actually, looking closer, I find that Cambridge was joint 3rd in 2015 and is 4th this year. Over-interpretation, or what?
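To see why error bars matter, here is a toy simulation of how a fall of the size Cambridge experienced could arise from measurement noise alone. The assumed one-point standard error on a published score is my own invention, not anything released by QS; the point is only that changes of this size are uninterpretable without some such estimate.

```python
# Toy illustration: how often does pure measurement noise produce a drop of
# 1.4 points (the size of Cambridge's fall from 98.6 to 97.2)?
# The 1-point standard error is an assumption for illustration only.
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0            # assumed standard error on a published score
n_trials = 100_000

# Two years' scores for an institution whose "true" quality never changes.
scores = 98.0 + rng.normal(0.0, sigma, size=(n_trials, 2))
drops = scores[:, 0] - scores[:, 1]

p = float(np.mean(drops >= 1.4))
print(f"Fraction of trials with a drop of at least 1.4 points: {p:.2f}")
```

On that admittedly invented assumption about the errors, a fall like Cambridge’s is nothing out of the ordinary, which is exactly why the compilers should publish uncertainties alongside the scores.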

To end with, I can’t resist mentioning that the University of Sussex is in the top 150 in the Shanghai Rankings for Natural and Mathematical Sciences this year, having not been in the top 200 last year. This stunning improvement happened while I was Head of School for Mathematical and Physical Sciences so it clearly can not be any kind of statistical fluke but is entirely attributable to excellent leadership. Thank you for your applause.

The Rising Stars of Sussex Physics

Posted in Bad Statistics, Biographical, Education on July 28, 2016 by telescoper

This is my penultimate day in the office in the School of Mathematical and Physical Sciences at the University of Sussex, and a bit of news has arrived that seems a nice way to round off my stint as Head of School.

It seems that Physics & Astronomy research at the University of Sussex has been ranked 13th in western Europe and 7th in the UK by the leading academic publisher Nature Research, and has been profiled as one of its top-25 “rising stars” worldwide.

I was tempted to describe this rise as ‘meteoric’ but in my experience meteors generally fall down rather than rise up.

Anyway, as regular readers of this blog will know, I’m generally very sceptical of the value of league tables and there’s no reason to treat this one as qualitatively any different. Here is an explanation of the (rather curious) methodology from the University of Sussex news item:

The Nature Index 2016 Rising Stars supplement identifies the countries and institutions showing the most significant growth in high-quality research publications, using the Nature Index, which tracks the research of more than 8,000 global institutions – described as “players to watch”.

The top 100 most improved institutions in the index between 2012 and 2015 are ranked by the increase in their contribution to 68 high-quality journals. From this top 100, the supplement profiles 25 rising stars – one of which is Sussex – that are already making their mark, and have the potential to shine in coming decades.

The institutions and countries examined have increased their contribution to a selection of top natural science journals — a metric known as weighted fractional count (WFC) — from 2012 to 2015.

Mainly thanks to a quadrupling of its physical sciences score, Sussex reached 351 in the Global 500 in 2015. That represents an 83.9% rise in its contribution to index papers since 2012 — the biggest jump of any UK research organisation in the top 100 most improved institutions.

It’s certainly a strange choice of metric, as it only involves publications in “high quality” journals, presumably selected by Journal Impact Factor or some other arbitrary statistical abomination, then taking the difference in this measure between 2012 and 2015 and expressing the change as a percentage. I noticed one institution in the list has improved by over 4600%, which makes Sussex’s change of 83.9% seem rather insignificant…
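The trouble with a percentage-change metric is easy to demonstrate with invented numbers (the figures below are purely illustrative, not actual Nature Index values): a department starting from a tiny baseline can post an enormous percentage rise on the back of a very modest absolute gain.

```python
# Purely illustrative numbers -- NOT actual Nature Index WFC values.
def percentage_change(old, new):
    return 100.0 * (new - old) / old

# A sizeable department nearly doubling its contribution:
print(f"{percentage_change(50.0, 92.0):.1f}%")   # prints 84.0%

# A department starting from almost nothing:
print(f"{percentage_change(0.5, 24.0):.1f}%")    # prints 4700.0%
```

Ranking by percentage change therefore rewards a low starting point at least as much as it rewards genuine growth.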

But at least this table provides some sort of evidence that the investment made in Physics & Astronomy over the last few years has made a significant (and positive) difference. The number of research faculty in Physics & Astronomy has increased by more than 60%  since 2012 so one would have been surprised not to have seen an increase in publication output over the same period. On the other hand, it seems likely that many of the high-impact papers published since 2012 were written by researchers who arrived well before then because Physics research is often a slow burner. The full impact of the most recent investments has probably not yet been felt. I’m therefore confident that Physics at Sussex has a very exciting future in store as its rising stars look set to rise still further! It’s nice to be going out on a high note!

An Open Letter to the Times Higher World University Rankers

Posted in Education, The Universe and Stuff on October 5, 2015 by telescoper

Dear Rankers,

Having perused your latest set of league tables along with the published methodology, a couple of things puzzle me.

First, I note that you have made significant changes to your methodology for combining metrics this year. How, then, can you justify making statements such as

US continues to lose its grip as institutions in Europe up their game

when it appears that any changes could well be explained not by changes in performance, as gauged by the metrics you use,  but in the way they are combined?

I assume, as intelligent and responsible people, that you did the obvious test for this effect, i.e. to construct a parallel set of league tables, with this year’s input data but last year’s methodology, which would make it easy to isolate changes in methodology from changes in the performance indicators. Your failure to publish such a set, to illustrate how seriously your readers should take statements such as that quoted above, must then simply have been an oversight. Had you deliberately withheld evidence of the unreliability of your conclusions you would have left yourselves open to an accusation of gross dishonesty, which I am sure would be unfair.

Happily, however, there is a very easy way to allay the fears of the global university community that the world rankings are being manipulated: all you need to do is publish a set of league tables using the 2014 methodology and the 2015 data. Any difference between this table and the one you published would then simply be an artefact and the new ranking can be ignored. I’m sure you are as anxious as anyone else to prove that the changes this year are not simply artificially-induced “churn”, and I look forward to seeing the results of this straightforward calculation published in the Times Higher as soon as possible.

Second, I notice that one of the changes to your methodology is explained thus

This year we have removed the very small number of papers (649) with more than 1,000 authors from the citations indicator.

You are presumably aware that this primarily affects papers relating to experimental particle physics, which is mostly conducted through large international collaborations (chiefly, but not exclusively, based at CERN). This change at a stroke renders such fundamental scientific breakthroughs as the discovery of the Higgs Boson completely worthless. This is a strange thing to do, because this is exactly the type of research that inspires prospective students to study physics, as well as being a direct measure in itself of the global standing of a University.

My current institution, the University of Sussex, is heavily involved in experiments at CERN. For example, Dr Iacopo Vivarelli has just been appointed coordinator of all supersymmetry searches using the ATLAS experiment on the Large Hadron Collider. This involvement demonstrates the international standing of our excellent Experimental Particle Physics group, but if evidence of supersymmetry is found at the LHC your methodology will simply ignore it. A similar fate will also befall any experiment that requires large international collaborations: searches for dark matter, dark energy, and gravitational waves to name but three, all exciting and inspiring scientific adventures that you regard as unworthy of any recognition at all but which draw students in large numbers into participating departments.

Your decision to downgrade collaborative research to zero is not only strange but also extremely dangerous, for it tells university managers that participating in world-leading collaborative research will jeopardise their rankings. How can you justify such a deliberate and premeditated attack on collaborative science? Surely it is exactly the sort of thing you should be rewarding? Physics departments not participating in such research are the ones that should be downgraded!

Your answer might be that excluding “superpapers” only damages the rankings of smaller universities, because they might owe a larger fraction of their total citation count to collaborative work. Well, so what if this is true? It’s not a reason for excluding them. Perhaps small universities are better anyway, especially when they emphasize small group teaching and provide opportunities for students to engage in learning that’s led by cutting-edge research. Or perhaps you have decided otherwise and have changed your methodology to confirm your prejudice…

I look forward to seeing your answers to the above questions through the comments box or elsewhere – though you have ignored my several attempts to raise these questions via social media. I also look forward to seeing you correct your error of omission by demonstrating – by the means described above – which changes in league table positions are due to your design rather than to any change in performance. If it turns out that the former is the case, as I think it will, at least your own journal provides you with a platform from which you can apologize to the global academic community for wasting their time.

Yours sincerely,

Telescoper

Sussex and the World Premier League of Physics

Posted in Education, The Universe and Stuff on August 16, 2014 by telescoper

In the office again busy finishing off a few things before flying off for another conference (of which more anon).

Anyway, I thought I’d take a short break for a cup of tea and a go on the blog.

Today is the first day of the new Premiership season and, coincidentally, last week saw some good news about the Department of Physics and Astronomy at the University of Sussex in a different kind of league table.

The latest (2014) Academic Rankings of World Universities (often called the “Shanghai Rankings”) are out so, as I suspect many of my colleagues also did, I drilled down to look at the rankings of Physics departments.

Not surprisingly the top six (Berkeley, Princeton, MIT, Harvard, Caltech, & Stanford) are all based in the USA. The top British university is, also not surprisingly, Cambridge in 9th place. That’s the only UK university in the top ten for Physics. The other leading UK physics departments are: Manchester (13th), Imperial (15th), Edinburgh (20th), Durham (28th), Oxford (39th) and UCL (47th). I don’t think there will be any surprise that these all made it into the top 50 departments worldwide.

Just outside the top 50, in joint 51st place in the world, is the Department of Physics & Astronomy at the University of Sussex. For a relatively small department in a relatively small university this is a truly outstanding result. It puts the Department clearly in 8th place in the UK, ahead of Birmingham, Bristol, Leicester, Queen Mary, Nottingham, Southampton, St Andrews, Lancaster, Glasgow, Sheffield and Warwick, all of whom made the top 200 in the world.

Incidentally, two of the other departments tied in 51st place are at Nagoya University in Japan (where I visited in January) and Copenhagen University in Denmark (where I’m going next week).

Although I have deep reservations about the usefulness of league tables, I’m not at all averse to using them as an excuse for a celebration and to help raise the profile of Physics and Astronomy at Sussex generally.  I’d therefore like to take the opportunity to offer hearty congratulations to the wonderful staff of the Department of Physics & Astronomy on their achievement. 

With the recent investments we’ve had and further plans for growth I hope over the next few years we can move even further up the rankings. Unless of course the methodology changes or we’re subject to a “random” (i.e. downward) fluctuation…