Archive for League Tables

IQ in different academic fields – Interesting? Quite!

Posted in Bad Statistics on May 26, 2013 by telescoper

You all know how much I detest league tables, especially those that are based on entirely arbitrary criteria but nevertheless promote a feeling of smug self-satisfaction for those who are lucky enough to find themselves at the top. So when my attention was drawn to a blog post that shows (or purports to show) the variation of average IQ across different academic disciplines, I decided to post the corresponding ranking with the usual health warning that IQ tests only measure a subject’s ability to do IQ tests. This isn’t even based on IQ test results per se, but on a conversion between Graduate Record Examination (GRE) results and IQ which may well be questionable. Moreover, the differences are really rather small and (as usual) no estimate of sampling uncertainty is provided.

Does this list mean that physicists are smarter than anyone else? You might say that. I couldn’t possibly comment…

  • 130.0 Physics
  • 129.0 Mathematics
  • 128.5 Computer Science
  • 128.0 Economics
  • 127.5 Chemical engineering
  • 127.0 Materials science
  • 126.0 Electrical engineering
  • 125.5 Mechanical engineering
  • 125.0 Philosophy
  • 124.0 Chemistry
  • 123.0 Earth sciences
  • 122.0 Industrial engineering
  • 122.0 Civil engineering
  • 121.5 Biology
  • 120.1 English/literature
  • 120.0 Religion/theology
  • 119.8 Political science
  • 119.7 History
  • 118.0 Art history
  • 117.7 Anthropology/archeology
  • 116.5 Architecture
  • 116.0 Business
  • 115.0 Sociology
  • 114.0 Psychology
  • 114.0 Medicine
  • 112.0 Communication
  • 109.0 Education
  • 106.0 Public administration

Never mind the table, look at the sample size!

Posted in Bad Statistics on April 29, 2013 by telescoper

This morning I was just thinking that it’s been a while since I’ve filed anything in the category marked bad statistics when I glanced at today’s copy of the Times Higher and found something that’s given me an excuse to rectify my lapse. Last week saw the publication of said organ’s new Student Experience Survey which ranks  British Universities in order of the responses given by students to questions about various aspects of the teaching, social life and so  on. I had a go at this table a few years ago, but they still keep trotting it out. Here are the main results, sorted in decreasing order:

Rank University Score Respondents
1 University of East Anglia 84.8 119
2 University of Oxford 84.2 259
3 University of Sheffield 83.9 192
3 University of Cambridge 83.9 245
5 Loughborough University 82.8 102
6 University of Bath 82.7 159
7 University of Leeds 82.5 219
8 University of Dundee 82.4 103
9 York St John University 81.2 88
10 Lancaster University 81.1 100
11 University of Southampton 80.9 191
11 University of Birmingham 80.9 198
11 University of Nottingham 80.9 270
14 Cardiff University 80.8 113
14 Newcastle University 80.8 125
16 Durham University 80.3 188
17 University of Warwick 80.2 205
18 University of St Andrews 79.8 109
18 University of Glasgow 79.8 131
20 Queen’s University Belfast 79.2 101
21 University of Hull 79.1 106
22 University of Winchester 79 106
23 Northumbria University 78.9 100
23 University of Lincoln 78.9 103
23 University of Strathclyde 78.9 107
26 University of Surrey 78.8 102
26 University of Leicester 78.8 105
26 University of Exeter 78.8 130
29 University of Chester 78.7 102
30 Heriot-Watt University 78.6 101
31 Keele University 78.5 102
32 University of Kent 78.4 110
33 University of Reading 78.1 101
33 Bangor University 78.1 101
35 University of Huddersfield 78 104
36 University of Central Lancashire 77.9 121
37 Queen Mary, University of London 77.8 103
37 University of York 77.8 106
39 University of Edinburgh 77.7 170
40 University of Manchester 77.4 252
41 Imperial College London 77.3 148
42 Swansea University 77.1 103
43 Sheffield Hallam University 77 102
43 Teesside University 77 103
45 Brunel University 76.6 110
46 University of Portsmouth 76.4 107
47 University of Gloucestershire 76.3 53
47 Robert Gordon University 76.3 103
47 Aberystwyth University 76.3 104
50 University of Essex 76 103
50 University of Glamorgan 76 108
50 Plymouth University 76 112
53 University of Sunderland 75.9 100
54 Canterbury Christ Church University 75.8 102
55 De Montfort University 75.7 103
56 University of Bradford 75.5 52
56 University of Sussex 75.5 102
58 Nottingham Trent University 75.4 103
59 University of Roehampton 75.1 102
60 University of Ulster 75 101
60 Staffordshire University 75 102
62 Royal Veterinary College 74.8 50
62 Liverpool John Moores University 74.8 102
64 University of Bristol 74.7 137
65 University of Worcester 74.4 101
66 University of Derby 74.2 101
67 University College London 74.1 102
68 University of Aberdeen 73.9 105
69 University of the West of England 73.8 101
69 Coventry University 73.8 102
71 University of Hertfordshire 73.7 105
72 London School of Economics 73.5 51
73 Royal Holloway, University of London 73.4 104
74 University of Stirling 73.3 54
75 King’s College London 73.2 105
76 Bournemouth University 73.1 103
77 Southampton Solent University 72.7 102
78 Goldsmiths, University of London 72.5 52
78 Leeds Metropolitan University 72.5 106
80 Manchester Metropolitan University 72.2 104
81 University of Liverpool 72 104
82 Birmingham City University 71.8 101
83 Anglia Ruskin University 71.7 102
84 Glasgow Caledonian University 71.1 100
84 Kingston University 71.1 102
86 Aston University 71 52
86 University of Brighton 71 106
88 University of Wolverhampton 70.9 103
89 Oxford Brookes University 70.5 106
90 University of Salford 70.2 102
91 University of Cumbria 69.2 51
92 Napier University 68.8 101
93 University of Greenwich 68.5 102
94 University of Westminster 68.1 101
95 University of Bedfordshire 67.9 100
96 University of the Arts London 66 54
97 City University London 65.4 102
97 London Metropolitan University 65.4 103
97 The University of the West of Scotland 65.4 103
100 Middlesex University 65.1 104
101 University of East London 61.7 51
102 London South Bank University 61.2 50
Average score 75.5; total respondents 11459

YouthSight is the source of the data that have been used to compile the table of results for the Times Higher Education Student Experience Survey, and it retains the ownership of those data. Each higher education institution’s score has been indexed to give a percentage of the maximum score attainable. For each of the 21 attributes, students were given a seven-point scale and asked how strongly they agreed or disagreed with a number of statements based on their university experience.

My current employer, the University of Sussex, comes out right on the average (75.5) and is consequently in the middle of this league table. However, let’s look at this in a bit more detail. The number of students whose responses produced the score of 75.5 was just 102. That’s by no means the smallest sample in the survey, either. The University of Sussex has over 13,000 students. The score in this table is therefore obtained from less than 1% of the relevant student population. How representative can the results be, given that the sample is so incredibly small?

What is conspicuous by its absence from this table is any measure of the “margin-of-error” of the estimated score. What I mean by this is how much the sample score would change for Sussex if a different set of 102 students were involved. Unless every Sussex student scores exactly 75.5 then the score will vary from sample to sample. The smaller the sample, the larger the resulting uncertainty.

Given a survey of this type it should be quite straightforward to calculate the spread of scores from student to student within a sample from a given University in terms of the standard deviation, σ, as well as the mean score. Unfortunately, this survey does not include this information. However, let’s suppose for the sake of argument that the standard deviation for Sussex is quite small, say 10% of the mean value, i.e. 7.55. I imagine that it’s much larger than that, in fact, but this is just meant to be by way of an illustration.

If you have a sample size of  N then the standard error of the mean is going to be roughly (σ⁄√N) which, for Sussex, is about 0.75. Assuming everything has a normal distribution, this would mean that the “true” score for the full population of Sussex students has a 95% chance of being within two standard errors of the mean, i.e. between 74 and 77. This means Sussex could really be as high as 43rd place or as low as 67th, and that’s making very conservative assumptions about how much one student differs from another within each institution.
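For what it’s worth, here’s that back-of-envelope calculation written out as a few lines of Python. Remember that σ = 7.55 is my own assumption rather than a number published with the survey, so this is purely illustrative:

```python
# A minimal sketch of the calculation above, assuming (as in the text) a
# student-to-student standard deviation of 7.55 for Sussex and a roughly
# normal sampling distribution. Illustrative numbers only.
import math

mean_score = 75.5   # Sussex's published score
sigma = 7.55        # assumed standard deviation (10% of the mean)
n = 102             # number of Sussex respondents

standard_error = sigma / math.sqrt(n)      # roughly 0.75
ci_low = mean_score - 2 * standard_error   # roughly 74.0
ci_high = mean_score + 2 * standard_error  # roughly 77.0

print(f"standard error of the mean: {standard_error:.2f}")
print(f"approximate 95% interval:   {ci_low:.1f} to {ci_high:.1f}")
```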

That example is just for illustration, and the figures may well be wrong, but my main gripe is that I don’t understand how these guys can get away with publishing results like this without listing the margin of error at all. Perhaps it’s because that would make it obvious how unreliable the rankings are? Whatever the reason, we’d never get away with publishing results without errors in a serious scientific journal.

This sampling uncertainty almost certainly accounts for the big changes from year to year in these tables. For instance, the University of Lincoln is 23rd in this year’s table, but last year was way down in 66th place. Has something dramatic happened there to account for this meteoric rise? I doubt it. It’s more likely to be just a sampling fluctuation.
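To illustrate the point, here’s a little simulation (again with made-up numbers, not the survey’s raw data): give a hundred institutions exactly the same true score, survey about a hundred students at each in two successive “years”, and see how far a single institution can move in the rankings through sampling noise alone.

```python
# A toy model of rank volatility under sampling noise. Every institution has
# the *same* true mean score, so any movement in the table between the two
# simulated years is pure sampling fluctuation. All parameters are assumed.
import numpy as np

rng = np.random.default_rng(42)
n_institutions = 100
true_mean, sigma, sample_size = 75.0, 7.5, 100   # assumed values

def simulated_ranks():
    # each institution's sample mean has standard error sigma/sqrt(sample_size)
    sample_means = rng.normal(true_mean, sigma / np.sqrt(sample_size),
                              size=n_institutions)
    # rank 1 = highest sample mean
    return sample_means.argsort()[::-1].argsort() + 1

ranks_year1 = simulated_ranks()
ranks_year2 = simulated_ranks()
jumps = np.abs(ranks_year1 - ranks_year2)

print(f"median change in rank between years: {np.median(jumps):.0f} places")
print(f"largest change in rank:              {jumps.max()} places")
```

Since every institution is identical by construction, jumps of dozens of places are entirely typical in a set-up like this, which is very much what the Lincoln example looks like to me.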

In fact I seriously doubt whether any of the scores in this table is significantly different from the mean score; the range from top to bottom is only 61 to 85, showing considerable uniformity across all 102 institutions listed. What a statistically literate person should take from this table is that (a) it’s a complete waste of time and (b) wherever you go to University you’ll probably have a good experience!

The League of Small Samples

Posted in Bad Statistics on January 14, 2010 by telescoper

This morning I was just thinking that it’s been a while since I’ve filed anything in the category marked bad statistics when I glanced at today’s copy of the Times Higher and found something that’s given me an excuse to rectify my lapse. Today saw the publication of said organ’s new Student Experience Survey which ranks  British Universities in order of the responses given by students to questions about various aspects of the teaching, social life and so  on. Here are the main results, sorted in decreasing order:

Rank University Score Respondents
1 Loughborough University 84.9 128
2 University of Cambridge, The 82.6 259
3 University of Oxford, The 82.6 197
4 University of Sheffield, The 82.3 196
5 University of East Anglia, The 82.1 122
6 University of Wales, Aberystwyth 82.1 97
7 University of Leeds, The 81.9 185
8 University of Dundee, The 80.8 75
9 University of Southampton, The 80.6 164
10 University of Glasgow, The 80.6 136
11 University of Exeter, The 80.3 160
12 University of Durham 80.3 189
13 University of Leicester, The 79.9 151
14 University of St Andrews, The 79.9 104
15 University of Essex, The 79.5 65
16 University of Warwick, The 79.5 190
17 Cardiff University 79.4 180
18 University of Central Lancashire, The 79.3 88
19 University of Nottingham, The 79.2 233
20 University of Newcastle-upon-Tyne, The 78.9 145
21 University of Bath, The 78.7 142
22 University of Wales, Bangor 78.7 43
23 University of Edinburgh, The 78.1 190
24 University of Birmingham, The 78.0 179
25 University of Surrey, The 77.8 100
26 University of Sussex, The 77.6 49
27 University of Lancaster, The 77.6 123
28 University of Stirling, The 77.6 44
29 University of Wales, Swansea 77.5 61
30 University of Kent at Canterbury, The 77.3 116
30 University of Teesside, The 77.3 127
32 University of Hull, The 77.2 87
33 Robert Gordon University, The 77.2 57
34 University of Lincoln, The 77.0 121
35 Nottingham Trent University, The 76.9 192
36 University College Falmouth 76.8 40
37 University of Gloucestershire 76.8 74
38 University of Liverpool, The 76.7 89
39 University of Keele, The 76.5 57
40 University of Northumbria at Newcastle, The 76.4 149
41 University of Plymouth, The 76.3 190
41 University of Reading, The 76.3 117
43 Queen’s University of Belfast, The 76.0 149
44 University of Aberdeen, The 75.9 84
45 University of Strathclyde, The 75.7 72
46 Staffordshire University 75.6 85
47 University of York, The 75.6 121
48 St George’s Medical School 75.4 33
49 Southampton Solent University 75.2 34
50 University of Portsmouth, The 75.2 141
51 Queen Mary, University of London 75.2 104
52 University of Manchester 75.1 221
53 Aston University 75.0 66
54 University of Derby 75.0 33
55 University College London 74.8 114
56 Sheffield Hallam University 74.8 159
57 Glasgow Caledonian University 74.6 72
58 King’s College London 74.6 101
59 Brunel University 74.4 64
60 Heriot-Watt University 74.1 35
61 Imperial College of Science, Technology & Medicine 73.9 111
62 De Montfort University 73.6 83
63 Bath Spa University 73.4 64
64 Bournemouth University 73.3 128
65 University of the West of England, Bristol 73.3 207
66 Leeds Metropolitan University 73.1 143
67 University of Chester 72.5 61
68 University of Bristol, The 72.3 145
69 Royal Holloway, University of London 72.1 59
70 Canterbury Christ Church University 71.8 78
71 University of Huddersfield, The 71.8 97
72 York St John University College 71.8 31
72 University of Wales Institute, Cardiff 71.8 41
74 University of Glamorgan 71.6 84
75 University of Salford, The 71.2 58
76 Roehampton University 71.1 47
77 Manchester Metropolitan University, The 71.1 131
78 University of Northampton 70.8 42
79 University of Sunderland, The 70.8 61
80 Kingston University 70.7 121
81 University of Bradford, The 70.6 33
82 Oxford Brookes University 70.5 99
83 University of Ulster 70.3 61
84 Coventry University 69.9 82
85 University of Brighton, The 69.4 106
86 University of Hertfordshire 68.9 138
87 University of Bedfordshire 68.6 44
88 Queen Margaret University, Edinburgh 68.5 35
89 London School of Economics and Political Science 68.4 73
90 Royal Veterinary College, The 68.2 43
91 Anglia Ruskin University 68.1 71
92 Birmingham City University 67.7 109
93 University of Wolverhampton, The 67.5 72
94 Liverpool John Moores University 67.2 103
95 Goldsmiths College 66.9 42
96 Napier University 65.5 63
97 London South Bank University 64.9 44
98 City University 64.6 44
99 University of Greenwich, The 63.9 67
100 University of the Arts London 62.8 40
101 Middlesex University 61.4 51
102 University of Westminster, The 60.4 76
103 London Metropolitan University 55.2 37
104 University of East London, The 54.2 41
Total respondents 10465

The maximum overall score is 100 and the figure in the rightmost column is the number of students from that particular University that contributed to the survey. The total number of students involved is shown at the bottom, i.e. 10465.

My current employer, Cardiff University, comes out pretty well (17th) in this league table, but some do surprisingly poorly, such as Imperial, which is 61st. No doubt University spin doctors around the country will be working themselves into a frenzy trying to decide how best to present their showing in the list, but before they get too carried away I want to dampen their enthusiasm.

Let’s take Cardiff as an example. The number of students whose responses produced the score of 79.4 was just 180. That’s by no means the smallest sample in the survey, either. Cardiff University has approximately 20,000 undergraduates. The score in this table is therefore obtained from less than 1% of the relevant student population. How representative can the results be, given that the sample is so incredibly small?

What is conspicuous by its absence from this table is any measure of the “margin-of-error” of the estimated score. What I mean by this is how much the sample score would change for Cardiff if a different set of 180 students were involved. Unless every Cardiff student gives Cardiff exactly 79.4 then the score will vary from sample to sample. The smaller the sample, the larger the resulting uncertainty.

Given a survey of this type it should be quite straightforward to calculate the spread of scores from student to student within a sample from a given University in terms of the standard deviation, σ, as well as the mean score. Unfortunately, this survey does not include this information. However, let’s suppose for the sake of argument that the standard deviation for Cardiff is quite small, say 10% of the mean value, i.e. 7.94. I imagine that it’s much larger than that, in fact, but this is just meant to be by way of an illustration.

If you have a sample size of  N then the standard error of the mean is going to be roughly (σ⁄√N) which, for Cardiff, is about 0.6. Assuming everything has a normal distribution, this would mean that the “true” score for the full population of Cardiff students has a 95% chance of being within two standard errors of the mean, i.e. between 78.2 and 80.6. This means Cardiff could really be as high as 9th place or as low as 23rd, and that’s making very conservative assumptions about how much one student differs from another within each institution.
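Here’s that sum again in a couple of lines of Python, using my assumed 10% standard deviation; it’s illustrative only, since the real σ isn’t published:

```python
# The same back-of-envelope calculation for Cardiff, with the assumed
# standard deviation of 7.94 from the text. Illustrative numbers only.
import math

mean_score, sigma, n = 79.4, 7.94, 180
se = sigma / math.sqrt(n)   # roughly 0.6
print(f"95% interval: {mean_score - 2*se:.1f} to {mean_score + 2*se:.1f}")
# roughly 78.2 to 80.6, i.e. anywhere from about 9th to 23rd in the table above
```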

That example is just for illustration, and the figures may well be wrong, but my main gripe is that I don’t understand how these guys can get away with publishing results like this without listing the margin of error at all. Perhaps it’s because that would make it obvious how unreliable the rankings are? Whatever the reason, we’d never get away with publishing results without errors in a serious scientific journal.

Still, at least there’s been one improvement since last year: the 2009 results gave every score to two decimal places! My A-level physics teacher would have torn strips off me if I’d done that!

Precision, you see, is not the same as accuracy….

The League of Extraordinary Gibberish

Posted in Bad Statistics on October 13, 2009 by telescoper

After a very busy few days I thought I’d relax yesterday by catching up with a bit of reading. In last week’s Times Higher I found there was a supplement giving this year’s World University Rankings.

I don’t really approve of league tables but somehow can’t resist looking in them to see where my current employer Cardiff University lies. There we are at number 135 in the list of the top 200 Universities. That’s actually not bad for an institute that’s struggling with a Welsh funding  system that seriously disadvantages it compared to our English colleagues. We’re a long way down compared to Cambridge (2nd), UCL (4th), Imperial and Oxford (5th=) . Compared to places I’ve worked at previously we’re significantly below Nottingham (91st) but still above Queen Mary (164) and Sussex (166). Number 1 in the world is Harvard, which is apparently somewhere near Boston (the American one).

Relieved that we’re in the top 200 at all, I decided to have a look at how the tables were drawn up. I wish I hadn’t bothered, because I was horrified at the methodological garbage that lies behind it. You can find a full account of the travesty here. In essence, however, the ranking is arrived at by adding together six distinct indicators, weighted differently but with the weights assigned for no obvious reason, each of which is derived by dubious means and is highly unlikely to measure what it purports to. Each indicator is magically turned into a score out of 100 before being added to all the other ones (with the appropriate weighting factors).

The indicators are:

  1. Academic Peer Review. This is weighted 40% of the overall score for each institution and is obtained by asking a sample of academics (selected in a way that is not explained). This year 9386 people were involved; they were asked to name institutions they regard as the best in their field. This sample is a tiny fraction of the global academic population and it would amaze me if it were representative of anything at all!
  2. Employer Survey. The pollsters asked 3281 graduate employers for their opinions of the different universities. This was weighted 10%.
  3. Staff-Student Ratio. Counting 20%, this is supposed to be a measure of “teaching quality”! Good teaching = large numbers of staff? Not if most of them don’t teach as at many research universities. A large staff-student ratio could even mean the place is really unpopular!
  4. International Faculty. This measures the proportion of overseas staff on the books. Apparently a large number of foreign lecturers makes for a good university and shows “how attractive an institution is around the world”. Or perhaps it shows that the institution finds it difficult to recruit its own nationals. This one counts only 5%.
  5. International Students. Another 5% goes to the fraction of each of the student body that is from overseas.
  6. Research Excellence. This is measured solely on the basis of citations – I’ve discussed some of the issues with that before – and counts 20%. They choose to use an unreliable database called SCOPUS, run by the profiteering academic publisher Elsevier. The total number of citations is divided by the number of faculty to “give a sense of the density of research excellence” at the institution.

Well, I hope by now you’ve got a sense of the density of the idiots who compiled this farrago. Even if you set aside the issue of the accuracy of the input data, there is still the issue of how on Earth anyone could have thought it was sensible to pick such silly ways of measuring what makes a good university, assign random weights to them, and then claim that they had achieved something useful. They probably got paid a lot for doing it too. Talk about money for old rope. I’m in the wrong business.

What gives the game away entirely is the enormous variance from one indicator to another. This means that changing the weights even slightly would produce a drastically different list. And who is to say that the variables should be added linearly anyway? Is a score of 100 really worth precisely twice as much as a score of 50? What do the distributions look like? How significant are the differences in score from one institute to another? And what are we actually trying to measure anyway?

Here’s an example. The University of California at Berkeley scores 100/100 for indicators 1, 2 and 4, and 86 for 5. However, for Staff-Student Ratio (3) it gets a lowly 25/100 and for Research Excellence (6) it gets only 34, which combined take it down to 39th in the table. Exclude this curiously-chosen proxy for teaching quality and Berkeley would rocket up the table.
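To see how much the weighting matters, here’s a quick sketch using Berkeley’s quoted indicator scores and the published weights, alongside an alternative weighting of my own invention that simply drops the staff-student proxy and redistributes its weight:

```python
# A sketch of how the composite is put together, using the Berkeley indicator
# scores quoted above and the published weights (40/10/20/5/5/20). The
# "alternative" weights are an arbitrary tweak of my own, just to show how
# sensitive the total is to choices nobody ever justifies.
scores = {
    "peer_review": 100, "employer": 100, "staff_student": 25,
    "intl_faculty": 100, "intl_students": 86, "citations": 34,
}
published_weights = {
    "peer_review": 0.40, "employer": 0.10, "staff_student": 0.20,
    "intl_faculty": 0.05, "intl_students": 0.05, "citations": 0.20,
}
# drop the staff-student proxy for teaching quality, spread its weight around
alternative_weights = {
    "peer_review": 0.50, "employer": 0.10, "staff_student": 0.00,
    "intl_faculty": 0.05, "intl_students": 0.05, "citations": 0.30,
}

def composite(weights):
    return sum(weights[k] * scores[k] for k in scores)

print(f"published weights:   {composite(published_weights):.1f}")   # 71.1
print(f"alternative weights: {composite(alternative_weights):.1f}") # 79.5
```

Nothing magic is going on: shift a fifth of the weight around and Berkeley’s composite jumps by more than eight points, easily enough to leapfrog a few dozen institutions.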

Of course you can laugh these things off as unimportant trivia to be looked at with mild amusement over a glass of wine, but such things have increasingly found their way into the minds of managers and politicians. The fact that they are based on flawed assumptions, use a daft methodology, and produce utterly meaningless results seems to be irrelevant. Because they are based on numbers they must represent some kind of absolute truth.

There’s nothing at all wrong with collating and publishing information about schools and universities. Such facts should be available to the public. What is wrong is the manic obsession with  condensing disparate sets of conflicting data into a single number just so things can be ordered in lists that politicians can understand.

You can see the same thing going on in the national newspapers’ lists of University rankings. Each one uses a different weighting and different data and the lists are drastically different. They give different answers because nobody has even bothered to think about what the question is.
