Value Added?

Busy busy busy. Only a few minutes for a lunchtime post today. I’ve a feeling I’m going to be writing that rather a lot over the next few weeks. Anyway, I thought I’d use the opportunity to enlist the help of the blogosphere to try to solve a problem for me.

Yesterday I drew attention to the Guardian University league tables for Physics (purely for the purposes of pointing out that excellent departments exist outside the Russell Group). One thing I’ve never understood about these league tables is the column marked “value added”. Here is the (brief) explanation offered:

The value-added score compares students’ individual degree results with their entry qualifications, to show how effective the teaching is. It is given as a rating out of 10.

If you look at the scores you will find the top department, Oxford, has a score of 6 for “value added”; in deference to my alma mater, I’ll note that Cambridge doesn’t appear in these tables. Sussex scores 9 on value-added, while Cardiff only scores 2. What seems peculiar is that the “typical UCAS scores” for students in these departments are 621, 409 and 420 respectively. To convert these into A-level scores, see here. These should represent the typical entry qualifications of students at the respective institutions.

The point is that Oxford only takes students with very high A-level grades, yet still manages to score a creditable 6/10 on “value added”.  Sussex and Cardiff have very similar scores for entry tariff, significantly lower than Oxford, but differ enormously in “value added” (9 versus 2).

The only interpretation of the latter two points that makes sense to me would be if Sussex turned out many more first-class degrees given its entry qualifications than Cardiff (since their tariff levels are similar, 409 versus 420). But this doesn’t seem to be the case;  the fraction of first-class degrees awarded by Cardiff Physics & Astronomy is broadly in line with the rest of the sector and certainly doesn’t differ by a factor of several compared to Sussex!

These aren’t the only anomalous cases. Elsewhere in the table you can find Exeter and Leeds, which have identical UCAS tariffs (435) but value added scores that differ by a wide margin (9 versus 4, respectively).

And if Oxford only accepts students with the highest A-level scores, how can it score higher on “value added” than a department like Cardiff which takes in many students with lower A-levels and turns at least some of them into first-class graduates? Shouldn’t the Oxford “value added” score be very low indeed, if any Oxford students at all fail to get first class degrees?

I think there’s a rabbit off. Can anyone explain the paradox to me?

Answers on a postcard please. Or, better, through the comments box.

10 Responses to “Value Added?”

  1. Just as an unsubstantiated guess, perhaps the scores include some sort of a judgement about the relative quality of the first-class degrees awarded? If a first from Cardiff isn’t worth as much as a first from Oxford or Sussex (I’m not necessarily saying it isn’t), then that would explain the difference, even when the percentage of firsts obtained are similar.

  2. I presume, from the description, that it is not a relationship between the typical entry grades and typical exit grades, but is – in a sense – based on each student’s relative entry and exit grades. If there is a direct correlation between your entry and exit grades (all students who get first-class degrees entered with A grades at A-level), your score will be low. If, however, some who enter with poor grades get a better degree classification, your score will be higher. This doesn’t seem, however, to match your observation that many students you’ve encountered perform better than one would expect. Alternatively someone in your university administration sent the Guardian the wrong numbers 🙂

  3. Andrew Liddle Says:

    Since the numbers go all the way from 1 to 9, I expect they’ve taken a statistic with a tiny dynamic range which is largely noise dominated, and mapped it to the range 1 to 9. If so, we can expect them all to shuffle around randomly next year.
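Andrew’s conjecture is easy to illustrate with a toy Python sketch (the numbers here are entirely invented): take a statistic whose genuine spread is tiny compared to its measurement noise, rescale it onto 1–9 by rank, and the full range gets filled regardless.

```python
import random

# Invented example: 40 departments whose "true" quality differs by ~1%,
# observed through noise ten times bigger than the real spread.
random.seed(1)
true_values = [1.00 + 0.01 * i for i in range(40)]           # tiny real spread
observed = [v + random.gauss(0, 0.1) for v in true_values]   # noise swamps it

def rescale_to_1_9(xs):
    """Map each value onto the integers 1..9 by its rank within the sample."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    score = [0] * len(xs)
    for rank, i in enumerate(order):
        score[i] = 1 + round(8 * rank / (len(xs) - 1))
    return score

scores = rescale_to_1_9(observed)
print(min(scores), max(scores))  # → 1 9: the full range, despite ~1% real differences
```

Rerun with a different seed and the departments shuffle around the 1–9 scale, which is exactly the behaviour predicted above for next year’s tables.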

  4. Yes, most odd. In their explanation of the Value Added score, the company apparently responsible explain: “Based upon a sophisticated indexing methodology that tracks students from enrolment to graduation, qualifications upon entry are compared with the award that a student receives at the end of their studies. Each full time student is given a probability of achieving a 1st or 2:1, based on the qualifications that they enter with. If they manage to earn a good degree then they score points which reflect how difficult it was to do so (in fact, they score the reciprocal of the probability of getting a 1st or 2:1). Thus an institution that is adept at taking in students with low entry qualifications, which are generally more difficult to convert into a 1st or 2:1, will score highly in the value-added measure if the number of students getting a 1st or 2:1 exceeds expectations.”

    Three points stand out to me from all this: (1) they call the method ‘sophisticated’, generally a bad sign, (2) they try to be specific about the result (‘the reciprocal of the probability of getting a 1st or 2:1’), but don’t say whether the entry qualification is taken from the UCAS score, and (3) your conclusion seems borne out in the last sentence about high-scoring institutions being those which are “adept at taking in students with low entry qualifications”: the figures don’t add up.

    • telescoper Says:

      Perhaps they called it “sophisticated” so they could charge a higher fee for their services?

    • It seems, if I read it properly, a student scores points if they achieve a 1st or 2:1 and the points scored depends on the probability of the student achieving that in the first place. What it doesn’t say is, what happens if students do not achieve a 1st or 2:1 but were expected to do so. It seems like there is no penalty if a reasonable number of those expected to get a 1st or 2:1 do not do so. I suspect, however, that Andrew’s analysis is probably the correct one – a noise dominated statistic with a large dynamic range.
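If that reading is right, the scheme quoted above can be sketched in a few lines of Python. The probabilities and cohorts below are invented purely for illustration; note that, as observed above, students who miss a good degree contribute nothing rather than a penalty.

```python
def value_added_points(prob_good_degree, got_good_degree):
    """A student who achieves a 1st or 2:1 scores 1/p points, where p is their
    predicted probability of doing so; there is no penalty for falling short."""
    return 1.0 / prob_good_degree if got_good_degree else 0.0

def mean_score(cohort):
    """Average points per student over a cohort of (probability, outcome) pairs."""
    return sum(value_added_points(p, ok) for p, ok in cohort) / len(cohort)

# Two toy cohorts of ten students each:
# high-tariff entrants predicted at p = 0.9, low-tariff at p = 0.4.
high_tariff = [(0.9, True)] * 9 + [(0.9, False)]       # 9 of 10 get a good degree
low_tariff  = [(0.4, True)] * 5 + [(0.4, False)] * 5   # 5 of 10 do

print(mean_score(high_tariff))  # 9 × (1/0.9) ÷ 10 ≈ 1.0: exactly meets expectations
print(mean_score(low_tariff))   # 5 × (1/0.4) ÷ 10 = 1.25: beats expectations
```

A cohort that performs exactly as predicted averages 1.0 under this scheme, so a department like Oxford, whose students nearly all carry high predicted probabilities, can never stray far from that baseline in either direction; only a low-tariff intake that beats its predictions can score well above it. That would be consistent with the puzzling pattern in the table, though it doesn’t explain how the raw numbers are mapped onto the published 1-to-10 scale.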

  5. Well, I looked at the table yesterday and Keele Physics did *not* have a value added score. Today it does and it is only 3/10. Frankly I think this is rubbish. We have one of the lowest entry tariffs, almost none of our entrants have A-grades at A-level, yet we turn out 50 percent of students with firsts and 2:1s. I think this stinks.

  6. I would also like to know the answer.

    Is our assumption about the definition of a value-added score correct, however, or might factors other than entry qualifications and degree class come into the equation?
