Turning the Tables

In his Presidential Address to the Royal Astronomical Society (published in the June 2010 issue of Astronomy and Geophysics), Andy Fabian discusses the impact of UK astronomy on both academic research and wider society. It’s a very interesting article that makes a number of good points, not the least of which is how difficult it is to measure “impact” for a fundamental science such as astronomy. I encourage you all to read the piece.

One of the fascinating things contained in that article is the following table, which shows the number of papers published in Space Sciences (including astronomy) in the period 1999-2009 (second column), together with their citation counts (third column) and citations per paper (fourth column):

Country        Papers   Citations   Citations per paper
USA             53561      961779     17.96
UK (not NI)     18288      330311     18.06
Germany         16905      279586     16.54
England         15376      270290     17.58
France          13519      187830     13.89
Italy           11485      172642     15.03
Japan            8423      107886     12.81
Canada           5469      102326     18.71
Netherlands      5604      100220     17.88
Spain            6709       88979     13.26
Australia        4786       83264     17.40
Chile            3188       57732     18.11
Scotland         2219       48429     21.82
Switzerland      2821       46973     16.65
Poland           2563       32362     12.63
Sweden           2065       30374     14.71
Israel           1510       29335     19.43
Denmark          1448       26156     18.06
Hungary           761       16925     22.24
Portugal          780       13258     17.00
Wales             693       11592     16.73

I’m not sure why Northern Ireland isn’t included, but I suspect it’s because the original compilation (from the dreaded Thomson ISI database) lists England, Scotland, Wales and Northern Ireland separately and the latter didn’t make it into the top twenty; the entry for the United Kingdom is presumably constructed from the numbers for the other three. Of course many highly-cited papers involve international collaborations, so some papers will be common to more than one country.

Based on citation counts alone you can see that the UK is comfortably in second place, with a citations-per-paper figure very similar to that of the USA. However, the number that really caught my eye is Scotland’s citations per paper which, at 21.82, is significantly higher than most. In fact, if you sort by this figure rather than by the overall citation count, the table looks very different:

Country        Papers   Citations   Citations per paper
Hungary           761       16925     22.24
Scotland         2219       48429     21.82
Israel           1510       29335     19.43
Canada           5469      102326     18.71
Chile            3188       57732     18.11
UK (not NI)     18288      330311     18.06
Denmark          1448       26156     18.06
USA             53561      961779     17.96
Netherlands      5604      100220     17.88
England         15376      270290     17.58
Australia        4786       83264     17.40
Portugal          780       13258     17.00
Wales             693       11592     16.73
Switzerland      2821       46973     16.65
Germany         16905      279586     16.54
Italy           11485      172642     15.03
Sweden           2065       30374     14.71
France          13519      187830     13.89
Spain            6709       88979     13.26
Japan            8423      107886     12.81
Poland           2563       32362     12.63

Wales climbs to a creditable 13th place while the UK as a whole falls to 6th. Scotland is second only to Hungary. Hang on. Hungary? Why does Hungary have an average of 22.24 citations per paper? I’d love to know. The overall number of papers is quite low, so there must be some citation monsters among them. Any ideas?
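For anyone who wants to play with the numbers themselves, here is a minimal Python sketch of the calculation: it recomputes the citations-per-paper column from the paper and citation counts quoted above (only a handful of rows are typed in, purely for illustration) and re-sorts the table on that figure.

# Re-sort the 1999-2009 Space Sciences table by citations per paper.
# The figures are those quoted above; only a few rows are included here.
data = [
    ("USA", 53561, 961779),
    ("UK (not NI)", 18288, 330311),
    ("Scotland", 2219, 48429),
    ("Hungary", 761, 16925),
    ("Wales", 693, 11592),
]

# Compute citations per paper, rounded to two decimal places as in the table.
rows = [(country, papers, cites, round(cites / papers, 2))
        for country, papers, cites in data]

# Sort on the citations-per-paper column, highest first, and print.
for country, papers, cites, per_paper in sorted(rows, key=lambda r: r[3], reverse=True):
    print(f"{country:<12} {papers:>6} {cites:>7} {per_paper:6.2f}")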

Notice how some of the big spenders in this area – Japan, Germany, France and Italy – slide down the table when this metric is used. I think this just shows the limitations of trying to use a single figure of merit. It would be interesting to know – although extremely difficult to find out – how these counts relate to the number of people working in space sciences in each country. The UK, for example, is involved in about a third as many publications as the USA but the number of astronomers in the UK must be much less than a third of the corresponding figure for America. It would be interesting to see a proper comparison of all these countries’ investment in this area, both in terms of people and in money…

…which brings me to Andy Lawrence’s recent blog post, which reports that the Italian Government is seriously considering closing down INAF (Italy’s National Institute for Astrophysics). What this means for astronomy and astrophysics funding in Italy I don’t know. INAF has only existed since 2002 anyway, so it could just mean that an expensive bureaucracy will be dismantled and things will go back to the way they were before then. On the other hand, it could be far worse than that, and since Berlusconi is involved it probably will be.

Those in control of the astronomy budget in this country have also made it clear that they think there are too many astronomers in the UK, although the basis for this decision escapes me. Recent deep cuts in grant funding have already convinced some British astronomers to go abroad. With more cuts probably on the way, this exodus is bound to accelerate. I suspect those who leave won’t be going to Italy, but I agree with Andy Fabian that it’s very difficult to see how the UK will be able to hold its excellent position in the world rankings for much longer.

9 Responses to “Turning the Tables”

  1. An economist once told me that there exists some law in their discipline which states that once one uses an observable as a target, the observable stops being useful as an observable. This is largely a statement of common sense, but it’s worth our while to remember this when we look at citation figures.

    I have a few citation monsters on my CV, some I’ve actively contributed to, others I barely even read. However, the paper I’m most proud of, and which most represented innovative science (in my opinion), has collected about six citations.

    Those at the top need to realise how good science works and recognise the usefulness and limitations of citation figures in assessing the worth (real and potential) of a research program.

    Regarding the astronomy cull, I’ve seen STFC graphs of the number of postdocs as a function of time which supposedly demonstrate the overheating of the astronomy sector in the good years. I’m not in a position to judge whether this is actually the case. However, the way the data were presented implied there was a problem of sustainability. Then again, these graphs came from the STFC…

  2. telescoper Says:

    Citation counts obviously contain some information, but it’s daft to think that they can tell you everything. They’re certainly not a reliable measure of “quality” (whatever that means); they relate more to “impact” (whatever that means). My highest-cited papers are not the ones I regard as the best, but they are the ones that most people seem to find useful.

    The balance between facilities and the human investment needed to exploit them is not at all easy to get right, but it’s so far out of kilter at the moment that it’s ridiculous. In my opinion we should have ditched more facilities and used the money to at least try to do some things well, rather than having lots of shiny kit with nobody to do the science.

  3. […] This post was mentioned on Twitter by Paul Crowther, Peter Coles. Peter Coles said: Turning the Tables: http://wp.me/pko9D-1z5 […]

  4. I don’t know, but my suspicion is, since Hungary is not a big country with respect to space science, that most of its papers involve international collaborations and as such attract more citations, as international-collaboration papers generally do. Maybe the same holds for Scotland.

  5. telescoper Says:

    For a country with a population of around 5 million, Scotland has quite a number of largish groups, including Edinburgh, St Andrews and Glasgow. I don’t know, but I doubt whether Scottish astronomers are more involved in international projects than their English (or Welsh) equivalents.

    Incidentally, Andy Fabian includes in his article an argument that research funds should be concentrated in a small number of large groups, presumably including Cambridge. I don’t agree with this, because not everyone who’s good is in a big group and not everyone who’s in a big group is good. I’d suggest that good people in small groups merit funding much more than mediocre ones who happen to be in big departments. Funding should be based on scientific merit. But how to measure that?

  6. One can (and some countries do) make the following argument: give more money to the less-productive groups. The idea is that they can use the money for improvement. If one gives it to the best groups, then they just get even better, and get more money next time as well. (Of course, if a bad group doesn’t improve as a result of having more money, it shouldn’t continue to collect it.)

    Hand-in-hand with allocating money on the basis of “quality” goes the role of rankings. Two things many people don’t realise: first, only ten things can be in the top ten; second, with enough attention to detail, one can always say “x is better than y”. More important are absolute criteria for quality rather than a ranking.

  7. Bring on the exodus – we have just received a new scheme for hiring astronomers here in Oz – the Super Science Fellowships – and there has been a rash of ads for the first round (July 2010), with another round of positions (approved and funded) to come in July 2011.

    The downside is that the applicants have to be within 3 years of their PhD, but I hope we can cream off some of those PhDs the UK is producing but who have nowhere to go 🙂

  8. telescoper Says:

    Could they change to within 30 years of a PhD?

  9. Woken Postdoc Says:

    Whenever I’m mingling with good students and younger postdocs, I urge them (strenuously) to join the exodus. For those UK-born, now is a very good time to seek broader, international experience. I wince to watch the overstayers: they’re like a queue of ducklings crossing the M25.

    The problem is not just STFC’s budget hole; it’s a degenerately top-down, anti-scientific, anti-creative, industrial mentality. Even before the merger, signs of “gambler’s fallacy” popped up in PPARC rhetoric. Once upon a time, there were instrument-motivated posts (PLS), and there were scientifically-motivated posts (on grants and fellowships). The latter were all about addressing the great mysteries of the universe, pursuing a personal thesis using whatever tools the individual researcher deems appropriate. Nowadays, all posts have effectively been degraded into PLS. The STFC explicitly evaluates fellowships and grants for narrow, immediate relevance to its facilities. The attitude seems to be: “We bought all these big machines. Now give the machines some slaves. We’ll concoct a scientific motivation later.” They’ve put the cart before the horse. The craftsman has become subordinate to his tools.

    What outcomes were the chosen bibliometrics designed to favour? Who are the people who preferentially survive the cuts? Trendy backflippers who switch their allegiance from one satellite/facility to another every three years. Lemmings. Bandwagon passengers and passive networkers who ride on massive consortia aimed at mass-producing repetitive, impersonal, 50-author papers. What happens to the independent thinkers, writing good but infrequent two-author papers in hard theory and insightful observation? We all see the trend. Too many of us buckle, and merge into the machine. Watch your younger colleagues as they gradually lose spark and individuality. Warn them to flee.

    Like Cronus, the STFC is eating its young.
