Archive for HEFCE

The South-East Physics Network – The Sequel

Posted in Education, Science Politics on June 18, 2013 by telescoper

Every now and again I’m at a loss for something to blog about when a nice press release comes to the rescue. This announcement has just gone live, and I make no apology for repeating it here!

 

UPDATE: You can now read the University of Sussex's take on this announcement here.

–0–

[SEPnet logo]

New Investment in Physics Teaching and Research in South East England

The South East Physics Network (SEPnet) and HEFCE are delighted to announce plans to invest £13.1 million to sustain physics undergraduate and postgraduate teaching provision, and world-class research facilities, staff and doctoral training, over the five years up to 2018. HEFCE will provide £2.75 million to maintain and expand the network, to establish a dedicated regional graduate training programme for physics postgraduate students, and to address physics-specific issues of student participation and diversity. On top of the HEFCE contribution, each SEPnet partner will support and fund programmes of Outreach, Employability and Research.

The South East Physics Network (SEPnet) was formed after receiving a £12.5 million grant from HEFCE in 2008 as a network of six Physics departments in South East England at the Universities of Kent, Queen Mary University of London, Royal Holloway University of London, Southampton, Surrey and Sussex. The Science and Technology Facilities Council and Rutherford Appleton Laboratory provided additional funds and resources for collaborations in particle physics and astrophysics. The University of Portsmouth joined in 2010. The Open University and the University of Hertfordshire will join the network with effect from 1 August 2013.

SEPnet Phase One has been tremendously successful for the partners in SEPnet and for physics in the region. The Outreach programme, regarded as an exemplar for collaborative outreach, uses the combined knowledge and resources of each partner to provide greater impact and reach, and demonstrates that the whole is greater than the sum of its parts. It has effectively exploited the growing national interest in physics through its wide range of public engagement and schools activities. There has been a substantial increase in applications and intake for physics undergraduate courses: undergraduate numbers in SEPnet physics departments are now 90% higher than in 2007, and applications are up approximately 115%, well above national trends.

Announcing the investment, SEPnet’s Independent Chair Professor Sir William Wakeham said “This is a major success for physics both in the region and nationally. HEFCE’s contribution via SEPnet has enabled the partners in the consortium to grow and develop their physics departments for the long term. Before SEPnet, physics departments had falling student numbers and lacked research diversity. Now they are robust and sustainable and the SEPnet consortium is an exemplar of collaboration in Higher Education.”

David Sweeney, Director of Research, Innovation and Skills, HEFCE said: “We are delighted to see the fruits of a very successful intervention to support what was once a vulnerable subject. HEFCE are pleased to provide funding for a new phase, particularly to address new challenges in the field of postgraduate training and widening participation. The expansion to include new physics departments is a testament to the success of the network and can only act to strengthen and diversify the collaboration.”

Sir Peter Knight, President of the Institute of Physics, expressed strong support for the government’s continued investments in the sciences generally and in physics specifically. “SEPnet has been an undoubted success in sustaining physics in the South East region and has strongly participated in contributing to its beneficial effects nationally. It is an exemplar of collaborative best practice in outreach, employability and research and we now look forward to collaborating in the critical areas of graduate training, public engagement and diversity.”

The specific programmes already being developed by the network include:

  1. A regional Graduate Network built on the strength of current SEPnet research collaborations and graduate training, whose primary objectives will be to:
  • develop and deliver an exemplar programme of PhD transferable and leadership skills training, delivered flexibly to create employment-ready physics doctoral graduates for the economic benefit of the UK;
  • increase employer engagement with HEIs, including PhD internships, industrially-sponsored studentships and Knowledge Transfer fellowships;
  • enhance the impact of SEPnet's research via a clear, collaborative impact strategy;
  • enhance research environment diversity through engagement with Athena SWAN and the IoP's Project Juno.
  2. Expansion of its employer engagement and internship programmes, widening the range of work experiences available to enhance undergraduate (UG) and postgraduate (PG) employability and progression to research degrees.
  3. Enhancement of its Outreach Programme to deliver and disseminate best practice in schools and public engagement and to increase diversity in physics education.

The inclusion of new partners, the Open University and the University of Hertfordshire, broadens the range of teaching and postgraduate research in the network. The University of Reading, which is about to introduce an undergraduate programme in Environmental Physics (in its Department of Meteorology), will join as an associate partner.

A key part of the contributions from each partner is the provision of "SEPnet PhD Studentships", a scheme to attract the brightest and best physics graduates into collaborative research within the network, with joint supervision and a broad technical and professional training programme delivered through the SEPnet Graduate Network.

The network will be led by the University of Southampton. Its Vice-Chancellor, Professor Don Nutbeam, said: "I am delighted that the University of Southampton, in partnership with nine other universities in the region, is able to build on the success of the SEPnet initiative to reinvigorate university physics teaching and research and take it to a new level in the turbulent period ahead for the higher education sector. The SEPnet training programme brings novelty, quality and diversity to the region's physics postgraduates, and we expect it to be a model for other regions and subjects."

Counting for the REF

Posted in Open Access, Science Politics on April 20, 2013 by telescoper

It's a lovely day in Brighton and I'm once again on campus for an Admissions Event at Sussex University, this time for the Mathematics Department in the School of Mathematical and Physical Sciences. After all the terrible weather we've had since I arrived in February, it's a delight and a relief to see the campus at its best for today's crowds. Anyway, now that I've finished my talk and the subsequent chats with prospective students and their guests I thought I'd do a quick blogette before heading back home and preparing for this evening's Physics & Astronomy Ball. It's all go around here.

What I want to do first of all is to draw attention to a very nice blog post by a certain Professor Moriarty who, in case you did not realise it, dragged himself away from his hiding place beneath the Reichenbach Falls and started a new life as Professor of Physics at Nottingham University.  Phil Moriarty’s piece basically argues that the only way to really judge the quality of a scientific publication is not by looking at where it is published, but by peer review (i.e. by getting knowledgeable people to read it). This isn’t a controversial point of view, but it does run counter to the current mania for dubious bibliometric indicators, such as journal impact factors and citation counts.

The forthcoming Research Excellence Framework involves an assessment of the research that has been carried out in UK universities over the past five years or so, and a major part of the REF will be the assessment of up to four "outputs" submitted by each research-active member of staff over the relevant period (from 2008 to 2013). Reading Phil's piece might persuade you to be happy that the assessment of the research outputs involved in the REF will be primarily based on peer review. If you are, then I suggest you read on because, as I have blogged about before, although peer review is fine in principle, the way that it will be implemented as part of the REF has me deeply worried.

The first problem arises from the scale of the task facing members of the panel undertaking this assessment. Each research-active member of staff is requested to submit four research publications ("outputs") to the panel, and we are told that each of these will be read by at least two panel members. The panel comprises 20 members.

As a rough guess let’s assume that the UK has about 40 Physics departments, and the average number of research-active staff in each is probably about 40. That gives about 1600 individuals for the REF. Actually the number of category A staff submitted to the 2008 RAE was 1,685.57 FTE (Full-Time Equivalent), pretty  close to this figure. At 4 outputs per person that gives 6400 papers to be read. We’re told that each will be read by at least two members of the panel, so that gives an overall job size of 12800 paper-readings. There is some uncertainty in these figures because (a) there is plenty of evidence that departments are going to be more selective in who is entered than was the case in 2008 and (b) some departments have increased their staff numbers significantly since 2008. These two factors work in opposite directions so not knowing the size of either it seems sensible to go with the numbers from the previous round for the purposes of my argument.

There are 20 members of the panel so 6400 papers submitted means that, between 29th November 2013 (the deadline for submissions) and the announcement of the results in December 2014 each member of the panel will have to have read 640 research papers. That’s an average of about two a day…
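As a sanity check on that arithmetic, here is a minimal back-of-envelope sketch in Python using the same rough figures quoted above (40 departments, 40 research-active staff each, 4 outputs per person, at least 2 readings per output, a 20-member panel, and roughly a year between the submission deadline and the results). To be clear, these are my own ballpark guesses, not official REF statistics.

```python
# Back-of-envelope estimate of the REF Physics panel reading load.
# All figures are the rough guesses from the text above, not official data.

departments = 40            # approximate number of UK physics departments
staff_per_department = 40   # rough average of research-active staff per department
outputs_per_person = 4      # outputs submitted per research-active member of staff
readings_per_output = 2     # each output read by at least two panel members
panel_members = 20          # size of the REF Physics sub-panel
reading_days = 365          # roughly a year between submission and results

staff = departments * staff_per_department             # ~1600 individuals
outputs = staff * outputs_per_person                   # ~6400 papers
total_readings = outputs * readings_per_output         # ~12800 paper-readings
readings_per_member = total_readings / panel_members   # ~640 per panel member

print(f"{readings_per_member:.0f} papers per panel member, "
      f"about {readings_per_member / reading_days:.1f} per day, every day")
```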

It is therefore blindingly obvious that whatever the panel actually does, it will not be a thorough peer review of each paper, equivalent to refereeing it for publication in a journal. The panel members simply won't have the time to do what the REF administrators claim they will do. We will be lucky if they manage a quick skim of each paper before moving on. In other words, it's a sham.

Now we are also told the panel will use their expert judgment to decide which outputs belong to the following categories:

  • 4*  World Leading
  • 3* Internationally Excellent
  • 2* Internationally Recognized
  • 1* Nationally Recognized
  • U   Unclassified

There is an expectation that the so-called QR funding allocated as a result of the 2014 REF will be heavily weighted towards 4*, with perhaps a small allocation to 3* and probably nothing at all for lower grades. The word on the street is that the weighting for 4* will be 9 and that for 3* only 1, so "Internationally Recognized" research will be regarded as worthless in the view of HEFCE. Will the papers belonging to the category "Not really understood by the panel member" suffer the same fate?

The panel members will apparently know enough about every single one of the papers they are going to read to place them into one of the above categories, especially the crucial ones, "world-leading" and "internationally excellent", both of which are obviously defined in a completely transparent and objective manner. Not. The steep increase in weighting between 3* and 4* means that a difference of judgement at that boundary could produce a drop in funding large enough to spell closure for a department.
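To see how steep that cliff is, here is a small illustrative sketch assuming the rumoured weightings (9 for 4*, 1 for 3*, nothing below); the weightings are rumour and the two submission profiles are entirely hypothetical, not anything published by HEFCE.

```python
# How the rumoured QR weightings (4* = 9, 3* = 1, 2* and below = 0) turn a
# small shift in judged quality into a large funding difference.
# The weightings are rumour and the two department profiles are hypothetical.

weights = {"4*": 9, "3*": 1, "2*": 0, "1*": 0, "U": 0}

def qr_score(profile):
    """Weighted quality score for a submission profile given as fractions of outputs."""
    return sum(weights[grade] * fraction for grade, fraction in profile.items())

# Department A: 30% of its outputs judged 4*, 60% judged 3*, 10% judged 2*.
dept_a = {"4*": 0.3, "3*": 0.6, "2*": 0.1, "1*": 0.0, "U": 0.0}
# Department B: the same work, but one output in ten nudged from 4* down to 3*.
dept_b = {"4*": 0.2, "3*": 0.7, "2*": 0.1, "1*": 0.0, "U": 0.0}

score_a, score_b = qr_score(dept_a), qr_score(dept_b)
print(f"A scores {score_a:.1f}, B scores {score_b:.1f}; "
      f"B would get about {100 * score_b / score_a:.0f}% of A's QR funding")
```

On those assumptions, nudging just one output in ten from 4* to 3* cuts a department's QR allocation by roughly a quarter, which is exactly why the panels' subjective judgements matter so much.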

We are told that after forming this judgement based on their expertise the panel members will "check" the citation information for the papers. This will be done using the SCOPUS service provided (no doubt at considerable cost) by Elsevier, which by sheer coincidence also happens to be a purveyor of ridiculously overpriced academic journals. No doubt Elsevier are on a nice little earner peddling meaningless data for the HEFCE bean-counters, but I have no confidence that they will add any value to the assessment process.

There have been high-profile statements to the effect that the REF will take no account of where the relevant “outputs”  are published, including a pronouncement by David Willetts. On the face of it, that would suggest that a paper published in the spirit of Open Access in a free archive would not be disadvantaged. However, I very much doubt that will be the case.

I think if you look at the volume of work facing the REF panel members it’s pretty clear that citation statistics will be much more important for the Physics panel than we’ve been led to believe. The panel simply won’t have the time or the breadth of understanding to do an in-depth assessment of every paper, so will inevitably in many cases be led by bibliometric information. The fact that SCOPUS doesn’t cover the arXiv means that citation information will be entirely missing from papers just published there.

The involvement of  a company like Elsevier in this system just demonstrates the extent to which the machinery of research assessment is driven by the academic publishing industry. The REF is now pretty much the only reason why we have to use traditional journals. It would be better for research, better for public accountability and better economically if we all published our research free of charge in open archives. It wouldn’t be good for academic publishing houses, however, so they’re naturally very keen to keep things just the way they are. The saddest thing is that we’re all so cowed by the system that we see no alternative but to participate in this scam.

Incidentally we were told before the 2008 Research Assessment Exercise that citation data would emphatically not be used;  we were also told afterwards that citation data had been used by the Physics panel. That’s just one of the reasons why I’m very sceptical about the veracity of some of the pronouncements coming out from the REF establishment. Who knows what they actually do behind closed doors?  All the documentation is shredded after the results are published. Who can trust such a system?

To put it bluntly, the apparatus of research assessment has done what most bureaucracies eventually do; it has become  entirely self-serving. It is imposing increasingly  ridiculous administrative burdens on researchers, inventing increasingly  arbitrary assessment criteria and wasting increasing amounts of money on red tape which should actually be going to fund research.

And that’s all just about “outputs”. I haven’t even started on “impact”….

Critical Masses

Posted in Education, Science Politics on January 26, 2013 by telescoper

One of the interesting bits of news floating around academia at the moment is the announcement that my current employer (until the end of next week), Cardiff University, is to join forces with the Universities of Bath, Exeter and Bristol in an alliance intended to create a 'critical mass of knowledge' and help Cardiff 'better compete for more research income' (apparently by pretending to be in England rather than in Wales). How successful this will be – or even what form this alliance will take – remains to be seen.

There's been a lot of gossip about what inspired this move, but it's not the first attempt to create a collaborative bloc of this kind. Last year five universities from the Midlands announced plans to do something similar. The "M5" group of Birmingham, Leicester, Loughborough, Nottingham and Warwick got together primarily to share infrastructure in order to help them win grants, which is probably what also lies behind the Cardiff-Bath-Exeter-Bristol deal.

Of course there are also myriad alliances at the level of individual Schools and Departments. I'll shortly be joining the University of Sussex, which is a major player in SEPnet – the South-East Physics Network, which was set up with help from HEFCE. There are other such networks in England, as well as SUPA in Scotland, funded by the devolved Scottish Funding Council. Attempts to form a similar arrangement for Physics in Wales were given short shrift by the Welsh funding council, HEFCW. The inability or unwillingness of HEFCW to properly engage with research in Wales is no doubt behind Cardiff's decision to seek alliances with English universities, but I wonder how it will translate into funding. Surely HEFCE wouldn't be allowed to fund a Welsh university, so presumably this is aimed more at funding from the research councils or further afield, perhaps in Europe. Or perhaps the idea is that if GW4 can persuade HEFCE to fund Bath, Bristol and Exeter, HEFCW will be shamed into stumping up something for Cardiff? Sneaky.

Anyway, good luck to the new "GW4" alliance. Although I'm moving to pastures new I'll certainly keep an eye on any developments, and hope that they're positive. The only thing that really disturbs me is that the name "Great Western Four" is apparently inspired by the Great Western Railway, now run by an outfit called First Great Western. My recent experiences of travelling on that have left a lot to be desired and I'm sure the name will have negative connotations in the minds of many who are fed up with their unreliable, overcrowded, overpriced and poorly managed services. They say a rose by any other name would smell as sweet, but so far this is only a name – and one with a distinctly questionable odour.

REF moves the goalposts (again)

Posted in Bad Statistics, Education, Science Politics on January 18, 2013 by telescoper

The topic of the dreaded 2014 Research Excellence Framework came up several times, in several different contexts, over the last few days, which reminded me that I should comment on a news item that appeared a week or so ago.

As you may or may not be aware, the REF is meant to assess the excellence of university departments in various disciplines and distribute its "QR" research funding accordingly. Institutions complete submissions which include details of relevant publications etc., and then a panel sits in judgement. I've already blogged about all this: the panels clearly won't have time to read every paper submitted in any detail at all, so the outcome is likely to be highly subjective. Moreover, HEFCE's insane policy of awarding the bulk of its research funds to only the very highest grade (4* – "world leading") means that small variations in judged quality will turn into enormous discrepancies in the level of research funding. The whole thing is madness, but there seems no way to inject sanity into the process as the deadline for submissions remorselessly approaches.

Now another wrinkle has appeared on the already furrowed brows of those preparing REF submissions. The system allows departments to select which staff to enter; it's not necessary for everyone to go in. Indeed, if only the very best researchers are entered then the typical score for the department will be high, so it will appear higher up in the league tables, and since the cash goes primarily to the top dogs this might produce almost as much money as including a few less highly rated researchers.

On the other hand, this is a slightly dangerous strategy because it presupposes that one can predict which researchers and what research will be awarded the highest grade. A department will come a cropper if all its high fliers are deemed by the REF panels to be turkeys.

In Wales there’s something that makes this whole system even more absurd, which is that it’s almost certain that there will be no QR funding at all. Welsh universities are spending millions preparing for the REF despite the fact that they’ll get no money even if they do stunningly well. The incentive in Wales is therefore even stronger than it is in England to submit only the high-fliers, as it’s only the position in the league tables that will count.

The problem with a department adopting the strategy of being very selective is that it could have a very negative effect on the career development of younger researchers if they are not included in their department's REF submission. There is also the risk that people who manage to convince their Head of School that they are bound to get four stars in the REF may not have the same success with the various grey eminences who make the decision that really matters.

Previous incarnations of the REF (namely the Research Assessment Exercises of 2008 and 2001) did not publish explicit information about exactly how many eligible staff were omitted from the submissions, largely because departments were extremely creative in finding ways of hiding staff they didn’t want to include.

Now, however, it appears there are plans that the Higher Education Statistics Agency (HESA) will publish its own figures on how many staff it thinks are eligible for inclusion in each department. I'm not sure how accurate these figures will be, but they will change the game, in that they will allow compilers of league tables to draw up lists of the departments that prefer playing games to just allowing the REF panels to judge the quality of their research.

I wonder how many universities are hastily revising their submission plans in the light of this new twist?

Reffing Madness

Posted in Science Politics on June 30, 2012 by telescoper

I’m motivated to make a quick post in order to direct you to a blog post by David Colquhoun that describes the horrendous behaviour of the management at Queen Mary, University of London in response to the Research Excellence Framework. It seems that wholesale sackings are in the pipeline there as a result of a management strategy to improve the institution’s standing in the league tables by “restructuring” some departments.

To call this strategy "flawed" would be the understatement of the year. Idiotic is a far better word. The main problem is that the criteria being applied to retain or dismiss staff bear no obvious relation to those adopted by the REF panels. To make matters worse, Queen Mary has charged two of its own academics with "gross misconduct" for having the temerity to point out the stupidity of its management's behaviour. Read on here for more details.

With the deadline for REF submissions fast approaching, it’s probably the case that many UK universities are going into panic mode, attempting to boost their REF score by shedding staff perceived to be insufficiently excellent in research and/or  luring  in research “stars” from elsewhere. Draconian though the QMUL approach may seem, I fear it will be repeated across the sector.  Clueless university managers are trying to guess what the REF panels will think of their submissions by staging mock assessments involving external experts. The problem is that nobody knows what the actual REF panels will do, except that if the last Research Assessment Exercise is anything to go by, what they do will be nothing like what they said they would do.

Nowhere is the situation more absurd than here in Wales. The purported aim of the REF is to allocate the so-called "QR" research funding to universities. However, it is an open secret that in Wales there simply isn't going to be any QR money at all. Leighton Andrews has stripped the Higher Education budget bare in order to pay for his policy of encouraging Welsh students to study in England by paying their fees there.

So here we have to enter the game, do the mock assessments, write our meaningless “impact” cases, and jump through all manner of pointless hoops, with the inevitable result that even if we do well we’ll get absolutely no QR money at the end of it. The only strategy that makes sense for Welsh HEIs such as Cardiff University, where I work, is to submit only those researchers guaranteed to score highly. That way at least we’ll do better in the league tables. It won’t matter how many staff actually get submitted, as the multiplier is zero.

There's no logical argument why Welsh universities should be in the REF at all, given that there's no reward at the end, but we're told by the powers that be that we have to take part. Everyone's playing games in which nobody knows the rules but in which the stakes are people's careers. It's madness.

I can’t put it better than this quote:

These managers worry me. Too many are modest achievers, retired from their own studies, intoxicated with jargon, delusional about corporate status and forever banging the metrics gong. Crucially, they don’t lead by example.

Any reader of this blog who works in a university will recognize the sentiments expressed there. But let’s not blame it all on the managers. They’re doing stupid things because the government has set up a stupid framework. There isn’t a single politician in either England or Wales with the courage to do the right thing, i.e. to admit the error and call the whole thing off.

The Transparent Dishonesty of the Research Excellence Framework

Posted in Open Access, Science Politics on May 30, 2012 by telescoper

Some of my colleagues in the School of Physics & Astronomy recently attended a briefing session about the  forthcoming Research Excellence Framework. This, together with the post I reblogged earlier this morning, suggested that I should re-hash an article I wrote some time ago about the arithmetic of the REF, and how it will clearly not do what it says on the tin.

The first thing to note is the scale of the task facing members of the panel undertaking the assessment. Every research-active member of staff in every university in the UK is requested to submit four research publications ("outputs") to the panel, and we are told that each of these will be read by at least two panel members. The Physics panel comprises 20 members.

As a rough guess I’d say that the UK has about 40 Physics departments, and the average number of research-active staff in each is probably about 40. That gives about 1600 individuals for the REF. Actually the number of category A staff submitted to the 2008 RAE was 1,685.57 FTE (Full-Time Equivalent), pretty close to this figure. At 4 outputs per person that gives 6400 papers to be read. We’re told that each will be read by at least two members of the panel, so that gives an overall job size of 12800 paper-readings. There are 20 members of the panel, so that means that between 29th November 2013 (the deadline for submissions) and the announcement of the results in December 2014 each member of the panel will have to have read 640 research papers. That’s an average of about two a day. Every day. Weekends included.

Now we are told the panel will use their expert judgment to decide which outputs belong to the following categories:

  • 4*  World Leading
  • 3* Internationally Excellent
  • 2* Internationally Recognized
  • 1* Nationally Recognized
  • U   Unclassified

There is an expectation that the so-called QR funding allocated as a result of the 2014 REF will be heavily weighted towards 4*, with perhaps a small allocation to 3* and probably nothing at all for lower grades. In other words, "Internationally Recognized" research will probably be deemed completely worthless by HEFCE. Will the papers belonging to the category "Not really understood by the panel member" suffer the same fate?

The panel members will apparently know enough about every single one of the papers they are going to read in order to place them  into one of the above categories, especially the crucial ones “world-leading” or “internationally excellent”, both of which are obviously defined in a completely transparent and objective manner. Not.

We are told that after forming this judgement based on their expertise the panel members will "check" the citation information for the papers. This will be done using the SCOPUS service provided (no doubt at considerable cost) by Elsevier, which by sheer coincidence also happens to be a purveyor of ridiculously overpriced academic journals. No doubt Elsevier are on a nice little earner peddling meaningless data for the HEFCE bean-counters, but I haven't any confidence that it will add much value to the assessment process.

There have been high-profile statements to the effect that the REF will take no account of where the relevant “outputs”  are published, including a recent pronouncement by David Willetts. On the face of it, that would suggest that a paper published in the spirit of Open Access in a free archive would not be disadvantaged. However, I very much doubt that will be the case.

I think if you look at the volume of work facing the REF panel members it’s pretty clear that citation statistics will be much more important for the Physics panel than we’ve been led to believe. The panel simply won’t have the time or the breadth of understanding to do an in-depth assessment of every paper, so will inevitably in many cases be led by bibliometric information. The fact that SCOPUS doesn’t cover the arXiv means that citation information will be entirely missing from papers just published there.

The involvement of  a company like Elsevier in this system just demonstrates the extent to which the machinery of research assessment is driven by the academic publishing industry. The REF is now pretty much the only reason why we have to use traditional journals. It would be better for research, better for public accountability and better economically if we all published our research free of charge in open archives. It wouldn’t be good for academic publishing houses, however, so they’re naturally very keen to keep things just the way they are. The saddest thing is that we’re all so cowed by the system that we see no alternative but to participate in this scam.

Incidentally we were told before the 2008 Research Assessment Exercise that citation data would emphatically not be used;  we were also told afterwards that citation data had been used by the Physics panel. That’s just one of the reasons why I’m very sceptical about the veracity of some of the pronouncements coming out from the REF establishment. Who knows what they actually do behind closed doors?  All the documentation is shredded after the results are published. Who can trust such a system?

To put it bluntly, the apparatus of research assessment has done what most bureaucracies eventually do; it has become  entirely self-serving. It is imposing increasingly  ridiculous administrative burdens on researchers, inventing increasingly  arbitrary assessment criteria and wasting increasing amounts of money on red tape which should actually be going to fund research.

Is this the “Squeezed Middle”?

Posted in Education, Finance on March 29, 2012 by telescoper

As reported in the Times Higher, the Higher Education Funding Council for England (HEFCE) has announced its allocations to English Higher Education Institutions for 2012/13. As expected, many universities are receiving substantial cuts next year. Here is a table of the biggest losers:

The Times Higher article describes this as the “Squeezed Middle”. It looks more like the “Squeezed Bottom” to me, but then I suppose that would have made an inappropriate headline.

Is there really a University of Sunderland?

Anyway, this allows me the chance to congratulate the former Director of Learning and Teaching in the School of Physics & Astronomy at Cardiff University on his move to the University of Central Lancashire, currently riding high at Number 7 in the above table…

 
