Archive for risk

Fear, Risk, Uncertainty and the European Union

Posted in Politics, Science Politics, The Universe and Stuff on April 11, 2016 by telescoper

I’ve been far too busy with work and other things to contribute as much as I’d like to the ongoing debate about the forthcoming referendum on Britain’s membership of the European Union. Hopefully I’ll get time for a few posts before June 23rd, which is when the United Kingdom goes to the polls.

For the time being, however, I’ll just make a quick comment about one phrase that is being bandied about in this context, namely Project Fear. As far as I am aware this expression first came up in the context of last year’s referendum on Scottish independence, but it’s now being used by the “leave” campaign to describe some of the arguments used by the “remain” campaign. I’ve met this phrase myself rather often on social media such as Twitter, usually in use by a BrExit campaigner accusing me of scaremongering because I think there’s a significant probability that leaving the EU will cause the UK serious economic problems.

Can I prove that this is the case? No, of course not. Nobody will know unless and until we try leaving the EU. But my point is that there’s definitely a risk. It seems to me grossly irresponsible to argue – as some clearly are doing – that there is no risk at all.

This is all very interesting for those of us who work in university science departments because “Risk Assessments” are one of the things we teach our students to do as a matter of routine, especially in advance of experimental projects. In case you weren’t aware, a risk assessment is

…a systematic examination of a task, job or process that you carry out at work for the purpose of identifying the significant hazards that are present (a hazard is something that has the potential to cause someone harm or ill health).

Perhaps we should change the name of our “Project Risk Assessments” to “Project Fear”?

I think this all demonstrates how very bad most people are at thinking rationally about uncertainty, to such an extent that even thinking about potential hazards is verboten. I’ve actually written a book about uncertainty in the physical sciences, partly in an attempt to counter the myth that science deals with absolute certainties. And if physics doesn’t, economics definitely can’t.

In this context it is perhaps worth mentioning the definitions of “uncertainty” and “risk” suggested by Frank Hyneman Knight in a book on economics called Risk, Uncertainty and Profit, which seem to be in standard use in the social sciences. The distinction made there is that “risk” is “randomness” with “knowable probabilities”, whereas “uncertainty” involves “randomness” with “unknowable probabilities”.

I don’t like these definitions at all. For one thing they both involve a reference to “randomness”, a word which I don’t know how to define anyway; I’d be much happier to use “unpredictability”. In the context of BrExit there is unpredictability because we don’t have any hard information on which to base a prediction. Even more importantly, perhaps, I find the distinction between “knowable” and “unknowable” probabilities very problematic. One always knows something about a probability distribution, even if that something means that the distribution has to be very broad. And in any case these definitions imply that the probabilities concerned are “out there”, rather than being statements about a state of knowledge (or lack thereof). Sometimes we know what we know and sometimes we don’t, but there are more than two possibilities. As the great American philosopher and social scientist Donald Rumsfeld (Shurely Shome Mishtake? Ed) put it:

“…as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don’t know we don’t know.”

There may be a proper Bayesian formulation of the distinction between “risk” and “uncertainty” that involves a transition between prior-dominated (uncertain) and posterior-dominated (risky), but basically I don’t see any qualitative difference between the two from such a perspective.
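To illustrate the sort of transition I mean, here is a minimal sketch (my own toy example, not anything from Knight) using the standard conjugate Beta-Bernoulli model: with no data the broad prior dominates and the spread of the distribution is large, while after many observations the posterior narrows. The point is that the change is gradual, so there is no sharp line between “unknowable” and “knowable” probabilities:

```python
# Toy illustration: Bayesian updating of a Beta prior on an unknown
# Bernoulli probability p. Prior-dominated ("uncertainty") shades
# continuously into posterior-dominated ("risk") as data accumulate.

def beta_posterior(successes, failures, a=1.0, b=1.0):
    """Return (mean, std) of the Beta(a+successes, b+failures) posterior,
    starting from a flat Beta(a, b) prior."""
    a_post = a + successes
    b_post = b + failures
    n = a_post + b_post
    mean = a_post / n
    var = (a_post * b_post) / (n ** 2 * (n + 1))
    return mean, var ** 0.5

# No data: flat prior, mean 0.5, std ~ 0.29 -- very broad, "uncertain"
print(beta_posterior(0, 0))

# 100 observations (70 successes): std ~ 0.045 -- narrow, "risky"
print(beta_posterior(70, 30))
```

In both regimes one "knows" a probability distribution; only its width differs, which is why the knowable/unknowable dichotomy looks artificial from this perspective.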

When it comes to the EU referendum, the probabilities of different outcomes are difficult to calculate because of the complexity of economics generally and the dynamics of trade within and beyond the European Union in particular. Moreover, probabilities need to be updated using quantitative evidence and we don’t actually have any of that. But it seems absurd to try to argue that there is neither any risk nor any uncertainty. Frankly, anyone who argues this is just being irrational.

Whether a risk is worth taking depends on the likely profit. Nobody has convinced me that the country as a whole will gain anything concrete if we leave the European Union, so the risk seems pointless. Cui Bono? I think you’ll find the answer to that among the hedge fund managers who are bankrolling the BrExit campaign…


Uncertainty, Risk and Probability

Posted in Bad Statistics, Science Politics on March 2, 2015 by telescoper

Last week I attended a very interesting event on the Sussex University campus, the Annual Marie Jahoda Lecture, which was given this year by Prof. Helga Nowotny, a distinguished social scientist. The title of the talk was A social scientist in the land of scientific promise and the abstract was as follows:

Promises are a means of bringing the future into the present. Nowhere is this insight by Hannah Arendt more applicable than in science. Research is a long and inherently uncertain process. The question is open which of the multiple possible, probable or preferred futures will be actualized. Yet, scientific promises, vague as they may be, constitute a crucial link in the relationship between science and society. They form the core of the metaphorical ‘contract’ in which support for science is stipulated in exchange for the benefits that science will bring to the well-being and wealth of society. At present, the trend is to formalize scientific promises through impact assessment and measurement. Against this background, I will present three case studies from the life sciences: assisted reproductive technologies, stem cell research and the pending promise of personalized medicine. I will explore the uncertainty of promises as well as the cunning of uncertainty at work.

It was a fascinating and wide-ranging lecture that touched on many themes. I won’t try to comment on all of them, but just pick up on a couple that struck me from my own perspective as a physicist. One was the increasing aversion to risk demonstrated by research funding agencies, such as the European Research Council, which she helped set up but described in the lecture as “a clash between a culture of trust and a culture of control”. This will ring true to any scientist applying for grants even in “blue skies” disciplines such as astronomy: we tend to trust our peers, who have some control over funding decisions, but the machinery of control from above gets stronger every day. Milestones and deliverables are everything. Sometimes I think that in order to get funding you have to be so confident of the outcomes of your research that you have to have already done it, in which case funding isn’t even necessary. The importance of extremely speculative research is rarely recognized, although that is where there is the greatest potential for truly revolutionary breakthroughs.

Another theme that struck me was the role of uncertainty and risk. This grabbed my attention because I’ve actually written a book about uncertainty in the physical sciences. In her lecture, Prof. Nowotny referred to the definition (which was quite new to me) of these two terms by Frank Hyneman Knight in a book on economics called Risk, Uncertainty and Profit. The distinction made there is that “risk” is “randomness” with “knowable probabilities”, whereas “uncertainty” involves “randomness” with “unknowable probabilities”. I don’t like these definitions at all. For one thing they both involve a reference to “randomness”, a word which I don’t know how to define anyway; I’d be much happier to use “unpredictability”. Even more importantly, perhaps, I find the distinction between “knowable” and “unknowable” probabilities very problematic. One always knows something about a probability distribution, even if that something means that the distribution has to be very broad. And in any case these definitions imply that the probabilities concerned are “out there”, rather than being statements about a state of knowledge (or lack thereof). Sometimes we know what we know and sometimes we don’t, but there are more than two possibilities. As the great American philosopher and social scientist Donald Rumsfeld (Shurely Shome Mishtake? Ed) put it:

“…as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don’t know we don’t know.”

There may be a proper Bayesian formulation of the distinction between “risk” and “uncertainty” that involves a transition between prior-dominated (uncertain) and posterior-dominated (risky), but basically I don’t see any qualitative difference between the two from such a perspective.

Anyway, it was a very interesting lecture that differed from many talks I’ve attended about the sociology of science in that the speaker clearly understood a lot about how science actually works. The Director of the Science Policy Research Unit invited the Heads of the Science Schools (including myself) to dinner with the speaker afterwards, and that led to the generation of many interesting ideas about how we (I mean scientists and social scientists) might work better together in the future, something we really need to do.