Archive for Cosmology

The Dark Matter of Astronomy Hype

Posted in Astrohype, Bad Statistics, The Universe and Stuff on April 16, 2018 by telescoper

Just before Easter (and, perhaps more significantly, just before April Fool’s Day) a paper by van Dokkum et al. was published in Nature with the title A Galaxy Lacking Dark Matter. As is often the case with scientific publications presented in Nature, the press machine kicked into action and stories about this mysterious galaxy appeared in print and online all round the world.

So what was the result? Here’s the abstract of the Nature paper:

 

Studies of galaxy surveys in the context of the cold dark matter paradigm have shown that the mass of the dark matter halo and the total stellar mass are coupled through a function that varies smoothly with mass. Their average ratio M_halo/M_stars has a minimum of about 30 for galaxies with stellar masses near that of the Milky Way (approximately 5 × 10^10 solar masses) and increases both towards lower masses and towards higher masses. The scatter in this relation is not well known; it is generally thought to be less than a factor of two for massive galaxies but much larger for dwarf galaxies. Here we report the radial velocities of ten luminous globular-cluster-like objects in the ultra-diffuse galaxy NGC1052–DF2, which has a stellar mass of approximately 2 × 10^8 solar masses. We infer that its velocity dispersion is less than 10.5 kilometres per second with 90 per cent confidence, and we determine from this that its total mass within a radius of 7.6 kiloparsecs is less than 3.4 × 10^8 solar masses. This implies that the ratio M_halo/M_stars is of order unity (and consistent with zero), a factor of at least 400 lower than expected. NGC1052–DF2 demonstrates that dark matter is not always coupled with baryonic matter on galactic scales.
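
As a sanity check on the numbers in the abstract, the dynamical mass implied by a velocity dispersion σ of tracers within radius R scales as M ~ σ²R/G, up to a coefficient of order unity that depends on the mass profile and the tracer orbits (I'm dropping that coefficient here, so this is strictly order-of-magnitude):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
KPC = 3.0857e19      # kiloparsec in metres

sigma = 10.5e3       # quoted 90% upper limit on velocity dispersion, m/s
radius = 7.6 * KPC   # radius enclosing the tracers, m

# Crude dynamical mass estimate, M ~ sigma^2 R / G.
mass_kg = sigma**2 * radius / G
mass_msun = mass_kg / M_SUN
print(f"M < ~{mass_msun:.1e} solar masses")
```

This lands within a factor of two of the paper's quoted limit of 3.4 × 10^8 solar masses (the difference being the order-unity coefficient I dropped), and it is indeed comparable to the stellar mass of about 2 × 10^8 solar masses, which is where the claim of little or no dark matter comes from.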

 

I had a quick look at the paper at the time and wasn’t very impressed by the quality of the data. To see why, look at the main plot, a histogram formed from just ten observations (of globular clusters used as velocity tracers):

I didn’t have time to read the paper thoroughly before the Easter weekend, but I did draft a sceptical blog post about it, only to decide not to publish it because I thought it might be too inflammatory even by my standards! Suffice to say that I was unconvinced.

Anyway, it turns out I was far from the only astrophysicist to have doubts about this result; you can find a nice summary of the discussion on social media here and here. Fortunately, people more expert than me have found the time to look in more detail at the van Dokkum et al. claim. There’s now a paper on the arXiv by Martin et al.:

It was recently proposed that the globular cluster system of the very low surface-brightness galaxy NGC1052-DF2 is dynamically very cold, leading to the conclusion that this dwarf galaxy has little or no dark matter. Here, we show that a robust statistical measure of the velocity dispersion of the tracer globular clusters implies a mundane velocity dispersion and a poorly constrained mass-to-light ratio. Models that include the possibility that some of the tracers are field contaminants do not yield a more constraining inference. We derive only a weak constraint on the mass-to-light ratio of the system within the half-light radius or within the radius of the furthest tracer (M/L_V<8.1 at the 90-percent confidence level). Typical mass-to-light ratios measured for dwarf galaxies of the same stellar mass as NGC1052-DF2 are well within this limit. With this study, we emphasize the need to properly account for measurement uncertainties and to stay as close as possible to the data when determining dynamical masses from very small data sets of tracers.
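
The statistical point Martin et al. are making can be illustrated with a toy calculation. The velocities and error bars below are invented for illustration, not the real NGC1052-DF2 data, but they share the key feature: only ten tracers, with measurement errors comparable to the dispersion being sought. A simple maximum-likelihood estimate of the intrinsic dispersion, with each tracer broadened by its own measurement error, then comes out both small and poorly constrained:

```python
import math

# Toy data: line-of-sight velocities relative to the systemic
# velocity (km/s) and their measurement errors (km/s).
# Invented numbers for illustration, not the real DF2 data.
v = [-5.0, 2.0, 11.0, 3.0, -1.0, 6.0, -4.0, 8.0, 0.0, 5.0]
err = [5.0, 6.0, 4.0, 7.0, 5.0, 6.0, 5.0, 8.0, 6.0, 5.0]

mean = sum(v) / len(v)
d = [vi - mean for vi in v]

def log_like(sigma):
    """Gaussian log-likelihood for intrinsic dispersion sigma,
    with each tracer's variance broadened by its measurement error."""
    total = 0.0
    for di, ei in zip(d, err):
        var = sigma**2 + ei**2
        total += -0.5 * math.log(2 * math.pi * var) - 0.5 * di**2 / var
    return total

grid = [0.1 * i for i in range(201)]   # trial dispersions, 0 to 20 km/s
best = max(grid, key=log_like)
print(f"ML intrinsic dispersion: {best:.1f} km/s")

# The likelihood is shallow: even sigma = 10 km/s is only mildly
# disfavoured relative to the best fit.
drop = log_like(best) - log_like(10.0)
print(f"Delta lnL between best fit and 10 km/s: {drop:.2f}")
```

Martin et al.’s actual analysis is fully Bayesian and much more careful than this sketch, but the qualitative point survives: with N = 10 and sizeable error bars, the inferred dispersion (and hence the dynamical mass) is very weakly constrained.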

More information about this system has been posted by Pieter van Dokkum on his website here.

Whatever the final analysis of NGC1052-DF2 eventually shows, it is undoubtedly an interesting system. It may indeed turn out to have less dark matter than expected, though I don’t think the evidence available right now warrants such an inference with such confidence. What worries me most, however, is the way this result was presented in the media, with virtually no regard for the manifest statistical uncertainty inherent in the analysis. This kind of hype can be extremely damaging to science in general, and to explain why I’ll go off on a rant that I’ve indulged in a few times before on this blog.

A few years ago there was an interesting paper (in Nature, of all places), the opening paragraph of which reads:

The past few years have seen a slew of announcements of major discoveries in particle astrophysics and cosmology. The list includes faster-than-light neutrinos; dark-matter particles producing γ-rays; X-rays scattering off nuclei underground; and even evidence in the cosmic microwave background for gravitational waves caused by the rapid inflation of the early Universe. Most of these turned out to be false alarms; and in my view, that is the probable fate of the rest.

The piece went on to berate physicists for being too trigger-happy in claiming discoveries, the BICEP2 fiasco being a prime example. I agree that this is a problem, but it goes far beyond physics. In fact it’s endemic throughout science. A major cause of it is the abuse of statistical reasoning.

Anyway, I thought I’d take the opportunity to re-iterate why statistics and statistical reasoning are so important to science. In fact, I think they lie at the very core of the scientific method, although I am still surprised how few practising scientists are comfortable with even basic statistical language. A more important problem is the popular impression that science is about facts and absolute truths. It isn’t. It’s a process. In order to advance it has to question itself. Getting this message wrong – whether by error or on purpose – is immensely dangerous.

Statistical reasoning also applies to many facets of everyday life, including business, commerce, transport, the media, and politics. Probability even plays a role in personal relationships, though mostly at a subconscious level. Science and technology are deeply embedded in every aspect of what we do each day. Science has given us greater levels of comfort, better health care, and a plethora of labour-saving devices. It has also given us unprecedented ability to destroy the environment and each other, whether through accident or design.

Civilized societies face rigorous challenges in this century. We must confront the threat of climate change and forthcoming energy crises. We must find better ways of resolving conflicts peacefully lest nuclear or chemical or even conventional weapons lead us to global catastrophe. We must stop large-scale pollution or systematic destruction of the biosphere that nurtures us. And we must do all of these things without abandoning the many positive things that science has brought us. Abandoning science and rationality by retreating into religious or political fundamentalism would be a catastrophe for humanity.

Unfortunately, recent decades have seen a wholesale breakdown of trust between scientists and the public at large. This is due partly to the deliberate abuse of science for immoral purposes, and partly to the sheer carelessness with which various agencies have exploited scientific discoveries without proper evaluation of the risks involved. The abuse of statistical arguments has undoubtedly contributed to the suspicion with which many individuals view science.

There is an increasing alienation between scientists and the general public. Many fewer students enrol for courses in physics and chemistry than a few decades ago. Fewer graduates mean fewer qualified science teachers in schools. This is a vicious cycle that threatens our future. It must be broken.

The danger is that the decreasing level of understanding of science in society means that knowledge (as well as its consequent power) becomes concentrated in the minds of a few individuals. This could have dire consequences for the future of our democracy. Even as things stand now, very few Members of Parliament are scientifically literate. How can we expect to control the application of science when the necessary understanding rests with an unelected “priesthood” that is hardly understood by, or represented in, our democratic institutions?

Very few journalists or television producers know enough about science to report sensibly on the latest discoveries or controversies. As a result, important matters that the public needs to know about do not appear at all in the media, or if they do it is in such a garbled fashion that they do more harm than good.

Years ago I used to listen to radio interviews with scientists on the Today programme on BBC Radio 4. I even did such an interview once. It is a deeply frustrating experience. The scientist usually starts by explaining what the discovery is about in the way a scientist should, with careful statements of what is assumed, how the data are interpreted, what other possible interpretations there might be, and the likely sources of error. The interviewer then loses patience and asks for a yes or no answer. The scientist tries to continue, but is badgered. Either the interview ends as a row, or the scientist ends up stating a grossly oversimplified version of the story.

Some scientists offer the oversimplified version at the outset, of course, and these are the ones that contribute to the image of scientists as priests. Such individuals often believe in their theories in exactly the same way that some people believe religiously. Not with the conditional and possibly temporary belief that characterizes the scientific method, but with the unquestioning fervour of an unthinking zealot. This approach may pay off for the individual in the short term, in popular esteem and media recognition – but when it goes wrong it is science as a whole that suffers. When a result that has been proclaimed certain is later shown to be false, the result is widespread disillusionment.

The worst example of this tendency that I can think of is the constant use of the phrase “Mind of God” by theoretical physicists to describe fundamental theories. This is not only meaningless but also damaging. As scientists we should know better than to use it. Our theories do not represent absolute truths: they are just the best we can do with the available data and the limited powers of the human mind. We believe in our theories, but only to the extent that we need to accept working hypotheses in order to make progress. Our approach is pragmatic rather than idealistic. We should be humble and avoid making extravagant claims that can’t be justified either theoretically or experimentally.

The more that people get used to the image of “scientist as priest”, the more dissatisfied they are with real science. Most of the questions asked of scientists simply can’t be answered with “yes” or “no”. This leaves many with the impression that science is very vague and subjective. The public also tend to lose faith in science when it is unable to come up with quick answers. Science is a process, a way of looking at problems, not a list of ready-made answers to impossible problems. Of course it is sometimes vague, but I think it is vague in a rational way and that’s what makes it worthwhile. It is also the reason why science has led to so many objectively measurable advances in our understanding of the World.

I don’t have any easy answers to the question of how to cure this malaise, but I do have a few suggestions. It would be easy for a scientist such as myself to blame everything on the media and the education system, but in fact I think the responsibility lies mainly with ourselves. We are so obsessed with our own research, and with the need to publish specialist papers by the lorry-load in order to advance our careers, that we spend very little time explaining to the public what we do, or why.

I think every working scientist in the country should be required to spend at least 10% of their time working in schools or with the general media on “outreach”, including writing blogs like this. People in my field – astronomers and cosmologists – do this quite a lot, but these are areas where the public has some empathy with what we do. If only biologists, chemists, nuclear physicists and the rest were viewed in such a friendly light. Doing this sort of thing is not easy, especially when it comes to saying something on the radio that the interviewer does not want to hear. Media training for scientists has been a welcome recent innovation for some branches of science, but most of my colleagues have never had any help at all in this direction.

The second thing that must be done is to improve the dire state of science education in schools. Over the last two decades the national curriculum for British schools has been dumbed down to the point of absurdity. Pupils that leave school at 18 having taken “Advanced Level” physics do so with no useful knowledge of physics at all, even if they have obtained the highest grade. I do not at all blame the students for this; they can only do what they are asked to do. It’s all the fault of the educationalists, who have done their best for a long time to convince our young people that science is too hard for them. Science can be difficult, of course, and not everyone will be able to make a career out of it. But that doesn’t mean that it should not be taught properly to those that can take it in. If some students find it is not for them, then so be it. We don’t need everyone to be a scientist, but we do need many more people to understand how science really works.

I realise I must sound very gloomy about this, but I do think there are good prospects that the gap between science and society may gradually be healed. The fact that the public distrust scientists leads many of them to question us, which is a very good thing. They should question us and we should be prepared to answer them. If they ask us why, we should be prepared to give reasons. If enough scientists engage in this process then what will emerge is an understanding of the enduring value of science. I don’t just mean through the DVD players and computer games science has given us, but through its cultural impact. It is part of human nature to question our place in the Universe, so science is part of what we are. It gives us purpose. But it also shows us a way of living our lives. Except for a few individuals, the scientific community is tolerant, open, internationally-minded, and imbued with a philosophy of cooperation. It values reason and looks to the future rather than the past. Like anyone else, scientists will always make mistakes, but we can always learn from them. The logic of science may not be infallible, but it’s probably the best logic there is in a world so filled with uncertainty.

 

 

 


Remembering Clover

Posted in Biographical, Science Politics, The Universe and Stuff on April 10, 2018 by telescoper

I was tidying up some papers in my desk yesterday and came across a clipping dated April 9th 2009, i.e. exactly nine years ago to the day. Amazed by this coincidence, I resolved to post it on here but was unable to work out how to use the new-fangled scanner in the Data Innovation Institute office so had to wait until I could get expert assistance this morning:

Sorry it’s a bit crumpled, but I guess that demonstrates the authenticity of its provenance.

The full story, as it appeared in the print edition of the Western Mail, can also be found online here. By the way it’s me on the stepladder, pretending to know something about astronomical instrumentation.

I wrote at some length about the background to the cancellation of the Clover experiment here. In a nutshell, however, Clover involved the Universities of Cardiff, Oxford, Cambridge and Manchester and was designed to detect the primordial B-mode signal from its vantage point in Chile. The chance to get involved in a high-profile cosmological experiment was one of the reasons I moved to Cardiff from Nottingham almost a decade ago, and I was looking forward to seeing the data arriving for analysis. Although I’m primarily a theorist, I have some experience in advanced statistical methods that might have been useful in analysing the output. It would have been fun blogging about it too.

Unfortunately, however, none of that happened. Because of its budget crisis, and despite the fact that it had already spent a large amount (£4.5M) on Clover, the Science and Technology Facilities Council (STFC) decided to withdraw the funding needed to complete it (£2.5M) and cancel the experiment. I was very disappointed, but that’s nothing compared to Paolo (shown in the picture) who lost his job as a result of the decision and took his considerable skills and knowledge abroad.

We will never know for sure, but if Clover had gone ahead it might well have detected the same signal found five years later by BICEP2, which was announced in 2014. Working at three different frequencies (95, 150 and 225 GHz), Clover would have had a better capability than BICEP2 in distinguishing the primordial signal from contamination by Galactic dust emission (which, as we now know, is the dominant contribution to the BICEP2 result; see thread here), although that still wouldn’t have been easy because of sensitivity issues. In the end the BICEP2 signal was a false alarm so, looking on the bright side, perhaps at least the members of the Clover team avoided making fools of themselves on TV!
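
To see why multi-frequency coverage helps with dust, compare how a dust-like spectrum and the CMB itself scale between 95 and 225 GHz. The sketch below assumes a modified blackbody for the dust, with an emissivity index β ≈ 1.6 and temperature ≈ 20 K; those are typical assumed values for Galactic dust, not measurements of any particular field:

```python
import math

H = 6.626e-34    # Planck constant, J s
K_B = 1.381e-23  # Boltzmann constant, J/K
T_CMB = 2.725    # CMB temperature, K
T_DUST = 20.0    # assumed dust temperature, K
BETA = 1.6       # assumed dust emissivity index

def planck(nu, T):
    """Planck brightness B_nu(T), up to constant prefactors."""
    x = H * nu / (K_B * T)
    return nu**3 / math.expm1(x)

def cmb(nu):
    return planck(nu, T_CMB)

def dust(nu):
    # Modified (grey) blackbody: emissivity scales as nu^beta.
    return nu**BETA * planck(nu, T_DUST)

nu1, nu2 = 95e9, 225e9
cmb_ratio = cmb(nu2) / cmb(nu1)
dust_ratio = dust(nu2) / dust(nu1)
print(f"CMB brightness ratio 225/95 GHz:  {cmb_ratio:.2f}")
print(f"Dust brightness ratio 225/95 GHz: {dust_ratio:.1f}")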

P.S. Note also that I moved to Cardiff in mid-2007, so I had not spent 5 years working on the Clover project by the time it was cancelled as discussed in the newspaper article, but many of my Cardiff colleagues had.

The IKEA Universe

Posted in Biographical, The Universe and Stuff on January 29, 2018 by telescoper

I heard yesterday the sad news of the death of Ingvar Kamprad, the founder of the Swedish furniture chain IKEA.  People can be very snobbish about IKEA, but its emphasis on affordable design has been a boon for people on low incomes for many years. When I was an impoverished postdoc living in London I used it a lot, especially their Billy bookcases. I also have a very sturdy Omar in my bedroom…

I remember that, years ago, while shopping in the IKEA at Neasden, I discovered that they were running a competition, for which entrants had to complete the sentence:

I shop at IKEA because…

My entry completed it thus:

I shop at IKEA because it’s as cheap as fuck.

I didn’t win.

But I digress. Not many people are aware that IKEA also furnishes  important insights into modern cosmology, so I’ll try to explain here. I’ve blogged before about the current state of cosmology, but it’s probably a good idea to give a quick reminder before going any further. We have a standard cosmological model, known as the concordance cosmology, which accounts for most relevant observations in a pretty convincing way and is based on the idea that the Universe began with a Big Bang.  However, there are a few things about this model that are curious, to say the least.

First, there is the spatial geometry of the Universe. According to Einstein’s general theory of relativity, universes come in three basic shapes: closed, open and flat. These are illustrated to the right. The flat space has “normal” geometry in which the interior angles of a triangle add up to 180 degrees. In a closed space the sum of the angles is greater than 180 degrees, and  in an open space it is less. Of course the space we live in is three-dimensional but the pictures show two-dimensional surfaces.

But you get the idea.

The point is that the flat space is very special. The two curved spaces are much more general because they can be described by a parameter called their curvature which could in principle take any value (either positive for a closed space, or negative for an open space). In other words the sphere at the top could have any radius from very small (large curvature) to very large (small curvature). Likewise with the “saddle” representing an open space. The flat space must have exactly zero curvature. There are many ways to be curved, but only one way to be flat.

Yet, as near as dammit, our Universe appears to be flat. So why, with all the other options theoretically available to it, did the Universe decide to choose the most special one, which, in my opinion, also happens to be the most boring?

Then there is the way the Universe is put together. In order to be flat there must be an exact balance between the energy contained in the expansion of the Universe (positive kinetic energy) and the energy involved in the gravitational interactions between everything in it (negative potential energy). In general relativity, you see, the curvature relates to the total amount of energy.
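
The balance described above is just the Friedmann equation, a standard textbook statement rather than anything new: the expansion rate H, the total energy density ρ and the curvature k are related by

```latex
H^2 = \frac{8\pi G}{3}\rho - \frac{kc^2}{a^2},
\qquad
k = 0 \;\Longleftrightarrow\; \rho = \rho_{\mathrm{crit}} \equiv \frac{3H^2}{8\pi G},
```

so a flat Universe is one whose density sits at exactly the critical value, with the (positive) kinetic term and the (negative) gravitational term in perfect balance.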

On the left you can see the breakdown of the various components involved in the standard model, with the whole pie representing a flat Universe. You see there’s a very strange mixture dominated by dark energy (which we don’t understand) and dark matter (which we don’t understand). The bit we understand a little better (because we can sometimes see it directly) is only 5% of the whole thing. The proportions do look very peculiar.
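
For concreteness, the critical density itself is tiny by everyday standards, and the pie-chart percentages (roughly 68 per cent dark energy, 27 per cent dark matter and 5 per cent ordinary matter, using round Planck-era numbers rather than any precise fit) translate into absolute densities as follows:

```python
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
MPC = 3.0857e22     # megaparsec in metres
H0 = 67.7e3 / MPC   # Hubble constant in s^-1 (~67.7 km/s/Mpc)

# Critical density for a flat Universe: rho_crit = 3 H0^2 / (8 pi G).
rho_crit = 3 * H0**2 / (8 * math.pi * G)

# Rough present-day fractions (round Planck-era numbers).
fractions = {"dark energy": 0.68, "dark matter": 0.27, "ordinary matter": 0.05}

print(f"critical density ~ {rho_crit:.2e} kg/m^3")
for name, f in fractions.items():
    print(f"  {name:>15s}: {f * rho_crit:.2e} kg/m^3")
```

The critical density works out at roughly 9 × 10⁻²⁷ kg per cubic metre, equivalent to about five hydrogen atoms per cubic metre, of which ordinary matter supplies well under one.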

And then finally, there is the issue that I have blogged about (here and there) previously, which is why the Universe appears to be a bit lop-sided and asymmetrical when we’d like it to be a bit more aesthetically pleasing.

All these curiosities are naturally accounted for in my New Theory of the Universe, which asserts that the Divine Creator actually bought the entire Cosmos at IKEA.

This hypothesis immediately explains why the Universe is flat. Absolutely everything in IKEA comes in flat packs. Curvature is not allowed.

But this is not the only success of my theory. When God got home He obviously opened the flat pack, found the instructions and read the dreaded words “EASY SELF-ASSEMBLY”. Even the omnipotent would struggle to follow the bizarre set of cartoons and diagrams that accompany even the simplest IKEA furniture. The result is therefore predictable: strange pieces that don’t seem to fit together, bits left over whose purpose is not at all clear, and an overall appearance that is not at all like one would have expected.

It’s clear where the lop-sidedness comes in too. Some of the parts were probably left out, so the whole thing isn’t held together properly and is most likely completely unstable. This sort of thing happens all the time with IKEA stuff. And why is it you can never find the right size of Allen key to sort it out?

So there you have it. My new Theory of the Universe. Some details need to be worked out, but it is as good an explanation of these issues as I have heard. I claim my Nobel Prize.

If anything will ever get me another trip to Sweden, this will…

Beyond Falsifiability: Normal Science in a Multiverse

Posted in The Universe and Stuff on January 17, 2018 by telescoper

There’s a new paper on the arXiv by Sean Carroll called Beyond Falsifiability: Normal Science in a Multiverse. The abstract is:

Cosmological models that invoke a multiverse – a collection of unobservable regions of space where conditions are very different from the region around us – are controversial, on the grounds that unobservable phenomena shouldn’t play a crucial role in legitimate scientific theories. I argue that the way we evaluate multiverse models is precisely the same as the way we evaluate any other models, on the basis of abduction, Bayesian inference, and empirical success. There is no scientifically respectable way to do cosmology without taking into account different possibilities for what the universe might be like outside our horizon. Multiverse theories are utterly conventionally scientific, even if evaluating them can be difficult in practice.

I’ve added a link to `abduction’ lest you think it has something to do with aliens!

I haven’t had time to read all of it yet, but thought I’d share it here because it concerns a topic that surfaces on this blog from time to time. I’m not a fan of the multiverse because (in my opinion) most of the arguments trotted out in its favour are based on very muddled thinking. On the other hand, I’ve never taken seriously any of the numerous critiques of the multiverse idea based on the Popperian criterion of falsifiability because (again, in my opinion) falsifiability has very little to do with the way science operates.

Anyway, Sean’s papers are always interesting to read so do have a look if this topic interests you. And feel free to comment through the box below.

Crunch time for Dark Matter?

Posted in The Universe and Stuff on January 9, 2018 by telescoper

Gratuitous picture of the cluster Abell 2218, showing numerous gravitational lensing arcs

I was reading through an article by Philip Ball in the Grauniad this morning about likely breakthroughs in science for the forthcoming year. One of the topics discussed therein was dark matter. Here’s an excerpt:

It’s been agreed for decades that the universe must contain large amounts of so-called dark matter – about five times as much as all the matter visible as stars, galaxies and dust. This dark matter appears to exert a gravitational tug while not interacting significantly with ordinary matter or light in other ways. But no one has any idea what it consists of. Experiments have been trying to detect it for years, but all have drawn a blank. The situation is becoming grave enough for some researchers to start taking more seriously suggestions that what looks like dark matter is in fact a consequence of something else – such as a new force that modifies the apparent effects of gravity. This year could prove to be crunch time for dark matter: how long do we persist in believing in something when there’s no direct evidence for it?

It’s a good question, though I have to say that there’s very little direct evidence for anything in cosmology: it’s mostly circumstantial, i.e. evidence that relies on an inference to connect it to a conclusion of fact…

Anyway, I thought it would be fun to do a totally unscientific poll of the sort that scientists find fun to do, so here’s one. It’s actually quite hard to make this the topic of a simple question, because we know that there is ordinary (baryonic) matter that we can’t see, and there is known to be some non-baryonic dark matter in the form of the cosmic neutrino background. What the question below should be interpreted to mean, therefore, is `is there a dominant component of non-baryonic dark matter in the Universe in the form of some as-yet undiscovered particle?’ or something like that.
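
The neutrino-background caveat is easy to quantify. For massive neutrinos the present-day density parameter obeys the standard relation Ω_ν h² = Σm_ν / 93.14 eV, and plugging in a mass sum near the minimum allowed by oscillation experiments (about 0.06 eV, used here purely for illustration) shows that this known non-baryonic component is dynamically tiny:

```python
# Standard relation for the cosmological neutrino density:
#   Omega_nu * h^2 = sum(m_nu) / 93.14 eV
sum_m_nu = 0.06   # eV; roughly the minimum from oscillation experiments
h = 0.677         # dimensionless Hubble parameter (H0 ~ 67.7 km/s/Mpc)

omega_nu = sum_m_nu / 93.14 / h**2
print(f"Omega_nu ~ {omega_nu:.4f}")
```

So the neutrino background contributes only of order 0.1 per cent of the critical density, far below the ~27 per cent attributed to dark matter, which is why the poll question has to be phrased in terms of a dominant, as-yet undiscovered particle.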

For the record, I do think there is dark matter, but I am less convinced that it is simple cold dark matter. On the other hand, I regard its existence as a working hypothesis rather than an article of faith, and I do not lose any sleep over the possibility of that hypothesis turning out to be wrong!

 

The Expanding Universe: An Introduction

Posted in The Universe and Stuff on January 5, 2018 by telescoper

For those of you reading this blog who feel you need an up-to-date primer on the basics of modern cosmology without too much technical detail, I found a paper on the arXiv that might give you what you want. It’s over a hundred pages long, but it does not use much complicated mathematics and it has some nice illustrations. The author is Markus Pössel; the abstract reads:

An introduction to the physics and mathematics of the expanding universe, using no more than high-school level / undergraduate mathematics. Covered are the basics of scale factor expansion, the dynamics of the expanding universe, various distance concepts and the generalized redshift-luminosity relation, among other topics.

This paper focusses on the basics of the standard framework founded on general relativity, especially how cosmological distances are defined and measured, rather than on trendy modern topics like dark energy and the cosmic microwave background. I’d say any first-year physics student should be able to cope with it, but it’s not for someone who hasn’t learned calculus. On the other hand, it’s free to download so you don’t have much to lose by having a look!
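
As a taster of the sort of calculation the primer covers, here is the line-of-sight comoving distance in a flat matter-plus-Λ model, obtained by numerically integrating D_C = c ∫ dz′/H(z′). The parameter values are round illustrative numbers, not fits to any data set:

```python
import math

C = 299792.458   # speed of light, km/s
H0 = 70.0        # Hubble constant, km/s/Mpc (illustrative round number)
OMEGA_M = 0.3    # matter density parameter
OMEGA_L = 0.7    # dark-energy density parameter (flat: they sum to 1)

def hubble(z):
    """H(z) in km/s/Mpc for a flat matter + Lambda model."""
    return H0 * math.sqrt(OMEGA_M * (1 + z)**3 + OMEGA_L)

def comoving_distance(z, steps=10000):
    """D_C = c * integral_0^z dz'/H(z'), trapezoidal rule, in Mpc."""
    dz = z / steps
    total = 0.5 * (1.0 / hubble(0.0) + 1.0 / hubble(z))
    for i in range(1, steps):
        total += 1.0 / hubble(i * dz)
    return C * total * dz

d = comoving_distance(1.0)
print(f"comoving distance to z = 1: ~{d:.0f} Mpc")
```

With these parameters the comoving distance to redshift 1 comes out at a little over 3000 Mpc; the paper explains how this relates to the other distance measures (luminosity distance, angular-diameter distance) that observers actually use.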

You can download a PDF here.

A Python Toolkit for Cosmology

Posted in The Universe and Stuff on December 14, 2017 by telescoper

The programming language Python has established itself as the industry standard for researchers in physics and astronomy (as well as many other fields, including most of those covered by the Data Innovation Research Institute, which employs me part-time). It has also become the standard vehicle for teaching coding skills to undergraduates in many disciplines. In fact it looks like the first module I will be teaching at Maynooth next term is in Computational Physics, and that will be delivered using Python too. It’s been a while since I last did any significant hands-on programming, so this will provide me with a good refresher. The best way to learn something well is to have to teach it to others!

But I digress. This morning I noticed a paper by Benedikt Diemer on the arXiv with the title COLOSSUS: A python toolkit for cosmology, large-scale structure, and dark matter halos. Here is the abstract:

This paper introduces Colossus, a public, open-source python package for calculations related to cosmology, the large-scale structure of matter in the universe, and the properties of dark matter halos. The code is designed to be fast and easy to use, with a coherent, well-documented user interface. The cosmology module implements FLRW cosmologies including curvature, relativistic species, and different dark energy equations of state, and provides fast computations of the linear matter power spectrum, variance, and correlation function. The large-scale structure module is concerned with the properties of peaks in Gaussian random fields and halos in a statistical sense, including their peak height, peak curvature, halo bias, and mass function. The halo module deals with spherical overdensity radii and masses, density profiles, concentration, and the splashback radius. To facilitate the rapid exploration of these quantities, Colossus implements about 40 different fitting functions from the literature. I discuss the core routines in detail, with a particular emphasis on their accuracy. Colossus is available at bitbucket.org/bdiemer/colossus.
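
For readers who want a feel for the halo-module quantities the abstract mentions, a spherical-overdensity mass is just the mass inside the radius where the mean enclosed density equals Δ times the critical density. Here is a self-contained sketch of that bookkeeping (it does not use Colossus itself, and the 0.3 Mpc radius is an illustrative value, chosen to give a halo of very roughly Milky Way scale):

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30  # solar mass, kg
MPC = 3.0857e22   # megaparsec in metres
H0 = 67.7e3 / MPC # Hubble constant, s^-1

# Present-day critical density, rho_crit = 3 H0^2 / (8 pi G).
rho_crit = 3 * H0**2 / (8 * math.pi * G)

def so_mass(radius_mpc, delta=200.0):
    """Spherical-overdensity mass M_delta (solar masses) for a halo whose
    mean density inside radius_mpc equals delta times rho_crit."""
    r = radius_mpc * MPC
    return delta * rho_crit * (4.0 / 3.0) * math.pi * r**3 / M_SUN

m200 = so_mass(0.3)
print(f"M_200c for R_200c = 0.3 Mpc: ~{m200:.1e} solar masses")
```

Colossus wraps this sort of calculation, together with density profiles, concentrations and conversions between different overdensity definitions, behind a documented interface, which is exactly the kind of tedious-but-error-prone bookkeeping that benefits from a well-tested shared code.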

The software can be downloaded here. It looks a very useful package that includes code to calculate many of the bits and pieces used by cosmologists working on the theory of large-scale structure and galaxy evolution. It is also, I hope, an example of a trend towards greater use of open-source software, for which I congratulate the author! I think this is an important part of the campaign to create truly open science, as I blogged about here.

An important aspect of the way science works is that when a given individual or group publishes a result, it should be possible for others to reproduce it (or not, as the case may be). At present, this can’t always be done. In my own field of astrophysics/cosmology, for example, results in traditional scientific papers are often based on very complicated analyses of large data sets. This is increasingly the case in other fields too. A basic problem obviously arises when data are not made public. Fortunately in astrophysics these days researchers are pretty good at sharing their data, although this hasn’t always been the case.

However, even allowing open access to data doesn’t always solve the reproducibility problem. Often extensive numerical codes are needed to process the measurements and extract meaningful output. Without access to these pipeline codes it is impossible for a third party to check the path from input to output without writing their own version, assuming that there is sufficient information to do that in the first place. The suggestion that researchers should publish their software as well as their results is quite controversial, but I think it’s best practice for science. There isn’t a uniform policy in astrophysics and cosmology, but I sense that quite a few people out there agree with me. Cosmological numerical simulations, for example, can be performed by anyone with a sufficiently big computer using GADGET, the source codes of which are freely available. Likewise, for CMB analysis, there is the excellent CAMB code, which can be downloaded at will; this is in a long tradition of openly available numerical codes, including CMBFAST and HealPix.

I suspect some researchers might be reluctant to share the codes they have written because they feel they won’t get sufficient credit for work done using them. I don’t think this is true, as researchers are generally very appreciative of such openness, and publications describing the corresponding codes are generously cited. In any case I don’t think it’s appropriate to withhold such programs from the wider community: doing so prevents them from being either scrutinized or extended, as well as being used to further scientific research. In other words, excessively proprietorial attitudes to data analysis software are detrimental to the spirit of open science.

Anyway, my views aren’t guaranteed to be representative of the community, so I’d like to ask for a quick show of hands via a poll…

…and you are of course welcome to comment via the usual box.