Archive for Hector Socas-Navarro

The Negative Mass Bug

Posted in Astrohype, Open Access, The Universe and Stuff on February 25, 2019 by telescoper

You may have noticed that some time ago I posted about a paper by Jamie Farnes, published in Astronomy & Astrophysics and available on the arXiv here, which suggests that material with negative mass might account for dark energy and/or dark matter.

Here is the abstract of said paper:

Dark energy and dark matter constitute 95% of the observable Universe. Yet the physical nature of these two phenomena remains a mystery. Einstein suggested a long-forgotten solution: gravitationally repulsive negative masses, which drive cosmic expansion and cannot coalesce into light-emitting structures. However, contemporary cosmological results are derived upon the reasonable assumption that the Universe only contains positive masses. By reconsidering this assumption, I have constructed a toy model which suggests that both dark phenomena can be unified into a single negative mass fluid. The model is a modified ΛCDM cosmology, and indicates that continuously-created negative masses can resemble the cosmological constant and can flatten the rotation curves of galaxies. The model leads to a cyclic universe with a time-variable Hubble parameter, potentially providing compatibility with the current tension that is emerging in cosmological measurements. In the first three-dimensional N-body simulations of negative mass matter in the scientific literature, this exotic material naturally forms haloes around galaxies that extend to several galactic radii. These haloes are not cuspy. The proposed cosmological model is therefore able to predict the observed distribution of dark matter in galaxies from first principles. The model makes several testable predictions and seems to have the potential to be consistent with observational evidence from distant supernovae, the cosmic microwave background, and galaxy clusters. These findings may imply that negative masses are a real and physical aspect of our Universe, or alternatively may imply the existence of a superseding theory that in some limit can be modelled by effective negative masses. Both cases lead to the surprising conclusion that the compelling puzzle of the dark Universe may have been due to a simple sign error.

Well, there’s a new paper just out on the arXiv by Hector Socas-Navarro, with the following abstract:

A recent work by Farnes (2018) proposed an alternative cosmological model in which both dark matter and dark energy are replaced with a single fluid of negative mass. This paper presents a critical review of that model. A number of problems and discrepancies with observations are identified. For instance, the predicted shape and density of galactic dark matter halos are incorrect. Also, halos would need to be less massive than the baryonic component or they would become gravitationally unstable. Perhaps the most challenging problem in this theory is the presence of a large-scale version of the `runaway’ effect, which would result in all galaxies moving in random directions at nearly the speed of light. Other more general issues regarding negative mass in general relativity are discussed, such as the possibility of time-travel paradoxes.

Among other things, it contains this:

After initially struggling to reproduce the F18 results, a careful inspection of his source code revealed a subtle bug in the computation of the gravitational acceleration. Unfortunately, the simulations in F18 are seriously compromised by this coding error whose effect is that the gravitational force decreases with the inverse of the distance, instead of the distance squared.

Oh dear.
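To see why such a subtle slip matters so much, here is a minimal sketch (my own illustration, not the actual F18 source code) of a pairwise gravitational acceleration routine in which the exponent error described above can be toggled on. Dividing the separation vector by r² instead of r³ yields a force that falls off as 1/r rather than 1/r², exactly the kind of bug the quoted passage describes. Units with G = 1 and a small softening length are assumed for simplicity.

```python
import numpy as np

def gravitational_acceleration(pos, masses, softening=1e-3, buggy=False):
    """Direct-summation gravitational acceleration on each particle (G = 1).

    With buggy=True, the separation vector is divided by r**2 instead of
    r**3, so the force magnitude scales as 1/r rather than 1/r**2 --
    an illustration of the exponent error, not the F18 code itself.
    """
    n = len(masses)
    acc = np.zeros_like(pos)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dr = pos[j] - pos[i]  # separation vector from particle i to j
            r = np.sqrt(np.sum(dr**2) + softening**2)
            # Correct: a_i += m_j * dr / r**3   (|a| ~ 1/r**2)
            # Buggy:   a_i += m_j * dr / r**2   (|a| ~ 1/r)
            exponent = 2 if buggy else 3
            acc[i] += masses[j] * dr / r**exponent
    return acc
```

For two unit masses separated by a distance of 2, the correct version gives an acceleration of magnitude ≈ 1/4, while the buggy version gives ≈ 1/2; the discrepancy grows linearly with separation, which is why large-scale structure in such a simulation would behave very differently from the intended model.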

I don’t think I need go any further into this particular case, which would just rub salt into the wounds of Farnes (2018), but I will make a general comment. Peer review is the best form of quality stamp that we have but, as this case demonstrates, it is by no means flawless. The paper by Farnes (2018) was refereed and published, but is now shown to be wrong*. Just as authors can make mistakes, so can referees. I know I’ve screwed up as a referee in the past, so I’m not claiming to be better than anyone in saying this.

*This claim is contested: see the comment below.

I don’t think the lesson is that we should just scrap peer review, but I do think we need to be more imaginative about how it is used than just relying on one or two individuals to do it. This case shows that science eventually works, as the error was found and corrected, but that was only possible because the code used by Farnes (2018) was made available for scrutiny. This is not always what happens. I take this as a vindication of open science, and an example of why scientists should share their code and data to enable others to check the results. I’d like to see a system in which papers are not regarded as ‘final’ documents but things which can be continuously modified in response to independent scrutiny, but that would require a major upheaval in academic practice and is unlikely to happen any time soon.

In this case, in the time since publication there has been a large amount of hype about the Farnes (2018) paper, and it’s unlikely that any of the media who carried stories about the results therein will ever publish retractions. This episode does therefore illustrate the potentially damaging effect on public trust that the excessive thirst for publicity can have. So how do we balance open science against the likelihood that wrong results will be taken up by the media before the errors are found? I wish I knew!