Archive for arXiv:1803.05445

Chaos and Variance in (Simulations of) Galaxy Formation

Posted in The Universe and Stuff on September 11, 2019 by telescoper

During yesterday’s viva voce examination a paper came up that I missed when it came out last year. It’s by Keller et al. called Chaos and Variance in Galaxy Formation. The abstract reads:

The evolution of galaxies is governed by equations with chaotic solutions: gravity and compressible hydrodynamics. While this micro-scale chaos and stochasticity has been well studied, it is poorly understood how it couples to macro-scale properties examined in simulations of galaxy formation. In this paper, we show how perturbations introduced by floating-point roundoff, random number generators, and seemingly trivial differences in algorithmic behaviour can produce non-trivial differences in star formation histories, circumgalactic medium (CGM) properties, and the distribution of stellar mass. We examine the importance of stochasticity due to discreteness noise, variations in merger timings and how self-regulation moderates the effects of this stochasticity. We show that chaotic variations in stellar mass can grow until halted by feedback-driven self-regulation or gas exhaustion. We also find that galaxy mergers are critical points from which large (as much as a factor of 2) variations in quantities such as the galaxy stellar mass can grow. These variations can grow and persist for more than a Gyr before regressing towards the mean. These results show that detailed comparisons of simulations require serious consideration of the magnitude of effects compared to run-to-run chaotic variation, and may significantly complicate interpreting the impact of different physical models. Understanding the results of simulations requires us to understand that the process of simulation is not a mapping of an infinitesimal point in configuration space to another, final infinitesimal point. Instead, simulations map a point in a space of possible initial conditions points to a volume of possible final states.

I find this analysis pretty scary, actually, as it shows that numerical effects (including just running the same code on different processors) can have an enormous impact on the outputs of these simulations. Here's Figure 14, for example:

This shows the predicted stellar surface mass density in a number of simulations: the outputs vary by more than an order of magnitude!
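The basic mechanism the paper describes can be seen in a toy setting. The following sketch (my own illustration, not the authors' code) integrates a small softened three-body problem twice, with initial conditions differing by a perturbation of one part in 10¹², of the order of floating-point roundoff. Because gravitational dynamics is chaotic, the two runs diverge by a macroscopic amount:

```python
# Toy illustration of run-to-run chaotic variation (not the paper's code):
# two integrations of a softened three-body system differing only by a
# 1e-12 perturbation in one coordinate. Chaos amplifies the difference.
import numpy as np

def accelerations(pos, soft=0.05):
    """Pairwise softened gravitational accelerations, with G = m = 1."""
    acc = np.zeros_like(pos)
    n = len(pos)
    for i in range(n):
        for j in range(n):
            if i != j:
                d = pos[j] - pos[i]
                r2 = d @ d + soft**2       # Plummer softening
                acc[i] += d / r2**1.5
    return acc

def integrate(pos, vel, dt=0.01, steps=2000):
    """Leapfrog (kick-drift-kick) integration; returns final positions."""
    pos, vel = pos.copy(), vel.copy()
    acc = accelerations(pos)
    for _ in range(steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos)
        vel += 0.5 * dt * acc
    return pos

# Three bodies on a rough triangle, initially at rest: they collapse,
# undergo close encounters, and scatter chaotically.
p0 = np.array([[1.0, 0.0], [-0.5, 0.9], [-0.5, -0.9]])
v0 = np.zeros_like(p0)

p_a = integrate(p0, v0)
p_b = integrate(p0 + np.array([[1e-12, 0.0], [0.0, 0.0], [0.0, 0.0]]), v0)

# The final configurations differ by vastly more than the initial 1e-12.
print(np.abs(p_a - p_b).max())
```

This is of course far simpler than a full hydrodynamical galaxy-formation run, but the point is the same: perturbations at the level of roundoff error do not stay at that level, so "identical" simulations on different hardware need not produce identical galaxies.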

This paper underlines an important question, one I have worried about before and which I could paraphrase as "Do we trust N-body simulations too much?". The use of numerical codes in cosmology is widespread and there's no question that they have driven the subject forward in many ways, not least because they can generate "mock" galaxy catalogues to help plan survey strategies. However, I've always been concerned that there is a tendency to trust these calculations too much. On the one hand there's the question of small-scale resolution, and on the other there's the finite size of the computational volume; there are other complications in between too. In other words, simulations are approximate. To some extent, our ability to extract information from surveys will therefore be limited by the inaccuracy of our calculations of the theoretical predictions.

Anyway, the paper gives us quite a few things to think about and I think it might provoke a bit of discussion, which is why I mentioned it here – i.e. to encourage folk to read it and give their opinions.

The use of the word "simulation" always makes me smile. Being a crossword nut, I spend far too much time looking in dictionaries, and one often finds quite amusing things there. This is how the Oxford English Dictionary defines SIMULATION:

1. a. The action or practice of simulating, with intent to deceive; false pretence, deceitful profession.

b. Tendency to assume a form resembling that of something else; unconscious imitation.

2. A false assumption or display, a surface resemblance or imitation, of something.

3. The technique of imitating the behaviour of some situation or process (whether economic, military, mechanical, etc.) by means of a suitably analogous situation or apparatus, esp. for the purpose of study or personnel training.

So it’s only the third entry that gives the intended meaning. This is worth bearing in mind if you prefer old-fashioned analytical theory!

In football, of course, you can even get sent off for simulation…