## The Fractal Universe, Part 2

Posted in History, The Universe and Stuff on June 27, 2014 by telescoper

Given the recent discussion in comments on this blog, I thought I’d give a brief update on the issue of the scale of cosmic homogeneity; I’m going to repeat some of the things I said in a post earlier this week just to make sure that this discussion is reasonably self-contained.

Our standard cosmological model is based on the Cosmological Principle, which asserts that the Universe is, in a broad-brush sense, homogeneous (is the same in every place) and isotropic (looks the same in all directions). But the question that has troubled cosmologists for many years is: what is meant by “large scales”? How broad does the broad brush have to be? A couple of presentations discussed the possibly worrying evidence for the presence of a local void, a large underdensity on a scale of about 200 Mpc which may influence our interpretation of cosmological results.

I blogged some time ago about the idea that the Universe might have structure on all scales, as would be the case if it were described in terms of a fractal set characterized by a fractal dimension $D$. In a fractal set, the mean number of neighbours of a given galaxy within a spherical volume of radius $R$ is proportional to $R^D$. If galaxies are distributed uniformly (homogeneously) then $D = 3$, as the number of neighbours simply depends on the volume of the sphere, i.e. as $R^3$, and on the average number-density of galaxies. A value of $D < 3$ indicates that the galaxies do not fill space in a homogeneous fashion: $D = 1$, for example, would indicate that galaxies were distributed in roughly linear structures (filaments); the mass of material distributed along a filament enclosed within a sphere grows linearly with the radius of the sphere, i.e. as $R^1$, not as its volume; galaxies distributed in sheets would have $D=2$, and so on.
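The scaling $N(<R) \propto R^D$ can be checked directly on synthetic point sets. Here is a minimal sketch in Python (the point counts, radii and seed are illustrative choices of mine, not taken from any survey) that counts the mean number of neighbours within spheres of various radii and reads off $D$ from the log-log slope:

```python
import numpy as np

rng = np.random.default_rng(1)

def correlation_dimension(points, radii):
    """Estimate D from the log-log slope of the mean neighbour count N(<R)."""
    rmax = radii.max()
    # use only centres far from the box edge, so counting spheres are untruncated
    interior = np.all((points >= rmax) & (points <= 1 - rmax), axis=1)
    centres = points[interior]
    d2 = ((centres[:, None, :] - points[None, :, :]) ** 2).sum(axis=-1)
    # mean neighbour count at each radius; subtract 1 to exclude the centre itself
    mean_counts = [(d2 < R * R).sum(axis=1).mean() - 1 for R in radii]
    return np.polyfit(np.log(radii), np.log(mean_counts), 1)[0]

radii = np.array([0.05, 0.1, 0.2])
filament = np.full((500, 3), 0.5)
filament[:, 0] = rng.random(500)        # points strung along a line: expect D ~ 1
uniform = rng.random((2000, 3))         # points uniform in a unit cube: expect D ~ 3
print(correlation_dimension(filament, radii))   # close to 1
print(correlation_dimension(uniform, radii))    # close to 3
```

The same estimator applied to a sheet-like distribution would return a slope near 2, in line with the cases listed above.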

We know that $D \simeq 1.2$ on small scales (in cosmological terms, still several Megaparsecs), but the evidence for a turnover to $D=3$ has not been so strong, at least not until recently. It’s not just that measuring $D$ from a survey is actually rather tricky; it’s also that when we cosmologists adopt the Cosmological Principle we apply it not to the distribution of galaxies in space, but to space itself. We assume that space is homogeneous so that its geometry can be described by the Friedmann-Lemaître-Robertson-Walker metric.

According to Einstein’s theory of general relativity, clumps in the matter distribution would cause distortions in the metric which are roughly related to fluctuations in the Newtonian gravitational potential $\delta\Phi$ by $\delta\Phi/c^2 \sim \left(\lambda/ct \right)^{2} \left(\delta \rho/\rho\right)$, give or take a factor of a few, so that a large fluctuation in the density of matter wouldn’t necessarily cause a large fluctuation of the metric unless it were on a scale $\lambda$ reasonably large relative to the cosmological horizon $\sim ct$. Galaxies correspond to a large $\delta \rho/\rho \sim 10^6$ but don’t violate the Cosmological Principle because they are too small in scale $\lambda$ to perturb the background metric significantly.
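To put rough numbers into that estimate (an order-of-magnitude sketch; the horizon and galaxy scales below are round figures I’ve assumed, not precise values):

```python
# Order-of-magnitude check of delta(Phi)/c^2 ~ (lambda/ct)^2 * (delta rho/rho)
ct = 4000.0              # rough cosmological horizon scale in Mpc (assumed)
lam = 0.03               # galaxy scale in Mpc, i.e. ~30 kpc (assumed)
density_contrast = 1e6   # delta rho / rho typical of a galaxy
metric_perturbation = (lam / ct) ** 2 * density_contrast
print(metric_perturbation)   # ~1e-5: tiny, so the background metric is barely disturbed
```

Despite the huge density contrast, the smallness of $(\lambda/ct)^2$ wins, which is the quantitative sense in which galaxies do not violate the Cosmological Principle.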

In my previous post I left the story as it stood about 15 years ago, and there have been numerous developments since then, some convincing (to me) and some not. Here I’ll just give a couple of key results, which I think to be important because they address a specific quantifiable question rather than relying on qualitative and subjective interpretations.

The first, from a paper I wrote with my (then) PhD student Jun Pan, provided what I think is the first convincing demonstration that the correlation dimension of galaxies in the IRAS PSCz survey does turn over to the homogeneous value $D=3$ on large scales:

You can see quite clearly that there is a gradual transition to homogeneity beyond about 10 Mpc, and this transition is certainly complete before 100 Mpc. The PSCz survey comprises “only” about 11,000 galaxies, and it is relatively shallow too (with a depth of about 150 Mpc), but it has an enormous advantage in that it covers virtually the whole sky, which means that the survey geometry does not have a significant effect on the results. Just as importantly, our method does not assume homogeneity at the outset. In a traditional correlation function analysis the number of pairs of galaxies with a given separation is compared with a random distribution with the same mean number of galaxies per unit volume. The mean density, however, has to be estimated from the same survey as the correlation function is being calculated from, and if there is large-scale clustering beyond the size of the survey this estimate will not be a fair estimate of the global value. Such analyses therefore assume what they set out to prove. Ours does not beg the question in this way.
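The point about normalisation can be made with a toy calculation (made-up numbers, not survey data): the log-log slope of $N(<r)$, which gives the correlation dimension, is completely insensitive to the assumed mean density, whereas the estimated correlation function is biased whenever that density is wrong.

```python
import numpy as np

r = np.logspace(0.5, 2.0, 30)          # separations in Mpc (illustrative)
N_obs = 4.2 * r**3                     # toy counts in a homogeneous universe
for f in (0.5, 1.0, 2.0):              # assumed mean density = f * true value
    N_random = f * 4.2 * r**3          # expected counts for the random catalogue
    xi = N_obs / N_random - 1          # correlation estimate: biased unless f = 1
    D = np.polyfit(np.log(r), np.log(N_obs), 1)[0]  # slope: 3 regardless of f
    print(f, round(D, 3), round(xi[0], 3))
```

With the mean density wrong by a factor of two either way, the inferred $\xi(r)$ is shifted by an additive constant, while the dimension estimate stays pinned at $D=3$.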

The PSCz survey is relatively sparse, but more recently much bigger surveys involving optically selected galaxies have confirmed this idea with great precision. A particularly important recent result came from the WiggleZ survey (in a paper by Scrimgeour et al. 2012). This survey is big enough to look at the correlation dimension not just locally (as we did with PSCz) but as a function of redshift, so we can see how it evolves. In fact the survey contains about 200,000 galaxies in a volume of about a cubic Gigaparsec. Here are the crucial graphs:

I think this proves beyond any reasonable doubt that there is a transition to homogeneity at about 80 Mpc, well within the survey volume. My conclusion from this and other studies is that the structure is roughly self-similar on small scales, but this scaling gradually dissolves into homogeneity. In a Fractal Universe the correlation dimension would not depend on scale, so what I’m saying is that we do not live in a fractal Universe. End of story.

## The Importance of Being Homogeneous

Posted in The Universe and Stuff on August 29, 2012 by telescoper

A recent article in New Scientist reminded me that I never completed the story I started with a couple of earlier posts (here and there), so while I wait for the rain to stop I thought I’d make myself useful by posting something now. It’s all about a paper available on the arXiv by Scrimgeour et al. concerning the transition to homogeneity of galaxy clustering in the WiggleZ galaxy survey, the abstract of which reads:

We have made the largest-volume measurement to date of the transition to large-scale homogeneity in the distribution of galaxies. We use the WiggleZ survey, a spectroscopic survey of over 200,000 blue galaxies in a cosmic volume of ~1 (Gpc/h)^3. A new method of defining the ‘homogeneity scale’ is presented, which is more robust than methods previously used in the literature, and which can be easily compared between different surveys. Due to the large cosmic depth of WiggleZ (up to z=1) we are able to make the first measurement of the transition to homogeneity over a range of cosmic epochs. The mean number of galaxies N(<r) in spheres of comoving radius r is proportional to r^3 within 1%, or equivalently the fractal dimension of the sample is within 1% of D_2=3, at radii larger than 71 \pm 8 Mpc/h at z~0.2, 70 \pm 5 Mpc/h at z~0.4, 81 \pm 5 Mpc/h at z~0.6, and 75 \pm 4 Mpc/h at z~0.8. We demonstrate the robustness of our results against selection function effects, using a LCDM N-body simulation and a suite of inhomogeneous fractal distributions. The results are in excellent agreement with both the LCDM N-body simulation and an analytical LCDM prediction. We can exclude a fractal distribution with fractal dimension below D_2=2.97 on scales from ~80 Mpc/h up to the largest scales probed by our measurement, ~300 Mpc/h, at 99.99% confidence.

To paraphrase, the conclusion of this study is that while galaxies are strongly clustered on small scales – in a complex ‘cosmic web’ of clumps, knots, sheets and filaments – on sufficiently large scales, the Universe appears to be smooth. This is much like a bowl of porridge which contains many lumps, but (usually) none as large as the bowl it’s put in.
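The “homogeneity scale” in the abstract can be illustrated with a toy model (an assumed functional form with parameters I’ve picked for illustration, not the WiggleZ data): take counts with a small-scale clustering excess on top of a homogeneous $r^3$ background, and find the smallest radius at which the local slope $D_2(r) = \mathrm{d}\ln N(<r)/\mathrm{d}\ln r$ comes within 1% of 3.

```python
import numpy as np

# Toy counts: homogeneous r^3 background plus a clustering excess (r0/r)^gamma
r = np.logspace(0.0, 2.5, 400)            # radii in Mpc/h
r0, gamma = 6.0, 1.8                      # illustrative clustering parameters
N = r**3 * (1 + (r0 / r)**gamma)

# local slope D2(r); it rises from D2 < 3 on small scales towards 3
D2 = np.gradient(np.log(N), np.log(r))

# homogeneity scale: first radius where D2 is within 1% of 3
r_H = r[np.argmax(D2 >= 0.99 * 3)]
print(round(r_H, 1))                      # a few tens of Mpc/h for these numbers
```

The WiggleZ analysis applies essentially this criterion to the measured counts-in-spheres, which is what yields the quoted $\sim 70$–$80\,\mathrm{Mpc}/h$ scales.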

Our standard cosmological model is based on the Cosmological Principle, which asserts that the Universe is, in a broad-brush sense, homogeneous (is the same in every place) and isotropic (looks the same in all directions). But the question that has troubled cosmologists for many years is: what is meant by “large scales”? How broad does the broad brush have to be?

I blogged some time ago about the idea that the Universe might have structure on all scales, as would be the case if it were described in terms of a fractal set characterized by a fractal dimension $D$. In a fractal set, the mean number of neighbours of a given galaxy within a spherical volume of radius $R$ is proportional to $R^D$. If galaxies are distributed uniformly (homogeneously) then $D = 3$, as the number of neighbours simply depends on the volume of the sphere, i.e. as $R^3$, and on the average number-density of galaxies. A value of $D < 3$ indicates that the galaxies do not fill space in a homogeneous fashion: $D = 1$, for example, would indicate that galaxies were distributed in roughly linear structures (filaments); the mass of material distributed along a filament enclosed within a sphere grows linearly with the radius of the sphere, i.e. as $R^1$, not as its volume; galaxies distributed in sheets would have $D=2$, and so on.

We know that $D \simeq 1.2$ on small scales (in cosmological terms, still several Megaparsecs), but the evidence for a turnover to $D=3$ has not been so strong, at least not until recently. It’s not just that measuring $D$ from a survey is actually rather tricky; it’s also that when we cosmologists adopt the Cosmological Principle we apply it not to the distribution of galaxies in space, but to space itself. We assume that space is homogeneous so that its geometry can be described by the Friedmann-Lemaître-Robertson-Walker metric.

According to Einstein’s theory of general relativity, clumps in the matter distribution would cause distortions in the metric which are roughly related to fluctuations in the Newtonian gravitational potential $\delta\Phi$ by $\delta\Phi/c^2 \sim \left(\lambda/ct \right)^{2} \left(\delta \rho/\rho\right)$, give or take a factor of a few, so that a large fluctuation in the density of matter wouldn’t necessarily cause a large fluctuation of the metric unless it were on a scale $\lambda$ reasonably large relative to the cosmological horizon $\sim ct$. Galaxies correspond to a large $\delta \rho/\rho \sim 10^6$ but don’t violate the Cosmological Principle because they are too small in scale $\lambda$ to perturb the background metric significantly.

The discussion of a fractal universe is one I’m overdue to return to. In my previous post I left the story as it stood about 15 years ago, and there have been numerous developments since then, not all of them consistent with each other. I will do a full “Part 2” to that post eventually, but in the meantime I’ll just comment that this particular result does seem to be consistent with a Universe that possesses the property of large-scale homogeneity. If that conclusion survives the next generation of even larger galaxy redshift surveys then it will come as an immense relief to cosmologists.

The reason for that is that the equations of general relativity are very hard to solve in cases where there isn’t a lot of symmetry; there are just too many equations for a general solution to be obtained. If the Cosmological Principle applies, however, the equations simplify enormously (both in number and form) and we can get results we can work with on the back of an envelope. Small fluctuations about the smooth background solution can be handled (approximately but robustly) using a technique called perturbation theory. If the fluctuations are large, however, these methods don’t work. What we need to do instead is construct exact inhomogeneous models, and that is very hard indeed. It’s of course a different question as to why the Universe is so smooth on large scales, but as a working cosmologist the real importance of it being that way is that it makes our job so much easier than it would otherwise be.
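As a sketch of that back-of-the-envelope payoff: once homogeneity holds, the whole expansion history collapses to the single Friedmann relation $H(a) = H_0\sqrt{\Omega_m/a^3 + \Omega_\Lambda}$ (for a flat universe), and quantities like the age of the Universe become one-line integrals. The parameter values below are fiducial round numbers I’ve assumed, not a fit:

```python
import numpy as np

# Age of a flat LCDM universe: t0 = integral_0^1 da / (a * H(a))
H0 = 70.0                     # Hubble constant in km/s/Mpc (assumed fiducial value)
Om, OL = 0.3, 0.7             # matter and dark-energy density parameters (assumed)
a = np.linspace(1e-6, 1.0, 200001)
integrand = 1.0 / (a * H0 * np.sqrt(Om / a**3 + OL))
t0 = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(a))  # trapezoid rule
t0_gyr = t0 * 977.8           # 1 / (km/s/Mpc) is about 977.8 Gyr
print(round(t0_gyr, 2))       # about 13.5 Gyr
```

That a serviceable age estimate takes a dozen lines is exactly the kind of simplification an inhomogeneous model would destroy.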

P.S. And I might add that the importance of the Scrimgeour et al paper to me personally is greatly amplified by the fact that it cites a number of my own articles on this theme!

## Dark Energy is Real. Really?

Posted in Astrohype, The Universe and Stuff on May 20, 2011 by telescoper

I don’t have much time to post today after spending all morning in a meeting about Assuring a Quality Experience in the Graduate College and in between reading project reports this afternoon.

However, I couldn’t resist a quickie just to draw your attention to a cosmology story that’s made it into the mass media, e.g. BBC Science. This concerns the recent publication of a couple of papers from the WiggleZ Dark Energy Survey, which has used the Anglo-Australian Telescope. You can read a nice description of what WiggleZ (pronounced “Wiggle-Zee”) is all about here, but in essence it involves making two different sorts of measurements of how galaxies cluster in order to constrain the Universe’s geometry and dynamics. The first method is the “wiggle” bit, in that it depends on the imprint of baryon acoustic oscillations in the power-spectrum of galaxy clustering. The other involves analysing the peculiar motions of the galaxies by measuring the distortion of the clustering pattern seen in redshift space; redshifts are usually denoted z in cosmology, so that accounts for the “zee”.
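For reference, the linear-theory version of that redshift-space effect is the standard Kaiser (1987) formula (quoted here as background; it is not lifted from the WiggleZ papers themselves):

```latex
% Kaiser (1987): coherent infall boosts power along the line of sight
P_s(k,\mu) = \left(1 + \beta\mu^2\right)^2 b^2\, P_m(k),
\qquad \beta \equiv \frac{f}{b}, \qquad f \simeq \Omega_m(z)^{0.55},
```

where $\mu$ is the cosine of the angle between the wavevector and the line of sight, $b$ is the galaxy bias, and $P_m(k)$ is the matter power spectrum. Measuring the anisotropy in $\mu$ is what lets surveys like WiggleZ extract the growth rate $f$ from the clustering pattern.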

The paper describing the results from the former method can be found here, while the second technique is described there.

This survey has been a major effort by an extensive team of astronomers: it has involved spectroscopic measurements of almost a quarter of a million galaxies, spread over 1000 square degrees on the sky, and has taken almost five years to complete. The results are consistent with the standard ΛCDM cosmological model, and in particular with the existence of the dark energy that this model implies, but which we don’t have a theoretical explanation for.

This is all excellent stuff and it obviously lends further observational support to the standard model. However, I’m not sure I agree with the headline of the press release put out by the WiggleZ team: Dark Energy is Real. I certainly agree that dark energy is a plausible explanation for a host of relevant observations, but do we really know for sure that it is “real”? Can we really be sure that there is no other explanation? WiggleZ has certainly produced evidence that’s sufficient to rule out some alternative models, but that’s not the same as proof. I worry when scientists speak like this, with what sounds like certainty, about things that are far from proven. Just because nobody has thought of an alternative explanation doesn’t mean that none exists.

The problem is that a press release entitled “dark energy is real” is much more likely to be picked up by a newspaper, radio or TV editor than one that says “dark energy remains best explanation”…