Archive for Fractals

The Fractal Universe, Part 2

Posted in History, The Universe and Stuff on June 27, 2014 by telescoper

Given the recent discussion in comments on this blog I thought I’d give a brief update on the issue of the scale of cosmic homogeneity; I’m going to repeat some of the things I said in a post earlier this week just to make sure that this discussion is reasonably self-contained.

Our standard cosmological model is based on the Cosmological Principle, which asserts that the Universe is, in a broad-brush sense, homogeneous (is the same in every place) and isotropic (looks the same in all directions). But the question that has troubled cosmologists for many years is what is meant by “large scales”? How broad does the broad brush have to be? A couple of recent presentations have discussed possibly worrying evidence for the presence of a local void, a large underdensity on a scale of about 200 Mpc, which may influence our interpretation of cosmological results.

I blogged some time ago about the idea that the Universe might have structure on all scales, as would be the case if it were described in terms of a fractal set characterized by a fractal dimension D. In a fractal set, the mean number of neighbours of a given galaxy within a spherical volume of radius R is proportional to R^D. If galaxies are distributed uniformly (homogeneously) then D = 3, as the number of neighbours simply depends on the volume of the sphere, i.e. as R^3, and the average number-density of galaxies. A value of D < 3 indicates that the galaxies do not fill space in a homogeneous fashion: D = 1, for example, would indicate that galaxies were distributed in roughly linear structures (filaments); the mass of material distributed along a filament enclosed within a sphere grows linearly with the radius of the sphere, i.e. as R^1, not as its volume; galaxies distributed in sheets would have D = 2, and so on.
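To make the definition concrete, here is a minimal sketch in Python (my own toy code, not taken from any of the papers discussed here) that estimates a correlation dimension by generating synthetic point sets and fitting the slope of log N(<R) against log R; the point counts and radii are arbitrary illustrative choices.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(42)

# Homogeneous case: 20,000 points filling a unit cube uniformly (expect D ~ 3).
uniform = rng.uniform(0.0, 1.0, size=(20000, 3))

# Filamentary case: points scattered tightly around a line (expect D ~ 1).
t = rng.uniform(0.0, 1.0, size=20000)
filament = np.column_stack([t, t, t]) + rng.normal(0.0, 0.001, size=(20000, 3))

def correlation_dimension(points, radii):
    """Slope of log <N(<R)> versus log R, with counts averaged over all points."""
    tree = cKDTree(points)
    # Mean neighbour count within each radius, excluding the point itself.
    counts = [np.mean(tree.query_ball_point(points, r, return_length=True)) - 1
              for r in radii]
    slope, _ = np.polyfit(np.log(radii), np.log(counts), 1)
    return slope

radii = np.logspace(-1.5, -1.0, 8)  # scales well inside the box, limiting edge effects
print(f"uniform cube: D ~ {correlation_dimension(uniform, radii):.2f}")  # close to 3
print(f"filament:     D ~ {correlation_dimension(filament, radii):.2f}")  # close to 1
```

The fitted slopes come out close to 3 for the uniform cube and close to 1 for the filament, as the scaling argument above suggests; boundary effects nudge the estimates slightly away from the ideal values.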

We know that D \simeq 1.2 on small scales (in cosmological terms, still several Megaparsecs), but the evidence for a turnover to D = 3 has not been so strong, at least not until recently. It’s not just that measuring D from a survey is actually rather tricky, but also that when we cosmologists adopt the Cosmological Principle we apply it not to the distribution of galaxies in space, but to space itself. We assume that space is homogeneous so that its geometry can be described by the Friedmann-Lemaître-Robertson-Walker metric.

According to Einstein’s theory of general relativity, clumps in the matter distribution would cause distortions in the metric which are roughly related to fluctuations in the Newtonian gravitational potential \delta\Phi by \delta\Phi/c^2 \sim \left(\lambda/ct \right)^{2} \left(\delta \rho/\rho\right), give or take a factor of a few, so that a large fluctuation in the density of matter wouldn’t necessarily cause a large fluctuation of the metric unless it were on a scale \lambda reasonably large relative to the cosmological horizon \sim ct. Galaxies correspond to a large \delta \rho/\rho \sim 10^6 but don’t violate the Cosmological Principle because they are too small in scale \lambda to perturb the background metric significantly.
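To see why, it helps to put rough numbers into the relation; the snippet below (my own back-of-envelope arithmetic, using round illustrative values for the horizon scale and for a galaxy’s size) just evaluates the formula above.

```python
# Back-of-envelope evaluation of delta_Phi/c^2 ~ (lambda/ct)^2 (delta_rho/rho).
# All numbers are illustrative round figures, not values from the post.

ct_mpc = 4.2e3           # cosmological horizon ~ c * (age of Universe), in Mpc
galaxy_scale_mpc = 0.03  # a galaxy: a few tens of kpc
galaxy_overdensity = 1e6 # delta_rho/rho for a galaxy, as quoted above

galaxy_metric = (galaxy_scale_mpc / ct_mpc) ** 2 * galaxy_overdensity
print(f"galaxy:       delta Phi / c^2 ~ {galaxy_metric:.0e}")  # ~5e-05

# Even a 200 Mpc void with |delta_rho/rho| ~ 1 perturbs the metric only mildly:
void_metric = (200.0 / ct_mpc) ** 2 * 1.0
print(f"200 Mpc void: delta Phi / c^2 ~ {void_metric:.0e}")    # ~2e-03
```

Both results are small compared with unity, which is why neither a galaxy nor even a 200 Mpc underdensity does serious violence to the background metric.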

In my previous post I left the story as it stood about 15 years ago, and there have been numerous developments since then, some convincing (to me) and some not. Here I’ll just give a couple of key results, which I think are important because they address a specific, quantifiable question rather than relying on qualitative and subjective interpretations.

The first, from a paper I wrote with my (then) PhD student Jun Pan, provided what I think is the first convincing demonstration that the correlation dimension of galaxies in the IRAS PSCz survey does turn over to the homogeneous value D = 3 on large scales:

[Figure: the correlation dimension of galaxies in the PSCz survey as a function of scale, showing the turnover to D = 3]

You can see quite clearly that there is a gradual transition to homogeneity beyond about 10 Mpc, and this transition is certainly complete before 100 Mpc. The PSCz survey comprises “only” about 11,000 galaxies and is relatively shallow too (with a depth of about 150 Mpc), but it has an enormous advantage in that it covers virtually the whole sky, which means that the survey geometry does not have a significant effect on the results. Another important feature of our analysis is that it does not assume homogeneity at the start. In a traditional correlation function analysis the number of pairs of galaxies with a given separation is compared with a random distribution with the same mean number of galaxies per unit volume. The mean density, however, has to be estimated from the same survey as the correlation function is being calculated from, and if there is large-scale clustering beyond the size of the survey this estimate will not be a fair estimate of the global value. Such analyses therefore assume what they set out to prove. Ours does not beg the question in this way.
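Schematically, the contrast between the two approaches might look like the following sketch (my own illustrative code, not the actual PSCz analysis), acting on an (N, 3) array of galaxy positions:

```python
import numpy as np
from scipy.spatial import cKDTree

def conditional_counts(points, radii):
    """Mean neighbour count <N(<R)> per galaxy. No global mean density is
    needed, so homogeneity is not assumed at the outset."""
    tree = cKDTree(points)
    return np.array([
        np.mean(tree.query_ball_point(points, r, return_length=True)) - 1
        for r in radii
    ])

def traditional_expected_counts(points, radii, survey_volume):
    """What a correlation-function analysis compares against: a Poisson process
    whose mean density is estimated from the survey itself. If clustering
    extends beyond the survey, this estimate is biased, which is the
    circularity described above."""
    nbar = len(points) / survey_volume  # the questionable step
    return nbar * (4.0 / 3.0) * np.pi * np.asarray(radii) ** 3
```

The ratio of the measured counts to the Poisson expectation is essentially one plus the volume-averaged correlation function, so it inherits any bias in the estimated mean density; the correlation-dimension approach instead reads D from the slope of the conditional counts alone, and the global mean density never enters.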

The PSCz survey is relatively sparse, but more recently much bigger surveys involving optically selected galaxies have confirmed this idea with great precision. A particularly important recent result came from the WiggleZ survey (in a paper by Scrimgeour et al. 2012). This survey is big enough to look at the correlation dimension not just locally (as we did with PSCz) but as a function of redshift, so we can see how it evolves. In fact the survey contains about 200,000 galaxies in a volume of about a cubic Gigaparsec. Here are the crucial graphs:

[Figure: measurements of the transition to homogeneity from the WiggleZ survey (Scrimgeour et al. 2012), showing the correlation dimension approaching D = 3]

I think this proves beyond any reasonable doubt that there is a transition to homogeneity at about 80 Mpc, well within the survey volume. My conclusion from this and other studies is that the structure is roughly self-similar on small scales, but this scaling gradually dissolves into homogeneity. In a Fractal Universe the correlation dimension would not depend on scale, so what I’m saying is that we do not live in a fractal Universe. End of story.

The Father of Fractals

Posted in The Universe and Stuff on October 16, 2010 by telescoper

Just a brief post to pass on the sad news of the death at the age of 86 of Benoît Mandelbrot. Mandelbrot is credited with having invented the term fractal to describe objects that possess the property of self-similarity and which have structure on arbitrarily small scales. In his marvellous book, The Fractal Geometry of Nature, Mandelbrot explored the use of fractals to describe natural objects and phenomena as diverse as clouds, mountain ranges, lightning bolts, coastlines, snowflakes, plants, and animal coloration patterns. His ideas found application across the whole spectrum of physics and astrophysics including, controversially, cosmology. Fractal images, such as the famous renderings of the Mandelbrot set, also found their way into popular culture; I had a poster of one on my bedroom wall when I was a student and kept it for many years thereafter.

I came across Mandelbrot’s book in the public library and found it truly inspirational, so much so that he became a scientific hero of mine. I was therefore thrilled at the prospect of meeting him when I myself had become a scientist and had the chance to go to a conference, in Paris, at which he was speaking. Unfortunately, I was deeply disappointed by his lecture, which was truly awful, and his personal manner, which I found less than congenial. Nevertheless, there’s no denying his immense contributions to mathematics and science nor his wider impact on culture and society. Another one of the greats has left us.


The Fractal Universe, Part 1

Posted in The Universe and Stuff on August 4, 2010 by telescoper

A long time ago I blogged about the Cosmic Web and one of the comments there suggested I write something about the idea that the large-scale structure of the Universe might be some sort of fractal. There’s a small (but vocal) group of cosmologists who favour fractal cosmological models over the more orthodox cosmology favoured by the majority, so it’s definitely something worth writing about. I have been meaning to post something about it for some time now, but it’s too big and technical a matter to cover in one item. I’ve therefore decided to start by posting a slightly edited version of a short News and Views piece I wrote about the question in 1998. It’s very out of date on the observational side, but I thought it would be good to set the scene for later developments (mentioned in the last paragraph), which I hope to cover in future posts.

—0—

One of the central tenets of cosmological orthodoxy is the Cosmological Principle, which states that, in a broad-brush sense, the Universe is the same in every place and in every direction. This assumption has enabled cosmologists to obtain relatively simple solutions of Einstein’s General Theory of Relativity that describe the dynamical behaviour of the Universe as a whole. These solutions, called the Friedmann models [1], form the basis of the Big Bang theory. But is the Cosmological Principle true? Not according to Francesco Sylos-Labini et al. [2], who argue, controversially, that the Universe is not uniform at all, but has a never-ending hierarchical structure in which galaxies group together in clusters which, in turn, group together in superclusters, and so on.

These claims are completely at odds with the Cosmological Principle and therefore with the Friedmann models and the entire Big Bang theory. The central thrust of the work of Sylos-Labini et al. is that the statistical methods used by cosmologists to analyse galaxy clustering data are inappropriate because they assume the property of large-scale homogeneity at the outset. If one does not wish to assume this then one must use different methods.

What they do is to assume that the Universe is better described in terms of a fractal set characterized by a fractal dimension D. In a fractal set, the mean number of neighbours of a given galaxy within a spherical volume of radius R is proportional to R^D. If galaxies are distributed uniformly then D = 3, as the number of neighbours simply depends on the volume of the sphere, i.e. as R^3, and the average number-density of galaxies. A value of D < 3 indicates that the galaxies do not fill space in a homogeneous fashion: D = 1, for example, would indicate that galaxies were distributed in roughly linear structures (filaments); the mass of material distributed along a filament enclosed within a sphere grows linearly with the radius of the sphere, i.e. as R^1, not as its volume. Sylos-Labini et al. argue that D = 2, which suggests a roughly planar (sheet-like) distribution of galaxies.

Most cosmologists would accept that the distribution of galaxies on relatively small scales, up to perhaps a few tens of megaparsecs (Mpc), can indeed be described in terms of a fractal model. This small-scale clustering is expected to be dominated by purely gravitational physics, and gravity has no particular length scale associated with it. But standard theory requires that the fractal dimension should approach the homogeneous value D = 3 on large enough scales. According to standard models of cosmological structure formation, this transition should occur on scales of a few hundred Mpc.

The main source of the controversy is that most available three-dimensional maps of galaxy positions are not large enough to encompass the expected transition to homogeneity. Such maps must be constructed from redshift surveys, which require spectroscopic studies of large numbers of galaxies, with distances inferred from the measured redshifts.

Sylos-Labini et al. have analysed a number of redshift surveys, including the largest so far available, the Las Campanas Redshift Survey [3]; see below. They find D = 2 for all the data they look at, and argue that there is no transition to homogeneity for scales up to 4,000 Mpc, way beyond the expected turnover. If this were true, it would indeed be bad news for the orthodox among us.

The survey maps the Universe out to recession velocities of 60,000 km s⁻¹, corresponding to distances of a few hundred million parsecs. Although no fractal structure on the largest scales is apparent (there are no clear voids or concentrations on the same scale as the whole map), one statistical analysis [2] finds a fractal dimension of two in this and other surveys, for all scales – conflicting with a basic principle of cosmology.

Their results are, however, at variance with the visual appearance of the Las Campanas survey, for example, which certainly seems to display large-scale homogeneity. Objections to these claims have been lodged by Luigi Guzzo [4], for instance, who has criticized their handling of the data and has presented independent results that appear to be consistent with a transition to homogeneity. It is also true that Sylos-Labini et al. have done their cause no good by basing some conclusions on a heterogeneous compilation of redshifts called the LEDA database [5], which is not a controlled sample and so is completely unsuitable for this kind of study. Finally, it seems clear that they have substantially overestimated the effective depth of the catalogues they are using. But although their claims remain controversial, the consistency of the results obtained by Sylos-Labini et al. is impressive enough to raise doubts about the standard picture.

Mainstream cosmologists are not yet so worried as to abandon the Cosmological Principle. Most are probably quite happy to admit that there is no overwhelming direct evidence in favour of global uniformity from current three-dimensional galaxy catalogues, which are in any case relatively shallow. But this does not mean there is no evidence at all: the near-isotropy of the sky temperature of the cosmic microwave background, the uniformity of the cosmic X-ray background, and the properties of source counts are all difficult to explain unless the Universe is homogeneous on large scales [6]. Moreover, Hubble’s law itself is a consequence of large-scale homogeneity: if the Universe were inhomogeneous one would not expect to see a uniform expansion, but an irregular pattern of velocities resulting from large-scale density fluctuations.

But above all, it is the principle of Occam’s razor that guides us: in the absence of clear evidence against it, the simplest model compatible with the data is to be preferred. Several observational projects are already under way, including the Sloan Digital Sky Survey and the Anglo-Australian 2dF Galaxy Redshift Survey, that should chart the spatial distribution of galaxies in enough detail to provide an unambiguous answer to the question of large-scale cosmic uniformity. In the meantime, the Cosmological Principle remains an essential part of the Big Bang theory.

References

  1. Friedmann, A. Z. Phys. 10, 377–386 (1922).
  2. Sylos-Labini, F., Montuori, M. & Pietronero, L. Phys. Rep. 293, 61–226 (1998).
  3. Shectman, S. et al. Astrophys. J. 470, 172–188 (1996).
  4. Guzzo, L. New Astron. 2, 517–532 (1997).
  5. Paturel, G. et al. in Information and Online Data in Astronomy (eds Egret, D. & Albrecht, M.) 115 (Kluwer, Dordrecht, 1995).
  6. Peebles, P. J. E. Principles of Physical Cosmology (Princeton Univ. Press, NJ, 1993).