Cosmic Clumpiness Conundra

Well there’s a coincidence. I was just thinking of doing a post about cosmological homogeneity, spurred on by a discussion at the workshop I attended in Copenhagen a couple of weeks ago, when suddenly I’m presented with a topical hook to hang it on.

New Scientist has just carried a report about a paper by Shaun Thomas and colleagues from University College London, the abstract of which reads:

We observe a large excess of power in the statistical clustering of luminous red galaxies in the photometric SDSS galaxy sample called MegaZ DR7. This is seen over the lowest multipoles in the angular power spectra Cℓ in four equally spaced redshift bins between 0.4 \leq z \leq 0.65. However, it is most prominent in the highest redshift band at \sim 4\sigma and it emerges at an effective scale k \sim 0.01 h{\rm Mpc}^{-1}. Given that MegaZ DR7 is the largest cosmic volume galaxy survey to date (3.3({\rm Gpc} h^{-1})^3) this implies an anomaly on the largest physical scales probed by galaxies. Alternatively, this signature could be a consequence of it appearing at the most systematically susceptible redshift. There are several explanations for this excess power that range from systematics to new physics. We test the survey, data, and excess power, as well as possible origins.

To paraphrase, it means that the distribution of galaxies in the survey they study is clumpier than expected on very large scales. In fact the level of fluctuation is about a factor of two higher than expected on the basis of the standard cosmological model. This shows that either there’s something wrong with the standard cosmological model or there’s something wrong with the survey. Being a skeptic at heart, I’d bet on the latter if I had to put my money somewhere, because this survey involves photometric determinations of redshifts rather than the more accurate and reliable spectroscopic variety. I won’t be getting too excited about this result unless and until it is confirmed with a full spectroscopic survey. But that’s not to say it isn’t an interesting result.

For one thing it keeps alive a debate about whether, and at what scale, the Universe is homogeneous. The standard cosmological model is based on the Cosmological Principle, which asserts that the Universe is, in a broad-brush sense, homogeneous (is the same in every place) and isotropic (looks the same in all directions). But the question that has troubled cosmologists for many years is what is meant by large scales? How broad does the broad brush have to be?

At our meeting a few weeks ago, Subir Sarkar from Oxford pointed out that the evidence for cosmological homogeneity isn’t as compelling as most people assume. I blogged some time ago about an alternative idea, that the Universe might have structure on all scales, as would be the case if it were described in terms of a fractal set characterized by a fractal dimension D. In a fractal set, the mean number of neighbours of a given galaxy within a spherical volume of radius R is proportional to R^D. If galaxies are distributed uniformly (homogeneously) then D = 3, as the number of neighbours simply depends on the volume of the sphere, i.e. as R^3, and the average number-density of galaxies. A value of D < 3 indicates that the galaxies do not fill space in a homogeneous fashion: D = 1, for example, would indicate that galaxies were distributed in roughly linear structures (filaments), because the mass of material distributed along a filament enclosed within a sphere grows linearly with the radius of the sphere, i.e. as R^1, not as its volume; galaxies distributed in sheets would have D = 2, and so on.
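This counting argument is easy to check numerically. Here is a minimal sketch (not from the post; the point sets, radii and sample sizes are my own illustrative choices) that estimates D by fitting the slope of log N(<R) against log R for two synthetic distributions:

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_neighbours(points, radii):
    # pairwise distances (brute force; fine for a few hundred points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)  # a point is not its own neighbour
    return np.array([(d < r).sum(axis=1).mean() for r in radii])

def estimate_D(points, radii):
    # slope of log N(<R) against log R estimates the fractal dimension
    n = mean_neighbours(points, radii)
    return np.polyfit(np.log(radii), np.log(n), 1)[0]

radii = np.logspace(-1.3, -0.7, 8)  # probe R ~ 0.05 to 0.2 in a unit box

# volume-filling points: expect D close to 3
uniform = rng.uniform(size=(800, 3))

# a thin filament along the x-axis: expect D close to 1
filament = np.column_stack([
    rng.uniform(size=800),
    0.5 + 0.003 * rng.standard_normal(800),
    0.5 + 0.003 * rng.standard_normal(800),
])

D_uniform = estimate_D(uniform, radii)
D_filament = estimate_D(filament, radii)
print(f"uniform  : D = {D_uniform:.2f}")
print(f"filament : D = {D_filament:.2f}")
```

The uniform set comes out close to D = 3 (slightly low, because spheres around points near the box boundary get truncated), while the filament comes out close to D = 1, as the scaling argument predicts.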

The discussion of a fractal universe is one I’m overdue to return to. In my previous post  I left the story as it stood about 15 years ago, and there have been numerous developments since then. I will do a “Part 2” to that post before long, but I’m waiting for some results I’ve heard about informally, but which aren’t yet published, before filling in the more recent developments.

We know that D \simeq 1.2 on small scales (in cosmological terms, still several Megaparsecs), but the evidence for a turnover to D = 3 is not so strong. The point, however, is at what scale we would say that homogeneity is reached. Not when D = 3 exactly, because there will always be statistical fluctuations; see below. What scale, then? When D = 2.9? When D = 2.99?

What I’m trying to say is that much of the discussion of this issue involves the phrase “scale of homogeneity” when that is a poorly defined concept. There is no such thing as “the scale of homogeneity”, just a whole host of quantities that vary with scale in a way that may or may not approach the value expected in a homogeneous universe.

It’s even more complicated than that, actually. When we cosmologists adopt the Cosmological Principle we apply it not to the distribution of galaxies in space, but to space itself. We assume that space is homogeneous so that its geometry can be described by the Friedmann-Lemaître-Robertson-Walker metric.

According to Einstein’s  theory of general relativity, clumps in the matter distribution would cause distortions in the metric which are roughly related to fluctuations in the Newtonian gravitational potential \delta\Phi by \delta\Phi/c^2 \sim \left(\lambda/ct \right)^{2} \left(\delta \rho/\rho\right), give or take a factor of a few, so that a large fluctuation in the density of matter wouldn’t necessarily cause a large fluctuation of the metric unless it were on a scale \lambda reasonably large relative to the cosmological horizon \sim ct. Galaxies correspond to a large \delta \rho/\rho \sim 10^6 but don’t violate the Cosmological Principle because they are too small to perturb the background metric significantly. Even the big clumps found by the UCL team only correspond to a small variation in the metric. The issue with these, therefore, is not so much that they threaten the applicability of the Cosmological Principle, but that they seem to suggest structure might have grown in a different way to that usually supposed.
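It helps to put rough numbers into that relation. A back-of-the-envelope sketch (all values are my own illustrative round numbers, good only to a factor of a few, not figures from the paper):

```python
# Order-of-magnitude check of δΦ/c² ~ (λ/ct)² (δρ/ρ).
# Illustrative round numbers: ct ~ 4000 Mpc for the horizon;
# λ ~ 0.03 Mpc and δρ/ρ ~ 1e6 for a typical galaxy.
def metric_fluctuation(lam_mpc, delta_rho, ct_mpc=4000.0):
    return (lam_mpc / ct_mpc) ** 2 * delta_rho

dphi_galaxy = metric_fluctuation(0.03, 1e6)
print(f"galaxy: δΦ/c² ~ {dphi_galaxy:.0e}")  # tiny, despite δρ/ρ ~ 10⁶
```

The huge density contrast of a galaxy is beaten down by the (λ/ct)² factor, which is why galaxies leave the background metric essentially undisturbed.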

The problem is that we can’t measure the gravitational potential on these scales directly so our tests are indirect. Counting galaxies is relatively crude because we don’t even know how well galaxies trace the underlying mass distribution.

An alternative way of doing this is to use not the positions of galaxies but their velocities (usually called peculiar motions). These deviations from a pure Hubble flow are caused by lumps of matter pulling on the galaxies: the lumpier the Universe is, the larger the velocities; and the larger the lumps, the more coherent the flow. On small scales galaxies whizz around at speeds of hundreds of kilometres per second relative to each other, but averaged over larger and larger volumes the bulk flow should get smaller and smaller, eventually coming to zero in a frame in which the Universe is exactly homogeneous and isotropic.

Roughly speaking the bulk flow v should relate to the metric fluctuation as approximately \delta \Phi/c^2 \sim \left(\lambda/ct \right) \left(v/c\right).
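To get a feel for the numbers one can invert this relation for v. A quick sketch, taking an illustrative horizon scale of ct ~ 4000 Mpc and the δΦ/c² ~ 10⁻⁵ level quoted later in the post:

```python
# Invert δΦ/c² ~ (λ/ct)(v/c) for the bulk flow v.
# Inputs are illustrative: ct ~ 4000 Mpc, δΦ/c² ~ 1e-5.
C_KM_S = 3.0e5   # speed of light in km/s
CT_MPC = 4000.0  # rough horizon scale in Mpc

def bulk_flow_km_s(lam_mpc, dphi_over_c2=1e-5):
    return dphi_over_c2 * (CT_MPC / lam_mpc) * C_KM_S

for lam in (50, 100, 500):
    print(f"λ = {lam:3d} Mpc  →  v ~ {bulk_flow_km_s(lam):.0f} km/s")
```

This gives flows of a few hundred km/s on ~50 Mpc scales, falling off as the averaging volume grows, which is exactly the qualitative behaviour described above.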

It has been claimed that some observations suggest the existence of a dark flow which, if true, would challenge the reliability of the standard cosmological framework, but these results are controversial and are yet to be independently confirmed.

But suppose you could measure the net flow of matter in spheres of increasing size. At what scale would you claim homogeneity is reached? Not when the flow is exactly zero, as there will always be fluctuations, but exactly how small?

The same goes for all the other possible criteria we have for judging cosmological homogeneity. We are free to choose the point where we say the level of inhomogeneity is sufficiently small to be satisfactory.

In fact, the standard cosmology (or at least the simplest version of it) has the peculiar property that it doesn’t ever reach homogeneity anyway! If the spectrum of primordial perturbations is scale-free, as is usually supposed, then the metric fluctuations don’t vary with scale at all. In fact, they’re fixed at a level of \delta \Phi/c^2 \sim 10^{-5}.
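That scale-independence is just two power laws cancelling: for a scale-free (Harrison-Zel’dovich) spectrum the rms density contrast on scale λ falls as λ⁻², which exactly offsets the (λ/ct)² factor in the metric formula. A toy check, with an arbitrary normalisation of my own choosing:

```python
# Toy check: δΦ/c² ~ (λ/ct)² (δρ/ρ) with δρ/ρ ∝ λ⁻² is the same on
# every scale.  A is an arbitrary normalisation chosen to land at the
# observed ~1e-5 level; ct ~ 4000 Mpc is illustrative.
CT_MPC = 4000.0
A = 1e-5 * CT_MPC**2

for lam in (10.0, 100.0, 1000.0):
    delta = A / lam**2                  # δρ/ρ on scale λ
    dphi = (lam / CT_MPC) ** 2 * delta  # δΦ/c²: λ-dependence cancels
    print(f"λ = {lam:6.0f} Mpc  →  δΦ/c² = {dphi:.0e}")
```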

The fluctuations are small, so the FLRW metric is pretty accurate, but they don’t get smaller with increasing scale, so there is no scale at which it becomes exactly true. So let’s have no more of “the scale of homogeneity” as if that were a meaningful phrase. Let’s keep the discussion to the behaviour of suitably defined measurable quantities and how they vary with scale. You know, like real scientists do.

34 Responses to “Cosmic Clumpiness Conundra”

  1. Anton Garrett Says:

    Yes, an estimate of the scale of homogeneity is a sort-of probability of a probability – a standard no-no in the only sensible view of probability theory.

    Fascinating! Are there enough galaxies in the universe for talk of fractal distributions to be meaningful?

    • Anton Garrett Says:

      Wikipedia reckons that there are upward of 170 * (10**9) galaxies. In 3D this means that, if the universe were a cube, there would be about 5500 galaxies along a side. Given that fractals are in geometric progression of diminishing size of the repeating structure, it’s not clear to me that there are enough galaxies to meaningfully test a fractal vs a nonfractal distribution; if the geometric progression has common ratio = 10 then you can get only 3 levels of repetition, which I would not say is enough to meet the definition of a fractal.

      • telescoper Says:

        I tried to post a reply earlier but my connection failed. See the comment below.

        Also, we don’t see all the galaxies, at large distances only the brighter ones, introducing a selection bias, and survey volumes usually have a complicated geometry, highly flattened or long like a pencil. These make things even harder.

        I agree that it’s hard to defend an argument that the Universe is fractal, but that’s a useful model we can use to ask the question whether what we see is consistent with large-scale homogeneity or with an inhomogeneous alternative (i.e. the fractal).

  2. telescoper Says:

    The Universe may well be infinite, but if it began a finite time in the past we can’t observe more than a bit of it. We do now have surveys of millions of galaxies and they suggest that clustering is roughly self-similar over a certain range of scales, but it’s only approximate and the dynamical range is relatively small in logarithmic terms, i.e. about two orders of magnitude.

    • Anton Garrett Says:

      If the universe began a finite time T ago then how can it be infinite? Surely it can’t be larger than cT?

      I suspected that my fractal reasoning above would run onto the rocks of astrophysical reality, as I am not an astrophysicist, but it’s always interesting to see how and learn a little more. I am expecting the same here…

  3. >If the universe began a finite time T ago then how can it be infinite? Surely it can’t be larger than cT?

    The Universe is infinite, but the observable Universe is finite (if the cosmological model is correct and we believe we have established it to be spatially flat).

    • telescoper Says:

      We don’t know if the Universe is infinite or not. Even if the Universe is open or flat it could still be finite, but with a strange topology.

      Anton: it’s quite possible for the Universe to be infinite in spatial extent but finite in past duration. The part we can see grows with time as ct but there could be an infinite universe beyond our horizon. We can’t see it, though.

    • Anton Garrett Says:

      Peter: in that case 2 questions, if you are willing:

      1. If the universe is spatially infinite yet not older than epoch T then it can’t have started from a point, is that right? (Since \infty – cT > 0)

      2. Can we in principle infer any details of the parts we cannot see?

  4. >1. If the universe is spatially infinite yet not older than epoch T then it can’t have started from a point, is that right? (Since \infty – cT > 0)

    If the current cosmological model is correct and it is spatially flat, it was always infinite, it never was a “point”.

  5. >Big Bang is dead then? I’m way behind the times.

    No – it isn’t – what I said is the standard interpretation of the Big Bang. The BB is seriously mis-portrayed in the public arena.

    • Anton Garrett Says:

      Cusp: I’ve been a postdoc in theoretical physics depts but am not an astrophysicist; that’s my level. If the Big Bang is a goer then surely the universe starts from a point and can be no larger at a time T after that than cT?

  6. If the universe is spatially flat, it is infinite in extent and always has been – there was no point in its history at which it went from finite to infinite. It was born infinite.

    The observable universe, the part from which we could have received light, is a finite part of an infinite universe. It started “as a point”.

    The Universe and the Observable Universe often get lumped into one, when they are quite different things.

    I had to explain this to a quantum physicist the other day – he thought it was cool when he finally got it.

  7. telescoper Says:

    Anton,

    It might help – I find it does in such things – to think of it all back-to-front. Think about an infinite sheet of graph paper representing a flat space, with the grid representing the distribution of galaxies. Now gradually reduce the scale: the squares get smaller and smaller. Eventually the scale becomes arbitrarily small so that everything is densely packed. But the sheet is still infinite.

    It’s probably also worth saying that the Universe could be finite, like the 3D analogue of the surface of a sphere, but its radius could be much larger than ct.

    You ask whether one can make inferences about what happens outside our horizon – well, people do. However, if the Cosmological Principle holds true then what’s outside our horizon is pretty much the same as what’s inside!

    • Anton Garrett Says:

      Thanks Peter, that’s a step forward, but if the graph paper is something physical rather than merely a coordinate system of our choosing, what IS it? And what caused abandonment of the simple Big Bang? (And how much would Fred Hoyle be laughing?)

      • telescoper Says:

        Also, it’s not quite “a coordinate system of our choosing”. It’s chosen so that the distribution of matter looks homogeneous and isotropic in the coordinate frame. This also gives a preferred time coordinate – we can slice space-time in such a way that surfaces of constant density are synchronous.

    • telescoper Says:

      There’s no “abandonment of the simple Big Bang”. The picture we’re discussing is precisely the same as that presented by Friedmann and Lemaître in the 1920s.

      • I agree with Peter – What we have been saying *is* the standard Big Bang. What needs to be abandoned is the misconceptions about it. I’m sure Fred Hoyle knew this.

        You might want to start by reading;

        Expanding Confusion: common misconceptions of cosmological horizons and the superluminal expansion of the Universe
        Tamara M. Davis, Charles H. Lineweaver
        http://xxx.lanl.gov/abs/astro-ph/0310808

        and if you want to, follow it up with

        Expanding Space: the Root of all Evil?
        Matthew J. Francis, Luke A. Barnes, J. Berian James, Geraint F. Lewis
        http://xxx.lanl.gov/abs/0707.0380

        I very much recommend the second one 🙂

      • Anton Garrett Says:

        But I (thought I) understood that! Let’s discuss at Lords…

      • telescoper Says:

        I’m in Copenhagen now, and will be back again in August.

  8. > Let’s discuss at Lords…

    Alas, I won’t be at Lords – and, to quote Dreadlock Holiday, “I don’t like cricket”

    • Anton Garrett Says:

      That comment of mine was posted so as to show here as a response to something said by Peter, who WILL be at Lords with me.

      Glad you *love* cricket!

  9. These hyperclusters stretching over 3 billion light years would require over 100 billion years to form. Like the Sloan Great Wall, a vast cosmic filament is associated. The dark flow is believed to be 150 billion light years away, and would indicate that plasma structures are fractal out to larger scales. Alfven proposed 26 fractal plasma mediums, which includes the galactic corona magnetic bubbles surrounding the Milky Way, and galaxy clusters having the hottest densest medium known. The IGM is believed to contain most of the baryonic matter of the universe, and the WHIM filaments about half the mass of the universe. Jets extend the lengths of galaxies, and most galaxies are nearby relative to their sizes apart. Only by seeing more of the sky with the SDSS, were they able to detect these hyperclusters. Like atoms, stars, galaxies, superclusters… it seems that there is no law or rule where the smallest plasma particle nor largest structure exists. There are always smaller particles and larger structures, each having their own relative time. Impermanence, change, transitoriness takes place with everything. Size is relative to other objects.
    http://holographicgalaxy.blogspot.com
    http://hologramuniverse.wordpress.com

  10. […] redshifts. These most distant and oldest known galaxies are forming Hyperclusters !   cosmic clumpiness conundra 12 billion light year scale view by BOSS   Filamentary Emission by a Rat Cell Milky Way Satellite […]


  12. […] is cosmic inhomogeneity on even larger scales, of course, but in such cases the “peculiar velocities” generated by the lumpiness can […]

  13. The scalelength R at which the Universe is homogeneous (10, 100, 1000 Mpc h^-1 ?) is a comoving length, right?

    Does it mean that this scale is a factor of (1+z) smaller at a redshift z?
    Thanks

  14. Thanks for the (quick!) reply.

    R varies like 1 / (1+z) right?

    I am also curious about the latest estimates of this scalelength. Has a consensus been reached as far as you know?
    cheers

  15. […] me that I never completed the story I started with a couple of earlier posts (here and there), so while I wait for the rain to stop I thought I’d make myself useful by posting something […]

  16. […] do we find strong evidence against leftover relics and topological defects, but we measured this Harrison-Zel’dovich spectrum very accurately back in the 1990s, which was predicted by inflation more than a decade before it was observed! In […]

  17. […] violates the cosmological principle even in the standard model: with scale-invariant perturbations there is no scale at which the Universe is completely homogeneous. The question is really how much and in what way it is violated. We seem to be happy with 10-5 but […]

  18. […] there is no scale at which the Universe is completely smooth. See the discussion, for example, here. We can see correlations on very large angular scales in the cosmic microwave background which […]
