Archive for cosmological distance scale

Hubble Constant Catch-Up

Posted in Bad Statistics, The Universe and Stuff on May 2, 2018 by telescoper

Last week when I wrote about the 2nd Data Release from Gaia, somebody emailed me to ask whether the new results said anything about the cosmological distance ladder and hence the Hubble Constant. As far as I could see, no scientific papers were released on this topic at the time and I thought there probably wasn’t anything definitive at this stage. However, it turns out that there is a paper now, by Riess et al., which focuses on the likely impact of Gaia on the Cepheid distance scale. Here is the abstract:

We present HST photometry of a selected sample of 50 long-period, low-extinction Milky Way Cepheids measured on the same WFC3 F555W, F814W, and F160W-band photometric system as extragalactic Cepheids in SN Ia hosts. These bright Cepheids were observed with the WFC3 spatial scanning mode in the optical and near-infrared to mitigate saturation and reduce pixel-to-pixel calibration errors to reach a mean photometric error of 5 millimags per observation. We use the new Gaia DR2 parallaxes and HST photometry to simultaneously constrain the cosmic distance scale and to measure the DR2 parallax zeropoint offset appropriate for Cepheids. We find a value for the zeropoint offset of -46 +/- 13 muas or +/- 6 muas for a fixed distance scale, higher than found from quasars, as expected, for these brighter and redder sources. The precision of the distance scale from DR2 has been reduced by a factor of 2.5 due to the need to independently determine the parallax offset. The best fit distance scale is 1.006 +/- 0.033, relative to the scale from Riess et al 2016 with H0=73.24 km/s/Mpc used to predict the parallaxes photometrically, and is inconsistent with the scale needed to match the Planck 2016 CMB data combined with LCDM at the 2.9 sigma confidence level (99.6%). At 96.5% confidence we find that the formal DR2 errors may be underestimated as indicated. We identify additional error associated with the use of augmented Cepheid samples utilizing ground-based photometry and discuss their likely origins. Including the DR2 parallaxes with all prior distance ladder data raises the current tension between the late and early Universe route to the Hubble constant to 3.8 sigma (99.99 %). With the final expected precision from Gaia, the sample of 50 Cepheids with HST photometry will limit to 0.5% the contribution of the first rung of the distance ladder to the uncertainty in the Hubble constant.

So, nothing definitive yet, but potentially very interesting for the future. This group, led by Adam Riess, is now claiming a 3.8σ tension between measurements of the Hubble constant from the cosmic microwave background and from traditional `distance ladder’ approaches, though to my mind this is based on some rather subjective judgements.

The appearance of that paper reminded me that I forgot to post about a paper by Bernal & Peacock that appeared a couple of months ago. Here is the abstract of that one:

When combining data sets to perform parameter inference, the results will be unreliable if there are unknown systematics in data or models. Here we introduce a flexible methodology, BACCUS: BAyesian Conservative Constraints and Unknown Systematics, which deals in a conservative way with the problem of data combination, for any degree of tension between experiments. We introduce hyperparameters that describe a bias in each model parameter for each class of experiments. A conservative posterior for the model parameters is then obtained by marginalization both over these unknown shifts and over the width of their prior. We contrast this approach with an existing hyperparameter method in which each individual likelihood is scaled, comparing the performance of each approach and their combination in application to some idealized models. Using only these rescaling hyperparameters is not a suitable approach for the current observational situation, in which internal null tests of the errors are passed, and yet different experiments prefer models that are in poor agreement. The possible existence of large shift systematics cannot be constrained with a small number of data sets, leading to extended tails on the conservative posterior distributions. We illustrate our method with the case of the H0 tension between results from the cosmic distance ladder and physical measurements that rely on the standard cosmological model.

This paper addresses the long-running issue of apparent tension between different measurements of the Hubble constant that I’ve blogged about before (e.g. here) by putting the treatment of possible systematic errors into a more rigorous and consistent (i.e. Bayesian) form. It says what I think most people in the community privately think about this issue, i.e. that it’s probably down to some sort of unidentified systematic rather than exotic physics.
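To give a rough flavour of the idea, here is a toy version I knocked together. It is only a sketch of the general approach, not the actual BACCUS code, and the measurements, grids and priors are purely illustrative: each measurement of H0 is allowed an unknown additive shift, and one marginalises over both the shifts and the width of their prior.

```python
import numpy as np

# Toy "conservative combination" in the spirit of Bernal & Peacock's BACCUS:
# each measurement may be offset by an unknown shift; we marginalise over the
# shifts and over the (unknown) width of their prior. Illustrative only.

measurements = [(67.8, 0.9), (73.52, 1.62)]      # (mean, sigma) for H0, km/s/Mpc

h0_grid = np.linspace(60.0, 80.0, 2001)          # grid of possible H0 values
sigma_shift_grid = np.linspace(0.0, 10.0, 501)   # grid for the shift-prior width

posterior = np.zeros_like(h0_grid)
for sigma_shift in sigma_shift_grid:
    # Marginalising a Gaussian shift of width sigma_shift out of a Gaussian
    # likelihood simply adds the two variances in quadrature.
    like = np.ones_like(h0_grid)
    for mean, sigma in measurements:
        var = sigma**2 + sigma_shift**2
        like *= np.exp(-0.5 * (h0_grid - mean)**2 / var) / np.sqrt(var)
    posterior += like        # flat prior on the shift width, for simplicity

posterior /= np.trapz(posterior, h0_grid)
mean = np.trapz(h0_grid * posterior, h0_grid)
std = np.sqrt(np.trapz((h0_grid - mean)**2 * posterior, h0_grid))
print(f"conservative combination: H0 = {mean:.1f} +/- {std:.1f} km/s/Mpc")
```

The point is that the resulting posterior has much fatter tails than a naive inverse-variance combination: once you admit the possibility of unmodelled shifts, two discrepant data sets can no longer artificially tighten each other.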

The title of the paper includes the phrase `Conservative Cosmology’, but I think that’s a bit of a misnomer; I would have gone for `Sensible Cosmology’. Current events suggest that `conservative’ and `sensible’ have opposite meanings. You can find a popular account of it here, from which I have stolen this illustration of the tension:

The two differing results for the Hubble constant, the expansion rate of the Universe (in km/s/Mpc):
67.8 ± 0.9 from the cosmic microwave background;
73.52 ± 1.62 from the cosmic distance ladder.
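For what it's worth, treating those two numbers as independent Gaussian measurements gives a quick back-of-the-envelope estimate of the significance of the discrepancy (the 3.8σ figure quoted by Riess et al. comes from a fuller analysis that folds in the DR2 parallaxes and the rest of the distance ladder):

```python
# Back-of-the-envelope significance of the discrepancy, assuming the two
# measurements are independent and Gaussian (a simplification, of course).
h0_cmb,  err_cmb  = 67.8, 0.9     # cosmic microwave background
h0_dist, err_dist = 73.52, 1.62   # cosmic distance ladder

sigma = abs(h0_dist - h0_cmb) / (err_cmb**2 + err_dist**2) ** 0.5
print(f"tension ~ {sigma:.1f} sigma")   # roughly 3.1 sigma
```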

Anyway, I have a poll that has been going on for some time about whether this tension is anything to be excited about, so why not use this opportunity to cast your vote?


Cosmology at a Crossroads – Poll

Posted in The Universe and Stuff on June 13, 2017 by telescoper

A short comment piece by Wendy Freedman has appeared in Nature Astronomy; there’s a free version on the arXiv here. It gives a nice perspective on the current debate about the value of the Hubble constant from the point of view of an expert on cosmological distance scale measurements.

The abstract is here:

We are at an interesting juncture in cosmology. With new methods and technology, the accuracy in measurement of the Hubble constant has vastly improved, but a recent tension has arisen that is either signaling new physics or as-yet unrecognized uncertainties.

For the record, I’d go for `as-yet unrecognized uncertainties’, primarily because this field has a long history of drastically underestimated error-bars!

However, the publication of this piece gives me the excuse to resurrect the following poll, in which I invite you to participate:

Getting the Measure of Space

Posted in The Universe and Stuff on October 8, 2014 by telescoper

Astronomy is one of the oldest scientific disciplines. Human beings have certainly been fascinated by goings-on in the night sky since prehistoric times, so perhaps astronomy is evidence that the urge to make sense of the Universe around us, and our own relationship to it, is an essential part of what it means to be human. Part of the motivation for astronomy in more recent times is practical. The regular motions of the stars across the celestial sphere help us to orient ourselves on the Earth’s surface, and to navigate the oceans. But there are deeper reasons too. Our brains seem to be made for problem-solving. We like to ask questions and to try to answer them, even if this leads us into difficult and confusing conceptual territory. And the deepest questions of all concern the Cosmos as a whole. How big is the Universe? What is it made of? How did it begin? How will it end? How can we hope to answer these questions? Do these questions even make sense?

The last century has witnessed a revolution in our understanding of the nature of the Universe of space and time. Huge improvements in the technology of astronomical instrumentation have played a fundamental role in these advances. Light travels extremely quickly (around 300,000 km per second) but we can now see objects so far away that the light we gather from them has taken billions of years to reach our telescopes and detectors. Using such observations we can tell that the Universe was very different in the past from what it looks like in the here and now. In particular, we know that the vast agglomerations of stars known as galaxies are rushing apart from one another; the Universe is expanding. Turning the clock back on this expansion leads us to the conclusion that everything was much denser in the past than it is now, and that there existed a time, before galaxies were born, when all the matter that existed was hotter than the Sun.

This picture of the origin and evolution of the Universe is what we call the Big Bang, and it is now so firmly established that its name has passed into popular usage. But how did we arrive at this description? Not by observation alone, for observations are nothing without a conceptual framework within which to interpret them, but through a complex interplay between data and theoretical conjectures that has taken us on a journey with many false starts and dead ends, and which has only slowly led us to a scheme that makes conceptual sense to our own minds as well as providing a satisfactory fit to the available measurements.

A particularly relevant aspect of this process is the establishment of the scale of astronomical distances. The basic problem here is that even the nearest stars are too remote for us to reach them physically. Indeed most stars can’t even be resolved by a telescope and are thus indistinguishable from points of light. The intensity of light received falls off as the inverse-square of the distance of the source, so if we knew the luminosity of each star we could work out its distance from us by measuring how much light we detect. Unfortunately, however, stars vary considerably in luminosity from one to another. So how can we tell the difference between a dim star that’s relatively nearby and a more luminous object much further away?
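In symbols, the flux we receive falls off as F = L/4πd², so if the luminosity L were known the distance d would follow directly from the measured flux. A trivial sketch of that arithmetic (the numbers are made-up round values, purely for illustration):

```python
import math

def distance_from_flux(luminosity_watts, flux_watts_per_m2):
    """Distance implied by the inverse-square law: F = L / (4*pi*d^2)."""
    return math.sqrt(luminosity_watts / (4 * math.pi * flux_watts_per_m2))

# Illustrative example: a star with roughly the Sun's luminosity (~3.8e26 W)
# producing a measured flux of 1e-10 W/m^2 at the telescope.
d_metres = distance_from_flux(3.8e26, 1e-10)
print(f"{d_metres / 9.46e15:.0f} light years")   # 1 light year ~ 9.46e15 m
```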

Over the centuries, astronomers have developed a battery of techniques to resolve this tricky conundrum. The first step involves the fact that terrestrial telescopes share the Earth’s motion around the Sun, so we’re not actually observing stars in the sky from the same vantage point all year round. Observed from opposite extremes of the Earth’s orbit (i.e. at an interval of six months) a star appears to change position in the sky, an effect known as parallax. If the size of the Earth’s orbit is known, which it is, an accurate measurement of the change of angular position of the star can yield its distance.
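The geometry reduces to a one-liner: a star whose parallax angle is p arcseconds lies at a distance of 1/p parsecs, the parsec being defined precisely so that this works. A quick sketch, using Proxima Centauri's parallax of roughly 0.77 arcseconds as an example:

```python
# Parallax distance: a star showing a parallax angle of p arcseconds,
# measured across the Earth's orbit, lies at a distance of 1/p parsecs.
PARSEC_IN_LIGHT_YEARS = 3.26

def parallax_distance_pc(parallax_arcsec):
    return 1.0 / parallax_arcsec

# Example: Proxima Centauri has a parallax of about 0.77 arcseconds.
d_pc = parallax_distance_pc(0.77)
print(f"{d_pc:.2f} pc ~ {d_pc * PARSEC_IN_LIGHT_YEARS:.1f} light years")
```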

The problem is that this effect is tiny, even for nearby stars, and it is immeasurably small for distant ones. Nevertheless, this method has successfully established the first “rung” on the cosmic distance ladder. Sufficiently many stellar distances have been measured this way to enable astronomers to understand and classify different types of star by their intrinsic properties. A particular type of variable star called a Cepheid variable emerged from these studies as a form of “standard candle”; such a star pulsates with a well-defined period that depends on its intrinsic brightness, so by measuring the time-variation of its apparent brightness we can tell how bright it actually is, and hence its distance. Since these stars are typically very luminous they can be observed at great distances, and the relation between period and brightness can be calibrated accurately using measured parallaxes of more nearby examples.
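Schematically the recipe is: the period gives the absolute magnitude through the period-luminosity (Leavitt) relation, and comparing that with the apparent magnitude gives the distance modulus and hence the distance. Here is a toy version; the coefficients are merely illustrative of the right order of magnitude, not a real calibration, which is exactly what the parallax measurements described above are needed for:

```python
import math

def cepheid_distance_pc(period_days, apparent_mag, a=-2.43, b=-4.05):
    """Toy Cepheid distance estimate.

    The period-luminosity relation gives the absolute magnitude
    M = a * (log10(P) - 1) + b; the coefficients here are illustrative values
    of roughly the right size, not a real calibration.  The distance modulus
    m - M then gives the distance via m - M = 5 * log10(d / 10 pc).
    """
    absolute_mag = a * (math.log10(period_days) - 1.0) + b
    distance_modulus = apparent_mag - absolute_mag
    return 10.0 ** (distance_modulus / 5.0 + 1.0)

# Example: a 30-day Cepheid observed at apparent magnitude 15
print(f"{cepheid_distance_pc(30.0, 15.0):,.0f} parsecs")
```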

Cepheid variables are not the only distance indicators available to astronomers, but they have proved particularly important in establishing the scale of our Universe. For centuries astronomers have known that our own star, the Sun, is just one of billions arranged in an enormous disk-like structure, our Galaxy, the Milky Way. But dotted around the sky are curious objects known as nebulae. These do not look at all like stars; they are extended, fuzzy objects similar in shape to the Milky Way. Could they be other galaxies, seen at enormous distances, or are they much smaller objects inside our own Galaxy?

Only a century ago nobody really knew the answer to that question. Eventually, after the construction of more powerful telescopes, astronomers spotted Cepheid variables in these nebulae and established that they were far too distant to be within the Milky Way but were in fact structures like our own Galaxy. This realization revealed the Cosmos to be much larger than most astronomers had previously imagined; conceptually speaking, the Universe had expanded. Soon, measurements of the spectra of light coming from extragalactic nebulae demonstrated that the Universe was actually expanding physically too. The evidence suggested that all distant galaxies were rushing away from our own with speed proportional to their distance from us, an effect now known as Hubble’s Law, after the astronomer Edwin Hubble who played a major role in its discovery.
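In its simplest form Hubble's Law is just v = H0 d, so a measured recession velocity translates directly into a distance once H0 is known; this, of course, is why the value of H0 matters so much. A trivial example, adopting H0 = 70 km/s/Mpc purely as a round number:

```python
# Hubble's Law: recession velocity v = H0 * d.
H0 = 70.0   # km/s/Mpc, a round illustrative value; the true value is the issue!

def hubble_distance_mpc(velocity_km_s):
    return velocity_km_s / H0

# A galaxy receding at 7,000 km/s lies at roughly...
print(f"{hubble_distance_mpc(7000.0):.0f} Mpc")   # about 100 Mpc
```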

A convincing theoretical interpretation of this astonishing result was only found with the adoption of Einstein’s General Theory of Relativity, a radically new conception of how gravity manifests itself as an effect of the behaviour of space-time. Whereas previously space and time were regarded as separate and absolute notions, providing an unchanging and impassive stage upon which material bodies interact, after Einstein space-time became a participant in the action, both influencing, and being influenced by, matter in motion. The space that seemed to separate galaxies from one another was now seen to bind them together.
Hubble’s Law emerges from this picture as a natural consequence of an expanding Universe, considered not as a collection of galaxies moving through static space but as embedded in a space which is itself evolving dynamically. Light rays get bent and distorted as they travel through, and are influenced by, the changing landscape of space-time they encounter along their journey.

Einstein’s theory provides the theoretical foundations needed to construct a coherent framework for the interpretation of observations of the most distant astronomical objects, but only at the cost of demanding a radical reformulation of some fundamental concepts. The idea of space as an entity, with its own geometry and dynamics, is so central to general relativity that one can hardly avoid asking what space is in itself, i.e. what is its nature? Outside astronomy we tend to regard space as being the nothingness that lies in between the “things” (i.e. material bodies of one sort or another). Alternatively, when discussing a building (such as an art gallery) “a space” is usually described in terms of the boundaries enclosing it or by the way it is lit; it does not have attributes of its own other than those it derives from something else. But space is not simply an absence of things. If it has geometry and dynamics it has to be something rather than nothing, even if the nature of that something is extremely difficult to grasp.

Recent observations, for example, suggest that even the pure vacuum of “empty space” possesses energy of its own, the so-called “dark energy”. This inference hinges on the Type Ia supernova, a type of stellar explosion so luminous it can (briefly) outshine an entire galaxy before gradually fading away. These cataclysmic events can be used as distance indicators because their peak brightness correlates with the rate at which they fade. Type Ia supernovae can be detected at far greater distances than Cepheids, at such huge distances in fact that the Universe may have been only about half its current size when the light set out from them. The key observation is that the more distant supernovae look fainter, and hence appear to lie at greater distances, than expected if the expansion of the Universe were gradually slowing down, as it should be if there were no dark energy.
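To see roughly how the argument works, one can compare the predicted brightness of a standard candle in a matter-only, decelerating universe with that in one dominated by dark energy: at moderate redshifts the difference is a few tenths of a magnitude, in the sense that dark energy makes the supernovae look fainter. The following sketch uses flat FRW models with illustrative parameters and none of the careful corrections a real supernova analysis needs:

```python
import numpy as np

C_KM_S = 299792.458   # speed of light in km/s
H0 = 70.0             # km/s/Mpc, illustrative

def luminosity_distance_mpc(z, omega_m, omega_lambda):
    """Luminosity distance in a flat FRW model, by simple numerical integration."""
    zs = np.linspace(0.0, z, 10_000)
    e_of_z = np.sqrt(omega_m * (1 + zs) ** 3 + omega_lambda)
    comoving = (C_KM_S / H0) * np.trapz(1.0 / e_of_z, zs)
    return (1 + z) * comoving

def distance_modulus(d_mpc):
    return 5.0 * np.log10(d_mpc * 1e6 / 10.0)

z = 0.5
d_matter = luminosity_distance_mpc(z, 1.0, 0.0)   # matter-only, decelerating
d_lambda = luminosity_distance_mpc(z, 0.3, 0.7)   # with dark energy
print(f"z={z}: supernovae appear "
      f"{distance_modulus(d_lambda) - distance_modulus(d_matter):.2f} mag fainter "
      "with dark energy than without")
```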

At present there is no theory that can fully account for the existence of vacuum energy, but it is possible that it might eventually be explained by the behaviour of the quantum fields that arise in the theory of elementary particles. This could lead to a unified description of the inner space of subatomic matter and the outer space of general relativity, which has been the goal of many physicists for a considerable time. That would be a spectacular achievement but, as with everything else in science, it will only work out if we have the correct conceptual framework.