Archive for Hubble constant

Hubble Problems

Posted in Cute Problems, The Universe and Stuff on October 12, 2018 by telescoper

Here I am, only connecting again.

Almost every day I get a spam message from a certain person who thinks he can determine the Hubble constant from first principles using biblical references. The preceding link takes you to an ebook. I was thinking of buying it, but at 99c* I considered it prohibitively expensive.

*I am informed that it has now gone up to £1.30.

My correspondent also alleges that in writing this blog I am doing the Devil’s work. That may be the case, of course, but I can’t help thinking that there must be more effective ways for him to get his work done. Either that or he’s remarkably unambitious.

Anyway, to satisfy my correspondent here is one for the problems folder:

Using the information provided in Isaiah Chapter 40 verse 22, show that the value of the Hubble constant is precisely 70.98047 km s⁻¹ Mpc⁻¹.

You may quote the relevant biblical verse without proof. In the King James version it reads:

40.22. It is he that sitteth upon the circle of the earth, and the inhabitants thereof are as grasshoppers; that stretcheth out the heavens as a curtain, and spreadeth them out as a tent to dwell in.

By the way, please note that the inverse of the Hubble constant has dimensions of time, not distance.
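In case anyone wants to check that dimensional point, here's a back-of-the-envelope conversion (a little Python sketch of my own, not part of the problem):

```python
# Hubble time t_H = 1/H0: convert H0 from km/s/Mpc into 1/s, then invert.
MPC_IN_KM = 3.0857e19        # kilometres in one megaparsec
SECONDS_PER_GYR = 3.156e16   # seconds in one gigayear

def hubble_time_gyr(H0_km_s_Mpc):
    """Return 1/H0 in gigayears for H0 given in km/s/Mpc."""
    H0_per_s = H0_km_s_Mpc / MPC_IN_KM   # now in units of s^-1
    return 1.0 / (H0_per_s * SECONDS_PER_GYR)

print(hubble_time_gyr(70.98047))  # ~13.8 Gyr: reassuringly a time, not a distance
```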

Answers into my spam folder please (via the comments box).

 

While I am on the subject of Hubble, I will mention the news that the Hubble Space Telescope is having a few technical problems as a result of the failure of one of its gyros. In fact a few days ago it went into `safe mode' to help engineers diagnose and fix the problem, during which time no observations are being taken. I'm told by people who know about such things that the spacecraft can actually operate on only one gyro if necessary, using information from other systems for attitude control, so this problem is not going to be terminal, but it will slow down the pointing quite a bit and thus make observing less efficient. With a bit of luck HST will be back in operation soon.

Ongoing Hubble Constant Poll

Posted in The Universe and Stuff on July 18, 2018 by telescoper

Here are two interesting plots that I got via Renée Hložek on Twitter from the recent swathe of papers from Planck. The first shows the `tension' between Planck's parameter estimates and `direct' measurements of the Hubble constant (as exemplified by Riess et al. 2018); see my recent post for a discussion of the latter. Planck actually produces joint estimates for a set of half-a-dozen basic parameters from which estimates of others, including the Hubble constant, can be derived. The plot below shows the two-dimensional region that is allowed by Planck if both the Hubble constant (H0) and the matter density parameter (ΩM) are allowed to vary within the limits allowed by various observations. The tightest contours come from Planck but other cosmological probes provide useful constraints that are looser but consistent; `BAO' refers to `Baryon Acoustic Oscillations', and `Pantheon' is a sample of Type Ia supernovae.

You can see that the Planck measurements (blue) mean that a high value of the Hubble constant requires a low matter density, but the allowed contour does not really overlap with the grey shaded horizontal regions. For those of you who like such things, the discrepancy is about 3.5σ.

Another plot you might find interesting is this one:

The solid line shows how the Hubble `constant' varies with redshift in the standard cosmological model; H0 is the present value of a redshift-dependent parameter H(z) that measures the rate at which the Universe is expanding. Strictly speaking, the quantity plotted is H(z)/(1+z), which is proportional to the speed of expansion rather than its rate. You will see that this quantity is larger at high redshift and decreases as the expansion of the Universe slows down, until a redshift of around 0.5, after which it increases again, indicating that the expansion of the Universe is accelerating. Direct determinations of the expansion rate at high redshift are difficult, hence the large error bars, but the important feature is the gap between the direct determination at z=0 and what the standard model predicts. If the Riess et al. 2018 measurements are right, the expansion of the Universe seems to be accelerating more rapidly than the standard model predicts.
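If you want to see where that turnover comes from, here's a quick sketch of the flat ΛCDM expansion history (the parameter values are my own round numbers, for illustration only):

```python
import numpy as np

# Flat LambdaCDM: H(z) = H0 * sqrt(Om*(1+z)^3 + (1-Om)).
# Illustrative round numbers, not the Planck chain outputs.
H0, Om = 67.8, 0.31

z = np.linspace(0.0, 2.0, 2001)
H = H0 * np.sqrt(Om * (1.0 + z)**3 + (1.0 - Om))

# H(z) itself rises monotonically with redshift; it is H(z)/(1+z),
# proportional to the expansion *speed*, that has a minimum where
# deceleration gives way to acceleration.
speed = H / (1.0 + z)
print(f"minimum of H(z)/(1+z) at z ~ {z[np.argmin(speed)]:.2f}")  # ~0.6
```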

So after that little update, here's a little poll I've been running for a while on whether people think this apparent discrepancy is serious or not. I'm interested to see whether these latest findings change the voting!

Hubble Constant Catch-Up

Posted in Bad Statistics, The Universe and Stuff on May 2, 2018 by telescoper

Last week when I wrote about the 2nd Data Release from Gaia, somebody emailed me to ask whether the new results said anything about the cosmological distance ladder and hence the Hubble Constant. As far as I could see, no scientific papers were released on this topic at the time and I thought there probably wasn’t anything definitive at this stage. However, it turns out that there is a paper now, by Riess et al., which focuses on the likely impact of Gaia on the Cepheid distance scale. Here is the abstract:

We present HST photometry of a selected sample of 50 long-period, low-extinction Milky Way Cepheids measured on the same WFC3 F555W, F814W, and F160W-band photometric system as extragalactic Cepheids in SN Ia hosts. These bright Cepheids were observed with the WFC3 spatial scanning mode in the optical and near-infrared to mitigate saturation and reduce pixel-to-pixel calibration errors to reach a mean photometric error of 5 millimags per observation. We use the new Gaia DR2 parallaxes and HST photometry to simultaneously constrain the cosmic distance scale and to measure the DR2 parallax zeropoint offset appropriate for Cepheids. We find a value for the zeropoint offset of -46 +/- 13 muas or +/- 6 muas for a fixed distance scale, higher than found from quasars, as expected, for these brighter and redder sources. The precision of the distance scale from DR2 has been reduced by a factor of 2.5 due to the need to independently determine the parallax offset. The best fit distance scale is 1.006 +/- 0.033, relative to the scale from Riess et al 2016 with H0=73.24 km/s/Mpc used to predict the parallaxes photometrically, and is inconsistent with the scale needed to match the Planck 2016 CMB data combined with LCDM at the 2.9 sigma confidence level (99.6%). At 96.5% confidence we find that the formal DR2 errors may be underestimated as indicated. We identify additional error associated with the use of augmented Cepheid samples utilizing ground-based photometry and discuss their likely origins. Including the DR2 parallaxes with all prior distance ladder data raises the current tension between the late and early Universe route to the Hubble constant to 3.8 sigma (99.99 %). With the final expected precision from Gaia, the sample of 50 Cepheids with HST photometry will limit to 0.5% the contribution of the first rung of the distance ladder to the uncertainty in the Hubble constant.

So, nothing definitive yet, but potentially very interesting for the future. This group, led by Adam Riess, is now claiming a 3.8σ tension between measurements of the Hubble constant from the cosmic microwave background and from traditional `distance ladder' approaches, though to my mind this is based on some rather subjective judgements.

The appearance of that paper reminded me that I forgot to post about a paper by Bernal & Peacock that appeared a couple of months ago. Here is the abstract of that one:

When combining data sets to perform parameter inference, the results will be unreliable if there are unknown systematics in data or models. Here we introduce a flexible methodology, BACCUS: BAyesian Conservative Constraints and Unknown Systematics, which deals in a conservative way with the problem of data combination, for any degree of tension between experiments. We introduce hyperparameters that describe a bias in each model parameter for each class of experiments. A conservative posterior for the model parameters is then obtained by marginalization both over these unknown shifts and over the width of their prior. We contrast this approach with an existing hyperparameter method in which each individual likelihood is scaled, comparing the performance of each approach and their combination in application to some idealized models. Using only these rescaling hyperparameters is not a suitable approach for the current observational situation, in which internal null tests of the errors are passed, and yet different experiments prefer models that are in poor agreement. The possible existence of large shift systematics cannot be constrained with a small number of data sets, leading to extended tails on the conservative posterior distributions. We illustrate our method with the case of the H0 tension between results from the cosmic distance ladder and physical measurements that rely on the standard cosmological model.

This paper addresses the long-running issue of apparent tension between different measurements of the Hubble constant that I've blogged about before (e.g. here) by putting the treatment of possible systematic errors into a more rigorous and consistent (i.e. Bayesian) form. It says what I think most people in the community privately think about this issue, i.e. that it's probably down to some sort of unidentified systematic rather than exotic physics.
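To give a flavour of the idea, here is a toy version of the shift-hyperparameter approach (my own sketch, not the authors' BACCUS code; I also keep the width of the shift prior fixed rather than marginalizing over it as the paper does):

```python
import numpy as np

# Toy conservative combination: allow an unknown systematic shift b in one
# data set, put a broad prior on b, and marginalize it out numerically.
# The two H0 results quoted below in the post, in km/s/Mpc:
mu1, s1 = 67.8, 0.9      # cosmic microwave background
mu2, s2 = 73.52, 1.62    # cosmic distance ladder

H = np.linspace(60.0, 82.0, 2201)
b = np.linspace(-15.0, 15.0, 3001)        # possible shift in data set 2
prior_b = np.exp(-0.5 * (b / 5.0)**2)     # prior width 5 is my arbitrary choice

L1 = np.exp(-0.5 * ((H - mu1) / s1)**2)
L2 = np.exp(-0.5 * ((H[:, None] - mu2 - b[None, :]) / s2)**2)
post = L1 * np.trapz(L2 * prior_b[None, :], b, axis=1)
post /= np.trapz(post, H)

# ~68: allowing the shift largely de-weights the discrepant data set,
# whereas a naive inverse-variance combination would give ~69, with
# misleadingly tight errors.
print(f"conservative posterior mean = {np.trapz(H * post, H):.1f}")
```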

The title of the paper includes the phrase `Conservative Cosmology', but I think that's a bit of a misnomer; `Sensible Cosmology' would be more apt, as current events suggest `conservative' and `sensible' have opposite meanings. You can find a popular account of it here, from which I have stolen this illustration of the tension:

[Chart: the two differing results for the Hubble constant, i.e. the expansion rate of the Universe in km/s/Mpc – cosmic microwave background: 67.8 ± 0.9; cosmic distance ladder: 73.52 ± 1.62]

Anyway, I have a poll that has been going on for some time about whether this tension is anything to be excited about, so why not use this opportunity to cast your vote?

Who’s worried about the Hubble Constant?

Posted in The Universe and Stuff on January 11, 2018 by telescoper

One of the topics that is bubbling away on the back burner of cosmology is the possible tension between cosmological parameters, especially relating to the determination of the Hubble constant (H0) by Planck and by “traditional” methods based on the cosmological distance ladder; see here for an overview of the latter.

Before getting to the point I should explain that Planck does not determine H0 directly, as it is not one of the six numbers used to specify the minimal model used to fit the data. These parameters do include information about H0, however, so it is possible to extract a value from the data indirectly. In other words it is a derived parameter:

[Table: summary of Planck cosmological parameter estimates]

The above summary shows that values of the Hubble constant obtained in this way lie around the 67 to 68 km/s/Mpc mark, with small changes if other measures are included. According to the very latest Planck paper on cosmological parameter estimates the headline determination is H0 = (67.8 ± 0.9) km/s/Mpc.
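To illustrate what `derived parameter' means in practice, here is a cartoon version (the real inference works through the acoustic scale and the full parameter fit; the numbers below are illustrative Planck-like values of my own, not official chain outputs):

```python
import math

# Illustrative Planck-like numbers (not the official chain outputs):
omega_m = 0.1416   # physical matter density Omega_m * h^2, tightly constrained
Omega_m = 0.308    # matter density parameter from the fit

# H0 then follows as a derived quantity: h = sqrt(omega_m / Omega_m).
h = math.sqrt(omega_m / Omega_m)
print(f"H0 = {100.0 * h:.1f} km/s/Mpc")  # ~67.8
```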

About 18 months ago I blogged about a “direct” determination of the Hubble constant by Riess et al. using Hubble Space Telescope data, which quotes a headline value of (73.24 ± 1.74) km/s/Mpc, hinting at a discrepancy somewhere around the 3σ level depending on precisely which determination you use. A news item on the BBC hot off the press reports that a more recent analysis by the same group is stubbornly sitting around the same value of the Hubble constant, with a slightly smaller error, so that the discrepancy is now about 3.4σ. On the other hand, the history of this type of study provides grounds for caution, because the systematic errors have often turned out to be much larger and more uncertain than the statistical errors…
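For those wondering where such σ figures come from: treating the two determinations as independent Gaussians gives a quick estimate (my own back-of-envelope, not the calculation in either paper):

```python
import math

def tension_sigma(mu1, sigma1, mu2, sigma2):
    """Tension between two independent Gaussian measurements, in sigma."""
    return abs(mu1 - mu2) / math.hypot(sigma1, sigma2)

# Riess et al. (2016) versus the Planck headline value quoted above:
print(f"{tension_sigma(73.24, 1.74, 67.8, 0.9):.1f} sigma")  # ~2.8
```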

Nevertheless, I think it’s fair to say that there isn’t a consensus as to how seriously to take this apparent “tension”. I certainly can’t see anything wrong with the Riess et al. result, and the lead author is a Nobel prize-winner, but I’m also impressed by the stunning success of the minimal LCDM model at accounting for such a huge data set with a small set of free parameters.

If one does take this tension seriously it can be resolved by adding an extra parameter to the model or by allowing one of the fixed properties of the LCDM model to vary to fit the data. Bayesian model selection analysis, however, tends to reject such models on the grounds of Ockham's Razor: the price you pay for introducing an extra free parameter exceeds the benefit in improved goodness of fit. Gaia may shortly reveal whether or not there are problems with the local stellar distance scale, which could be the source of any discrepancy. For the time being, however, I think it's interesting but nothing to get too excited about. I'm not saying that I hope this tension will just go away. I think it will be very interesting if it turns out to be real. I just think the evidence at the moment isn't convincing me that there's something beyond the standard cosmological model. I may well turn out to be wrong.
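To make the Ockham's Razor point concrete, here's one crude version of the trade-off using the Bayesian Information Criterion rather than a full Bayesian model selection (the numbers are made up, purely for illustration):

```python
import math

def bic(chi2_min, n_params, n_data):
    """Bayesian Information Criterion: fit quality plus a complexity penalty."""
    return chi2_min + n_params * math.log(n_data)

# Made-up numbers: suppose an extra parameter improves chi^2 by 4 over
# ~1000 data points. The penalty log(1000) ~ 6.9 exceeds the gain of 4,
# so the extended model loses: the price outweighs the benefit.
print(bic(chi2_min=1020.0, n_params=6, n_data=1000))  # base LCDM: ~1061.4
print(bic(chi2_min=1016.0, n_params=7, n_data=1000))  # extra parameter: ~1064.4
```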

Anyway, polls seem to be quite popular these days, so let me resurrect this old one and see if opinions have changed!

 

Determining the Hubble Constant the Bernard Schutz way

Posted in The Universe and Stuff on October 19, 2017 by telescoper

In my short post about Monday’s announcement of the detection of a pair of coalescing neutron stars (GW170817), I mentioned that one of the results that caught my eye in particular was the paper about using such objects to determine the Hubble constant.

Here is the key result from that paper, i.e. the posterior distribution of the Hubble constant H0 given the data from GW170817:

You can also see latest determinations from other methods, which appear to be in (slight) tension; you can read more about this here. Clearly the new result from GW170817 yields a fairly broad range for H0 but, as I said in my earlier post, it’s very impressive to be straddling the target with the first salvo.

Anyway, I just thought I’d mention here that the method of measuring the Hubble constant using coalescing binary neutron stars was invented by none other than Bernard Schutz of Cardiff University, who works in the Data Innovation Institute (as I do). The idea was first published in September 1986 in a Letter to Nature. Here is the first paragraph:

I report here how gravitational wave observations can be used to determine the Hubble constant, H0. The nearly monochromatic gravitational waves emitted by the decaying orbit of an ultra-compact, two-neutron-star binary system just before the stars coalesce are very likely to be detected by the kilometre-sized interferometric gravitational wave antennas now being designed [1–4]. The signal is easily identified and contains enough information to determine the absolute distance to the binary, independently of any assumptions about the masses of the stars. Ten events out to 100 Mpc may suffice to measure the Hubble constant to 3% accuracy.

In the paper, Bernard points out that a binary coalescence — such as the merger of two neutron stars — is a self-calibrating `standard candle', which means that it is possible to infer the distance directly without using the cosmic distance ladder. The key insight is that the rate at which the binary's frequency changes is directly related to the amplitude of the gravitational waves it produces, i.e. how `loud' the GW signal is. Just as the observed brightness of a star depends on both its intrinsic luminosity and how far away it is, the strength of the gravitational waves received at LIGO depends on both the intrinsic loudness of the source and how far away it is. By observing the waves with detectors like LIGO and Virgo, we can determine both the intrinsic loudness of the gravitational waves and their loudness at the Earth. This allows us to determine the distance to the source directly.
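Schematically, the self-calibration works like this at leading (quadrupole) order. The sketch below uses illustrative inputs chosen to give a tidy answer, not the measured GW170817 values, and a real analysis fits the full waveform including inclination, polarisation and detector response:

```python
import math

# Leading-order (quadrupole) relations for a compact-binary inspiral.
c = 2.998e8            # speed of light, m/s
MPC = 3.086e22         # metres per megaparsec
M_SUN_S = 4.925e-6     # G*M_sun/c^3 in seconds

def chirp_mass_seconds(f, fdot):
    """G*Mc/c^3 (seconds) from the GW frequency f and its rate of change fdot."""
    return ((5.0 / 96.0) * math.pi**(-8.0 / 3.0) * fdot * f**(-11.0 / 3.0))**0.6

def distance(f, fdot, h):
    """Distance from the observed strain h: this is the `self-calibration'."""
    gm = chirp_mass_seconds(f, fdot)
    return 4.0 * c * gm**(5.0 / 3.0) * (math.pi * f)**(2.0 / 3.0) / h

# Illustrative inputs: GW frequency, chirp rate, and strain amplitude.
d = distance(f=100.0, fdot=16.6, h=8.1e-23)   # metres
z = 0.0098  # host-galaxy redshift, measured optically
H0 = c * z / d * MPC / 1000.0                 # convert to km/s/Mpc
print(f"Mc ~ {chirp_mass_seconds(100.0, 16.6) / M_SUN_S:.2f} Msun, "
      f"d ~ {d / MPC:.0f} Mpc, H0 ~ {H0:.0f} km/s/Mpc")  # ~1.19 Msun, 42 Mpc, 70
```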

It may have taken 31 years to get a measurement, but hopefully it won’t be long before there are enough detections to provide greater precision – and hopefully accuracy! – than the current methods can manage!

Above all, congratulations to Bernard for inventing a method which has now been shown to work very well!

Cosmology at a Crossroads – Poll

Posted in The Universe and Stuff on June 13, 2017 by telescoper

A short comment piece by Wendy Freedman has appeared in Nature Astronomy; there’s a free version on the arXiv here. It gives a nice perspective on the current debate about the value of the Hubble constant from the point of view of an expert on cosmological distance scale measurements.

The abstract is here:

We are at an interesting juncture in cosmology. With new methods and technology, the accuracy in measurement of the Hubble constant has vastly improved, but a recent tension has arisen that is either signaling new physics or as-yet unrecognized uncertainties.

For the record, I’d go for `as-yet unrecognized uncertainties’, primarily because this field has a long history of drastically underestimated error-bars!

However, the publication of this piece gives me the excuse to resurrect the following poll, in which I invite you to participate:

A New Measurement of the Expansion Rate of the Universe – Adam Riess

Posted in The Universe and Stuff on May 14, 2017 by telescoper

Here’s a nice talk by Nobel Laureate Adam Riess, delivered on May 11th at the Harvard-Smithsonian Center for Astrophysics, which is now available for you to watch at your leisure. It’s an hour long, but well worth watching if you’re interested in cosmology in general and in the apparent tension between different determinations of the Hubble constant in particular.

Here’s the description of the talk, which is introduced first by Bach and then by Daniel Eisenstein:

The Hubble constant remains one of the most important parameters in the cosmological model, setting the size and age scales of the Universe. Present uncertainties in the cosmological model including the nature of dark energy, the properties of neutrinos and the scale of departures from flat geometry can be constrained by measurements of the Hubble constant made to higher precision than was possible with the first generations of Hubble Telescope instruments. A streamlined distance ladder constructed from infrared observations of Cepheids and type Ia supernovae with ruthless attention paid to systematics now provide 2.4% precision and offer the means to do much better. By steadily improving the precision and accuracy of the Hubble constant, we now see evidence for significant deviations from the standard model, referred to as LambdaCDM, and thus the exciting chance, if true, of discovering new fundamental physics such as exotic dark energy, a new relativistic particle, or a small curvature to name a few possibilities. I will review recent and expected progress.

And here’s the talk in full.

After watching the video you may be interested in voting in my totally unscientific poll on the matter: