The full paper (i.e. author list plus a small amount of text) can be found here. Here are two plots from that work.
The first shows the constraints from the six loudest gravitational wave events selected for the latest work, together with the two competing measurements from Planck and SH0ES:
As you can see, the individual measurements do not constrain very much. The second plot shows the effect of combining all relevant data, including a binary neutron star merger with an electromagnetic counterpart. The results are much stronger when the latter is included.
Obviously this measurement isn’t yet able to resolve the alleged tension between “high” and “low” values described on this blog passim, but it’s early days. If LIGO reaches its planned sensitivity, the next observing run should provide many more events. A few hundred events should get the width of the posterior distribution shown in the second figure down to a few percent, which would be very interesting indeed!
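To see roughly why a few hundred events would do the trick, here’s a minimal sketch (with purely illustrative numbers, nothing to do with the actual LIGO analysis) of how combining independent posteriors narrows the constraint roughly as one over the square root of the number of events:

```python
import numpy as np

# Hypothetical illustration: suppose each "dark siren" event on its own gives a
# roughly Gaussian posterior on H0 of ~15% width. Multiplying N independent
# posteriors together shrinks the combined width roughly like 1/sqrt(N).
h0_grid = np.linspace(40.0, 120.0, 2000)      # km/s/Mpc
sigma_single = 0.15 * 70.0                    # assumed ~15% width per event

rng = np.random.default_rng(42)
for n_events in (6, 50, 300):
    # each event's posterior is centred near the (unknown) true value,
    # scattered by its own width
    centres = rng.normal(70.0, sigma_single, size=n_events)
    log_post = np.zeros_like(h0_grid)
    for c in centres:
        log_post += -0.5 * ((h0_grid - c) / sigma_single) ** 2
    post = np.exp(log_post - log_post.max())
    post /= np.trapz(post, h0_grid)
    mean = np.trapz(h0_grid * post, h0_grid)
    std = np.sqrt(np.trapz((h0_grid - mean) ** 2 * post, h0_grid))
    print(f"{n_events:4d} events: posterior width ~ {100 * std / mean:.1f}%")
# ~15% / sqrt(300) is about 1%, in line with the 'few percent' mentioned above.
```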
I have to admit I haven’t really kept up with developments in the world of gravitational waves this summer, though there have been a number of candidate events reported in the third observing run (O3) of Advanced LIGO (which began in April 2019), to which I refer you if you’re interested.
The LIGO Scientific Collaboration and the Virgo Collaboration have cataloged eleven confidently detected gravitational-wave events during the first two observing runs of the advanced detector era. All eleven events were consistent with being from well-modeled mergers between compact stellar-mass objects: black holes or neutron stars. The data around the time of each of these events have been made publicly available through the Gravitational-Wave Open Science Center. The entirety of the gravitational-wave strain data from the first and second observing runs have also now been made publicly available. There is considerable interest among the broad scientific community in understanding the data and methods used in the analyses. In this paper, we provide an overview of the detector noise properties and the data analysis techniques used to detect gravitational-wave signals and infer the source properties. We describe some of the checks that are performed to validate the analyses and results from the observations of gravitational-wave events. We also address concerns that have been raised about various properties of LIGO-Virgo detector noise and the correctness of our analyses as applied to the resulting data.
It’s an interesting paper that gives quite a lot of detail, especially about signal extraction and parameter fitting, so it’s well worth reading.
Two particular things caught my eye about this. One is that there’s no list of authors anywhere in the paper, which seems a little strange. This policy may not be new, of course. I did say I haven’t really been keeping up.
The other point I’ll mention relates to this Figure, the caption of which refers to paper [41], the famous `Danish paper’:
The Fourier phase is plotted vertically (between 0 and 2π) and the frequency horizontally. A random-phase distribution should have the phases uniformly distributed at each frequency. I think we can agree, without further statistical analysis, that the blue points don’t have that property! Of course nobody denies that the strongly correlated phases in the un-windowed data are at least partly an artifact of the application of a Fourier transform to a non-stationary time series.
I suppose the demonstration that using a window function to apodize the data removes the phase correlations is meant to represent some form of rebuttal of the claims made in the Danish paper. If so, it’s not very convincing.
For a start, the caption just says that after windowing the resulting `phases appear randomly distributed’. Could they not provide some more meaningful statistical statement than a simple eyeball impression? The text says little more:
In addition to causing spectral leakage, improper windowing of the data can result in spurious phase correlations in the Fourier transform. Figure 4 shows a scatter plot of the Fourier phase as a function of frequency … both with and without the application of a window function. The un-windowed data shows a strong phase correlation, while the windowed data does not.
(I added the link to the explanation of `spectral leakage’.)
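For what it’s worth, here is a rough sketch of the kind of apodization being described (made-up data and my own choice of window, not the LIGO pipeline): a Tukey window tapers the ends of the segment before the Fourier transform, suppressing the spectral leakage that would otherwise imprint structure on the phases.

```python
import numpy as np
from scipy.signal import windows

# Illustration only: a finite stretch of toy red noise (non-stationary-looking),
# Fourier-transformed with and without a Tukey window.
fs = 4096                                   # assumed sample rate in Hz
t = np.arange(0, 4.0, 1 / fs)               # a 4-second segment
rng = np.random.default_rng(1)
data = np.cumsum(rng.normal(size=t.size))   # random walk: a crude red-noise toy
data -= data.mean()

window = windows.tukey(t.size, alpha=0.1)   # taper the outer 10% of the segment

spec_raw = np.fft.rfft(data)                # no apodization
spec_win = np.fft.rfft(data * window)       # apodized

freqs = np.fft.rfftfreq(t.size, 1 / fs)
phase_raw = np.angle(spec_raw) % (2 * np.pi)
phase_win = np.angle(spec_win) % (2 * np.pi)
# Plotting phase_raw and phase_win against freqs reproduces the qualitative
# contrast in the figure: structured phases without the window, phases that
# look much more scattered with it.
```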
As I have mentioned before on this blog, the human eye is very poor at distinguishing pattern from randomness. There are some subtleties involved in testing for correlated phases (e.g. because they are periodic), but there are various techniques available: I’ve worked on this myself (see, e.g., here and here). The phases shown may well be consistent with a uniform random distribution, but I’m surprised the LIGO authors didn’t present a proper statistical analysis of the windowed phases to prove beyond doubt the point they seem to be trying to make.
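Just to illustrate the sort of thing I mean, here is a minimal sketch (my own construction, not anything taken from the LIGO analysis) of a Rayleigh-style test for uniformity of angles, which at least returns a p-value rather than an eyeball impression:

```python
import numpy as np

def rayleigh_test(phases):
    """Rayleigh test for uniformity of angles on [0, 2*pi).

    Returns the mean resultant length R and an approximate p-value for the
    null hypothesis that the phases are uniformly distributed.
    """
    n = len(phases)
    c = np.cos(phases).sum()
    s = np.sin(phases).sum()
    r = np.sqrt(c**2 + s**2) / n          # mean resultant length
    z = n * r**2
    # standard large-n approximation to the Rayleigh p-value
    p = np.exp(-z) * (1 + (2 * z - z**2) / (4 * n))
    return r, min(max(p, 0.0), 1.0)

# quick sanity checks with simulated phases
rng = np.random.default_rng(0)
uniform = rng.uniform(0, 2 * np.pi, 1000)
clustered = rng.normal(np.pi, 0.5, 1000) % (2 * np.pi)
print(rayleigh_test(uniform))     # large p-value expected
print(rayleigh_test(clustered))   # tiny p-value expected
```

The Rayleigh test is only sensitive to clustering about a single preferred direction, so in practice one would want something more general as well, such as Kuiper’s test, but even this would be an improvement on `appear randomly distributed’.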
Then again, later on in the caption, there is a statement that `the phases show some clustering around the 60 Hz power line’. So, on the one hand the phases `appear random’, but on the other hand they’re not. There are other plausible clusters elsewhere too. What about them?
I’m afraid the absence of quantitative detail means I don’t find this a very edifying discussion!
The third observing run of Advanced LIGO – O3 – started on April 1, 2019, after 19 months spent upgrading the detectors. Last night, April 8, saw the first new detection of a candidate gravitational wave source, apparently another black hole binary, dubbed S190408an.
It is anticipated that sources like this will be discovered at a rate of roughly one per week for the (planned) year-long run. Given the likely rate of events, the policy of LIGO is now to make data publicly available directly, without writing papers first. You can find the data entry for this event here, including this map of its position.
Whether the LIGO Scientific Collaboration will release sufficient data for others to perform a full analysis of the signal remains to be seen, but if the predicted detection rate matches reality, the field is going to move very rapidly from studies of individual events to statistical analysis of large populations. Such is the way of science!
Following yesterday’s post here is a nice video visualization of all the black hole binary mergers so far claimed to have been detected by Advanced LIGO. They’re computer simulations, of course, not actual black holes (which you wouldn’t be able to see). I always thought an Orrery was a clockwork device, rather than a digital computer, but there you go. Poetic license!
I think it’s very sneaky of the LIGO Scientific Collaboration and the Virgo Collaboration to have released two new gravitational wave papers while I was out of circulation for a couple of days, so I’m a bit late on this, but here are links to the new results on the arXiv.
You can click on all the excerpts below to make them bigger.
Here is a summary of the properties of the binary systems involved in the events listed in the above paper:
There are several (four) events in this catalogue that have not previously been announced (or, for that matter, subjected to peer review) despite having been seen in the data some time ago (as far back as 2015). I’m also intrigued by the footnote on the first page which contains the following:
…all candidate events with an estimated false alarm rate (FAR) less than 1 per 30 days
and probability > 0.5 of being of astrophysical origin (see Eq. (10) for the definition) are henceforth denoted with the GW prefix.
The use of false discovery rates is discussed at length here as a corrective to relying on p-values for detections. The criteria adopted here don’t seem all that strong to me.
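To put that threshold in perspective, here is a back-of-the-envelope sketch (my own arithmetic, with an assumed observing time purely for illustration) of what a false alarm rate of one per 30 days implies:

```python
# Rough arithmetic: a false-alarm-rate threshold of 1 per 30 days means that,
# over an assumed ~270 days of analysable coincident data (a made-up figure,
# just for scale), one would expect of order nine noise events crossing the
# threshold if the FAR cut were the only criterion; hence the additional
# requirement of probability > 0.5 of astrophysical origin.
far_per_day = 1 / 30.0            # threshold false alarm rate
observing_days = 270.0            # assumed, for illustration only
expected_false_alarms = far_per_day * observing_days
print(f"Expected false alarms above threshold: {expected_false_alarms:.1f}")
```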
I’ve been teaching and/or preparing lectures all day today, so I haven’t yet had time to read these papers in detail. I will try to read them over the next few days. In the meantime I would welcome comments through the box about these new results. I wonder if there’ll be any opinions from the direction of Copenhagen?
UPDATE: Here’s a montage of all 10 binary black hole mergers `detected’ so far…
I think it’s safe to say that if GW151226 had been the first to be announced, the news would have been greeted with considerable skepticism!
Following my recent post about the claims and counter-claims concerning the detection (or otherwise) of gravitational waves, I have a couple of updates.
First, a few days ago there appeared a paper on the arXiv by Nielsen et al with the abstract (which I’ve slightly edited for formatting reasons):
We use the Pearson cross-correlation statistic proposed by Liu & Jackson (2016), and employed by Creswell et al. (2017), to look for statistically significant correlations between the LIGO Hanford and Livingston detectors at the time of the binary black hole merger GW150914. We compute this statistic for the calibrated strain data released by LIGO, using both the residuals provided by LIGO and using our own subtraction of a maximum-likelihood waveform that is constructed to model binary black hole mergers in general relativity. To assign a significance to the values obtained, we calculate the cross-correlation of both simulated Gaussian noise and data from the LIGO detectors at times during which no detection of gravitational waves has been claimed. We find that after subtracting the maximum likelihood waveform there are no statistically significant correlations between the residuals of the two detectors at the time of GW150914.
The four authors of this paper are, I believe, either present or former members of the LIGO Collaboration.
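For those curious about what the statistic in question involves, here is a schematic sketch (using simulated white noise rather than the calibrated strain data, and certainly not the authors’ own code) of computing a maximum Pearson cross-correlation between two residual time series over a range of time lags, with a crude empirical significance estimate:

```python
import numpy as np

def max_abs_pearson(x, y, max_lag):
    """Maximum |Pearson correlation| between x and y over lags up to max_lag samples."""
    best = 0.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[lag:], y[:len(y) - lag]
        else:
            a, b = x[:len(x) + lag], y[-lag:]
        r = np.corrcoef(a, b)[0, 1]
        best = max(best, abs(r))
    return best

rng = np.random.default_rng(3)
fs = 4096                                   # assumed sample rate
n = int(0.2 * fs)                           # a 0.2 s analysis window
hanford_resid = rng.normal(size=n)          # stand-ins for the detector residuals
livingston_resid = rng.normal(size=n)

max_lag = int(0.010 * fs)                   # ~10 ms covers the inter-site light travel time
observed = max_abs_pearson(hanford_resid, livingston_resid, max_lag)

# crude significance estimate: compare against many 'off-source' noise realisations
background = [max_abs_pearson(rng.normal(size=n), rng.normal(size=n), max_lag)
              for _ in range(200)]
p_value = np.mean(np.array(background) >= observed)
print(f"max |r| = {observed:.3f}, empirical p-value ~ {p_value:.2f}")
```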
Meanwhile, the NBI group behind the Creswell et al. paper challenged by the above work has issued a statement, which you can read here. The group re-iterates points made in the New Scientist article discussed in my recent post. The Nielsen et al. paper is not explicitly mentioned in the NBI statement, but I’m given to understand that the Danish group does not agree with its conclusions.
I noticed this morning that this week’s New Scientist cover feature (by Michael Brooks) is entitled Exclusive: Grave doubts over LIGO’s discovery of gravitational waves. The article is behind a paywall, and I’ve so far been unable to locate a hard copy in Maynooth, so I haven’t read it yet, but it is about the so-called `Danish paper’ that pointed out various unexplained features in the LIGO data associated with the first detection of gravitational waves from a binary black hole merger.
I did know this piece was coming, however, as I spoke to the author on the phone some time ago to clarify some points I made in previous blog posts on this issue (e.g. this one and that one). I even ended up being quoted in the article:
Not everyone agrees the Danish choices were wrong. “I think their paper is a good one and it’s a shame that some of the LIGO team have been so churlish in response,” says Peter Coles, a cosmologist at Maynooth University in Ireland.
I stand by that comment, as I think certain members – though by no means all – of the LIGO team have been uncivil in their reaction to the Danish team, implying that they consider it somehow unreasonable that the LIGO results should be subject to independent scrutiny. I am not convinced that the unexplained features in the data released by LIGO really do cast doubt on the detection, but unexplained features there undoubtedly are. Surely it is the job of science to explain the unexplained?
An important aspect of the way science works is that when a given individual or group publishes a result, it should be possible for others to reproduce it (or not, as the case may be). In normal-sized laboratory physics it suffices to explain the experimental set-up in the published paper in sufficient detail for another individual or group to build an equivalent replica experiment if they want to check the results. In `Big Science’, e.g. with LIGO or the Large Hadron Collider, it is not practically possible for other groups to build their own copy, so the best that can be done is to release the data coming from the experiment. A basic problem with reproducibility obviously arises when this does not happen.
In astrophysics and cosmology, results in scientific papers are often based on very complicated analyses of large data sets. This is also the case for gravitational wave experiments. Fortunately, in astrophysics these days, researchers are generally pretty good at sharing their data, but there are a few exceptions in that field.
Even allowing open access to data doesn’t always solve the reproducibility problem. Often extensive numerical codes are needed to process the measurements and extract meaningful output. Without access to these pipeline codes it is impossible for a third party to check the path from input to output without writing their own version, assuming that there is sufficient information to do that in the first place. That researchers should publish their software as well as their results is quite a controversial suggestion, but I think it’s the best practice for science. In any case there are often intermediate stages between `raw’ data and scientific results, as well as ancillary data products of various kinds. I think these should all be made public. Doing that could well entail a great deal of effort, but I think in the long run that it is worth it.
I’m not saying that scientific collaborations should not have a proprietary period, just that this period should end when a result is announced, and that any such announcement should be accompanied by a release of the data products and software needed to subject the analysis to independent verification.
Given that the detection of gravitational waves is one of the most important breakthroughs ever made in physics, I think this is a matter of considerable regret. I also find it difficult to understand the reasoning that led the LIGO consortium to think it was a good plan only to go part of the way towards open science, by releasing only part of the information needed to reproduce the processing of the LIGO signals and their subsequent statistical analysis. There may be good reasons that I know nothing about, but at the moment it seems to me to represent a wasted opportunity.
CLARIFICATION: The LIGO Consortium released data from the first observing run (O1) – you can find it here – early in 2018, but this data set was not available publicly at the time of publication of the first detection, nor when the team from Denmark did their analysis.
I know I’m an extremist when it comes to open science, and there are probably many who disagree with me, so here’s a poll I’ve been running for a year or so on this issue:
Any other comments welcome through the box below!
UPDATE: There is a (brief) response from LIGO (& VIRGO) here.
Interesting post from a gravitational wave researcher, telling the inside story of the latest gravitational wave detection (a binary black hole merger) announced last week.
Detected in June, GW170608 has had a difficult time. It was challenging to analyse, and neglected in favour of its louder and shinier siblings. However, we can now introduce you to our smallest chirp-mass binary black hole system!
The growing family of black holes. From Dawn Finney.
Our family of binary black holes is now growing large. During our first observing run (O1) we found three: GW150914, LVT151012 and GW151226. The advanced detector observing run (O2) ran from 30 November 2016 to 25 August 2017 (with a couple of short breaks). From our O1 detections, we were expecting roughly one binary black hole per month. The first came in January, GW170104, and we have announced the first detection which involved Virgo from August, GW170814, so you might be wondering what happened in-between? Pretty much everything was dropped following the detection of our first…
…black hole mergers detected via gravitational waves, that is. Here are the key measurements for Number 5, codename GW170608. More information can be found here.
On June 8, 2017 at 02:01:16.49 UTC, a gravitational-wave signal from the merger of two stellar-mass black holes was observed by the two Advanced LIGO detectors with a network signal-to-noise ratio of 13. This system is the lightest black hole binary so far observed, with component masses of 12 (+7/−2) M⊙ and 7 (+2/−2) M⊙ (90% credible intervals). These lie in the range of measured black hole masses in low-mass X-ray binaries, thus allowing us to compare black holes detected through gravitational waves with electromagnetic observations. The source’s luminosity distance is 340 ± 140 Mpc, corresponding to redshift 0.07 ± 0.03. We verify that the signal waveform is consistent with the predictions of general relativity.
This merger seems to have been accompanied by a lower flux of press releases than previous examples…
I will mention a couple of things, however. One is that the signal-to-noise ratio of this detection is a whopping 32.4, a value that astronomers can usually only dream of! The other is that neutron star coalescences offer the possibility of bypassing the traditional `distance ladder’ approaches to get an independent measurement of the Hubble constant. The value obtained is in the range 62 to 107 km s⁻¹ Mpc⁻¹, which is admittedly fairly broad, but it is based on only one observation of this type. It is extremely impressive to be straddling the target with the very first salvo.
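The basic idea is appealingly simple: the gravitational-wave signal yields the luminosity distance directly, the electromagnetic counterpart identifies the host galaxy and hence its recession velocity, and for a nearby source the ratio of the two gives the Hubble constant. A toy calculation with round numbers (illustrative values only, not those of the actual analysis, which handles peculiar velocities and the full distance posterior properly):

```python
# Toy standard-siren estimate: H0 ~ v_H / d_L for a nearby source.
# The numbers below are round illustrative values, not the published measurement.
v_hubble = 3000.0    # assumed Hubble-flow velocity of the host galaxy, km/s
d_lum = 43.0         # assumed luminosity distance from the GW signal, Mpc
h0 = v_hubble / d_lum
print(f"H0 ~ {h0:.0f} km/s/Mpc")   # ~70, comfortably inside the 62-107 range quoted above
```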
The LIGO collaboration comprises over a thousand people. Add to that the staff of no fewer than seventy observatories (including seven in space). With all that’s going on in the world, it’s great to see what humans of different nations across the globe can do when they come together and work towards a common goal. Scientific results of this kind will be remembered long after the silly ramblings of our politicians and other fools have been forgotten.
I took part in a panel discussion after the results were presented, but sadly I won’t be here to see tomorrow’s papers. I hope people will save cuttings or post weblinks if there are any articles!
UPDATE: Here is a selection of the local press coverage.
As if these thrilling science results weren’t enough, I finally managed to meet my old friend and former collaborator Varun Sahni (who was away last week). An invitation to dinner at his house was not to be resisted on my last night here, which explains why I didn’t write a post immediately after the press conference. Still, I’ve got plenty of papers to read on the plane tomorrow, so maybe I’ll do something when I get back.
Tomorrow morning I get up early to return to Mumbai for the flight home, and am not likely to be online again until Wednesday UK time.
Thanks to all at IUCAA (and TIFR) for making my stay so pleasant and interesting. It’s been 23 years since I was last here. I hope it’s not so long before I’m back again!