Archive for LIGO Scientific Collaboration

Phase Correlations and the LIGO Data Analysis Paper

Posted in Bad Statistics, The Universe and Stuff on September 1, 2019 by telescoper

I have to admit I haven’t really kept up with developments in the world of gravitational waves this summer, though a number of candidate events have been reported in the third observing run (O3) of Advanced LIGO, which began in April 2019; I refer you to that list if you’re interested.

I did notice, however, that late last week a new paper from the LIGO Scientific Collaboration and Virgo Collaboration appeared on the arXiv. This is entitled A guide to LIGO-Virgo detector noise and extraction of transient gravitational-wave signals and has the following abstract:

The LIGO Scientific Collaboration and the Virgo Collaboration have cataloged eleven confidently detected gravitational-wave events during the first two observing runs of the advanced detector era. All eleven events were consistent with being from well-modeled mergers between compact stellar-mass objects: black holes or neutron stars. The data around the time of each of these events have been made publicly available through the Gravitational-Wave Open Science Center. The entirety of the gravitational-wave strain data from the first and second observing runs have also now been made publicly available. There is considerable interest among the broad scientific community in understanding the data and methods used in the analyses. In this paper, we provide an overview of the detector noise properties and the data analysis techniques used to detect gravitational-wave signals and infer the source properties. We describe some of the checks that are performed to validate the analyses and results from the observations of gravitational-wave events. We also address concerns that have been raised about various properties of LIGO-Virgo detector noise and the correctness of our analyses as applied to the resulting data.

It’s an interesting paper that gives quite a lot of detail, especially about signal extraction and parameter-fitting, so it’s well worth reading.

Two particular things caught my eye about this. One is that there’s no list of authors anywhere in the paper, which seems a little strange. This policy may not be new, of course. I did say I haven’t really been keeping up.

The other point I’ll mention relates to this Figure, the caption of which refers to paper [41], the famous `Danish paper‘:

The Fourier phase is plotted vertically (between 0 and 2π) and the frequency horizontally. A random-phase distribution would have the phases uniformly distributed at each frequency. I think we can agree, without further statistical analysis, that the blue points don’t have that property! Of course nobody denies that the strongly correlated phases in the un-windowed data are at least partly an artifact of applying a Fourier transform to a non-stationary time series.

I suppose the demonstration that apodizing the data with a window function removes the phase correlations is meant to represent some form of rebuttal of the claims made in the Danish paper. If so, it’s not very convincing.

For a start, the caption just says that after windowing the resulting `phases appear randomly distributed‘. Could they not provide a more meaningful statistical statement than a simple eyeball impression? The text says little more:

In addition to causing spectral leakage, improper windowing of the data can result in spurious phase correlations in the Fourier transform. Figure 4 shows a scatter plot of the Fourier phase as a function of frequency … both with and without the application of a window function. The un-windowed data shows a strong phase correlation, while the windowed data does not.

(I added the link to the explanation of `spectral leakage’.)
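The effect the paper describes is in fact easy to reproduce with a toy example. The sketch below is my own construction, not anything from the LIGO pipeline: it builds a stretch of white noise with a slow linear drift, so that the ends of the segment don’t match up and the FFT’s implicit periodic extension has a jump discontinuity. The un-windowed phases then cluster strongly at low frequencies, while applying a Hann taper removes the effect. The `mean resultant length’ |⟨exp(iφ)⟩| is near 1 for tightly clustered phases and near 0 for uniform ones.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096

# White noise plus a slow linear drift: the segment's endpoints no
# longer match, so the FFT's implicit periodic extension has a jump.
data = rng.standard_normal(n) + np.linspace(0.0, 50.0, n)

window = np.hanning(n)  # tapers both ends smoothly to zero

phases_raw = np.angle(np.fft.rfft(data))
phases_win = np.angle(np.fft.rfft(data * window))

def resultant(phases):
    """Mean resultant length: near 1 if phases cluster, near 0 if uniform."""
    return np.abs(np.mean(np.exp(1j * phases)))

# Look at the low-frequency bins, where the drift dominates the noise.
print(resultant(phases_raw[1:300]))  # large: strong phase clustering
print(resultant(phases_win[1:300]))  # small: consistent with random phases
```

The point, of course, is that this only shows the correlations *can* be an artifact of poor windowing; it is not itself a statistical test of the windowed phases.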

As I have mentioned before on this blog, the human eye is very poor at distinguishing pattern from randomness. There are some subtleties involved in testing for correlated phases (e.g. because they are periodic) but there are various techniques available: I’ve worked on this myself (see, e.g., here and here). The phases shown may well be consistent with a uniform random distribution, but I’m surprised the LIGO authors didn’t present a proper statistical analysis of the windowed phases to prove beyond doubt the point they seem to be trying to make.
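One standard tool for exactly this job is the Rayleigh test for uniformity of angles. Here is a minimal sketch of my own (not anything from the LIGO paper); the p-value uses the simple first-order approximation exp(−n R̄²), which is adequate for moderate sample sizes:

```python
import numpy as np

def rayleigh_test(phases):
    """Rayleigh test for uniformity of angles on [0, 2*pi).

    Returns the mean resultant length Rbar and an approximate p-value
    for the null hypothesis that the phases are uniformly distributed.
    """
    n = len(phases)
    Rbar = np.abs(np.mean(np.exp(1j * np.asarray(phases))))
    p = np.exp(-n * Rbar**2)  # first-order approximation to the p-value
    return Rbar, p

rng = np.random.default_rng(42)
uniform = rng.uniform(0.0, 2.0 * np.pi, 1000)
clustered = rng.normal(np.pi, 0.3, 1000) % (2.0 * np.pi)

Rbar_u, p_u = rayleigh_test(uniform)    # large p: consistent with uniform
Rbar_c, p_c = rayleigh_test(clustered)  # tiny p: clustering detected
```

Something of this sort, applied per frequency band to the windowed phases, would have turned the eyeball impression into a quantitative statement.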

Then again, later on in the caption, there is a statement that `the phases show some clustering around the 60 Hz power line’. So, on the one hand the phases `appear random’, but on the other hand they’re not. There are other plausible clusters elsewhere too. What about them?

I’m afraid the absence of quantitative detail means I don’t find this a very edifying discussion!

 

The New Wave of Gravitational Waves

Posted in The Universe and Stuff on December 4, 2018 by telescoper

I think it’s very sneaky of the LIGO Scientific Collaboration and the Virgo Collaboration to have released two new gravitational wave papers while I was out of circulation for a couple of days, so I’m a bit late on this, but here are links to the new results on the arXiv.


First there is GWTC-1: A Gravitational-Wave Transient Catalog of Compact Binary Mergers Observed by LIGO and Virgo during the First and Second Observing Runs with this abstract:

Here is a summary of the properties of the binary systems involved in the events listed in the above paper:

There are four events in this catalogue that have not previously been announced (or, for that matter, subjected to peer review) despite having been seen in the data some time ago (as far back as 2015). I’m also intrigued by the footnote on the first page, which contains the following:

…all candidate events with an estimated false alarm rate (FAR) less than 1 per 30 days and probability > 0.5 of being of astrophysical origin (see Eq. (10) for the definition) are henceforth denoted with the GW prefix.

The use of false discovery rates is discussed at length here as a corrective to relying on p-values for detections. The criteria adopted here don’t seem all that strong to me.
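To put that FAR threshold in context, here is a back-of-envelope calculation of my own (not from the paper): if background triggers of a given significance occur as a Poisson process with rate equal to the FAR, the probability of at least one such noise event over an observing span T is 1 − exp(−FAR·T).

```python
import math

def false_alarm_probability(far_per_day, span_days):
    """Probability of at least one background (noise) event at this
    significance over span_days, assuming background triggers follow
    a Poisson process with rate far_per_day."""
    return 1.0 - math.exp(-far_per_day * span_days)

# A trigger sitting right at the threshold of 1 per 30 days, over an
# illustrative one-year observing span:
p = false_alarm_probability(1.0 / 30.0, 365.0)
print(p)  # essentially 1: such a trigger is expected from noise alone
```

Of course the catalogue also requires probability > 0.5 of astrophysical origin, so the FAR cut is not the whole criterion, but on its own it is indeed not a very demanding one.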

The second paper is Binary Black Hole Population Properties Inferred from the First and Second Observing Runs of Advanced LIGO and Advanced Virgo which has this abstract:

I’ve been teaching and/or preparing lectures all day today, so I haven’t yet had time to read these papers in detail. I will try to read them over the next few days. In the meantime I would welcome comments through the box about these new results. I wonder if there’ll be any opinions from the direction of Copenhagen?

UPDATE: Here’s a montage of all 10 binary black hole mergers `detected’ so far…

I think it’s safe to say that if GW151266 had been the first to be announced, the news would have been greeted with considerable skepticism!

Gravitational Wave Controversy Updates

Posted in The Universe and Stuff on November 14, 2018 by telescoper

Following my recent post about the claims and counter-claims concerning the detection (or otherwise) of gravitational waves, I have a couple of updates.

First, a few days ago there appeared a paper on the arXiv by Nielsen et al. with the following abstract (which I’ve slightly edited for formatting reasons):

We use the Pearson cross-correlation statistic proposed by Liu & Jackson (2016), and employed by Creswell et al. (2017), to look for statistically significant correlations between the LIGO Hanford and Livingston detectors at the time of the binary black hole merger GW150914. We compute this statistic for the calibrated strain data released by LIGO, using both the residuals provided by LIGO and using our own subtraction of a maximum-likelihood waveform that is constructed to model binary black hole mergers in general relativity. To assign a significance to the values obtained, we calculate the cross-correlation of both simulated Gaussian noise and data from the LIGO detectors at times during which no detection of gravitational waves has been claimed. We find that after subtracting the maximum likelihood waveform there are no statistically significant correlations between the residuals of the two detectors at the time of GW150914.

The four authors of this paper are, I believe, either present or former members of the LIGO Collaboration.
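For readers who want a feel for the statistic at issue, here is a minimal sketch of a maximum-over-lags Pearson cross-correlation, in the spirit of the Liu & Jackson statistic. This is my own simplified construction: the published analyses involve band-passing, whitening and careful windowing choices that are not reproduced here.

```python
import numpy as np

def max_abs_pearson(x, y, max_lag):
    """Maximum absolute Pearson correlation between two equal-length
    series over relative time shifts of up to +/- max_lag samples."""
    best = 0.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[lag:], y[:len(y) - lag]
        else:
            a, b = x[:len(x) + lag], y[-lag:]
        r = np.corrcoef(a, b)[0, 1]
        best = max(best, abs(r))
    return best

rng = np.random.default_rng(1)
n = 2000
shared = rng.standard_normal(n)  # a common component seen by both "detectors"

# Residuals with a common component versus pure independent noise:
x = shared + 0.5 * rng.standard_normal(n)
y = shared + 0.5 * rng.standard_normal(n)
independent = rng.standard_normal(n)

print(max_abs_pearson(x, y, 10))            # large: common component present
print(max_abs_pearson(x, independent, 10))  # small: consistent with noise
```

Note that the interesting part of Nielsen et al. is how a significance is assigned: by computing the same statistic on simulated Gaussian noise and on off-source detector data, rather than comparing to an analytic null distribution.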

Meanwhile, the NBI group behind the Creswell et al. paper challenged by the above work has issued a statement, which you can read here. The group re-iterates points made in the New Scientist article discussed in my recent post. Although the Nielsen et al. paper is not explicitly mentioned in the NBI statement, I’m given to understand that the Danish group does not agree with its conclusions.

The story continues.

LIGO and Open Science

Posted in Open Access, Science Politics, The Universe and Stuff on August 8, 2017 by telescoper

I’ve just come from another meeting here at the Niels Bohr Institute between some members of the LIGO Scientific Collaboration and the authors of the `Danish Paper‘. As with the other one I attended last week it was both interesting and informative. I’m not going to divulge any of the details of the discussion, but I anticipate further developments that will put some of them into the public domain fairly soon and will comment on them as and when that happens.

I think an important aspect of the way science works is that when a given individual or group publishes a result, it should be possible for others to reproduce it (or not as the case may be). In normal-sized laboratory physics it suffices to explain the experimental set-up in the published paper in sufficient detail for another individual or group to build an equivalent replica experiment if they want to check the results. In `Big Science’, e.g. with LIGO or the Large Hadron Collider, it is not practically possible for other groups to build their own copy, so the best that can be done is to release the data coming from the experiment. A basic problem with reproducibility obviously arises when this does not happen.

In astrophysics and cosmology, results in scientific papers are often based on very complicated analyses of large data sets. This is also the case for gravitational wave experiments. Fortunately in astrophysics these days researchers are generally pretty good at sharing their data, but there are a few exceptions in that field. Particle physicists, by contrast, generally treat all their data as proprietary.

Even allowing open access to data doesn’t always solve the reproducibility problem. Often extensive numerical codes are needed to process the measurements and extract meaningful output. Without access to these pipeline codes it is impossible for a third party to check the path from input to output without writing their own version, assuming that there is sufficient information to do that in the first place. That researchers should publish their software as well as their results is quite a controversial suggestion, but I think it’s the best practice for science. In any case there are often intermediate stages between `raw’ data and scientific results, as well as ancillary data products of various kinds. I think these should all be made public. Doing that could well entail a great deal of effort, but I think in the long run that it is worth it.

I’m not saying that scientific collaborations should not have a proprietary period, just that this period should end when a result is announced, and that any such announcement should be accompanied by a release of the data products and software needed to subject the analysis to independent verification.

Now, if you are interested in trying to reproduce the analysis of data from the first detection of gravitational waves by LIGO, you can go here, where you can not only download the data but also find a helpful tutorial on how to analyse it.

This seems at first sight to be fully in the spirit of open science, but if you visit that page you will find this disclaimer:

 

In other words, one can’t check the LIGO data analysis because not all the data and tools necessary to do so are publicly available. I know for a fact that this is the case because of the meetings going on here at NBI!

Given that the detection of gravitational waves is one of the most important breakthroughs ever made in physics, I think this is a matter of considerable regret. I also find it difficult to understand the reasoning that led the LIGO consortium to think it was a good plan to go only part of the way towards open science, by releasing only part of the information needed to reproduce the processing of the LIGO signals and their subsequent statistical analysis. There may be good reasons that I know nothing about, but at the moment it seems to me to represent a wasted opportunity.

I know I’m an extremist when it comes to open science, and there are probably many who disagree with me, so I thought I’d do a mini-poll on this issue:

Any other comments welcome through the box below!