Archive for Particle Physics

The 3.5 keV “Line” that (probably) wasn’t…

Posted in Bad Statistics, The Universe and Stuff on July 26, 2016 by telescoper

About a year ago I wrote a blog post about a mysterious “line” in the X-ray spectra of galaxy clusters corresponding to an energy of around 3.5 keV. The primary reference for the claim is a paper by Bulbul et al which is, of course, freely available on the arXiv.

The key graph from that paper is this:

[Figure: stacked XMM-Newton X-ray spectrum from Bulbul et al., with the claimed 3.5 keV feature marked in red]

The claimed feature – it stretches the imagination considerably to call it a “line” – is shown in red. No, I’m not particularly impressed either, but this is what passes for high-quality data in X-ray astronomy!

Anyway, there has just appeared on the arXiv a paper by the Hitomi Collaboration describing what is basically the only set of science results that the Hitomi satellite managed to obtain before it fell to bits earlier this year. These were observations of the Perseus Cluster.

Here is the abstract:

High-resolution X-ray spectroscopy with Hitomi was expected to resolve the origin of the faint unidentified E=3.5 keV emission line reported in several low-resolution studies of various massive systems, such as galaxies and clusters, including the Perseus cluster. We have analyzed the Hitomi first-light observation of the Perseus cluster. The emission line expected for Perseus based on the XMM-Newton signal from the large cluster sample under the dark matter decay scenario is too faint to be detectable in the Hitomi data. However, the previously reported 3.5 keV flux from Perseus was anomalously high compared to the sample-based prediction. We find no unidentified line at the reported flux level. The high flux derived with XMM MOS for the Perseus region covered by Hitomi is excluded at >3-sigma within the energy confidence interval of the most constraining previous study. If XMM measurement uncertainties for this region are included, the inconsistency with Hitomi is at a 99% significance for a broad dark-matter line and at 99.7% for a narrow line from the gas. We do find a hint of a broad excess near the energies of high-n transitions of S XVI (E=3.44 keV rest-frame) – a possible signature of charge exchange in the molecular nebula and one of the proposed explanations for the 3.5 keV line. While its energy is consistent with XMM pn detections, it is unlikely to explain the MOS signal. A confirmation of this interesting feature has to wait for a more sensitive observation with a future calorimeter experiment.

And here is the killer plot:

[Figure: Hitomi X-ray spectrum of the Perseus cluster around 3.5 keV]

The spectrum looks amazingly detailed, which makes the demise of Hitomi all the more tragic, but the 3.5 keV feature is conspicuous by its absence. So there you are: yet another supposedly significant feature that excited a huge amount of interest turns out to be nothing of the sort. To be fair, as the abstract states, the anomalous line was only seen by stacking spectra of different clusters and might still be there but too faint to be seen in an individual cluster spectrum. Nevertheless I’d say the probability of there being any feature at 3.5 keV has decreased significantly after this observation.
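For those who, like me, prefer to see such percentages translated back into sigmas, here is a minimal sketch (assuming the usual two-sided Gaussian convention, and using SciPy):

```python
from scipy.stats import norm

# 99% and 99.7% are the significance levels quoted in the Hitomi abstract
for conf in (0.99, 0.997):
    sigma = norm.isf((1 - conf) / 2)  # two-sided Gaussian equivalent
    print(f"{conf:.1%} inconsistency is roughly {sigma:.2f} sigma")
```

So the quoted 99.7% inconsistency for a narrow line is, as you would expect, just the familiar 3-sigma level in different clothing.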

P.S. Rumours suggest that the 750 GeV diphoton “excess” found at the Large Hadron Collider may be about to meet a similar fate.

The Dark Energy MacGuffin

Posted in Science Politics, The Universe and Stuff on December 19, 2015 by telescoper

Back from a two-day meeting in Edinburgh about the Euclid Mission, I have to spend a couple of days this weekend in the office before leaving for the holidays. I was a bit surprised at the end of the meeting to be asked if I would be on the panel for the closing discussion, answering questions raised by the audience. The first of these questions was – and I have to paraphrase because I don’t remember exactly – whether it would be disappointing if the Euclid mission merely confirmed that observations were consistent with a “simple” cosmological constant rather than any of the more exotic (and perhaps more exciting) alternatives that have been proposed by theorists. I think that’s the likely outcome of Euclid, actually, and I don’t think it would be disappointing if it turned out to be the case. Moreover, testing theories of dark energy is just one of the tasks this mission will undertake and it may well be the case that in years to come Euclid is remembered for something other than dark energy. Anyway, this all triggered a memory of an old post of mine about Alfred Hitchcock so, with apologies for repeating something I blogged about 4 years ago, here is a slight reworking of an old piece.

–0–

Unpick the plot of any thriller or suspense movie and the chances are that somewhere within it you will find lurking at least one MacGuffin. This might be a tangible thing, such as the eponymous sculpture of a Falcon in the archetypal noir classic The Maltese Falcon, or it may be rather nebulous, like the “top secret plans” in Hitchcock’s The Thirty-Nine Steps. Its true character may never be fully revealed, as in the case of the glowing contents of the briefcase in Pulp Fiction, which is a classic example of the “undisclosed object” type of MacGuffin, or it may be scarily obvious, like a doomsday machine or some other “Big Dumb Object” you might find in a science fiction thriller. It may not even be a real thing at all. It could be an event or an idea or even something that doesn’t exist in any real sense, such as the fictitious decoy character George Kaplan in North by Northwest. In fact North by Northwest is an example of a movie with more than one MacGuffin. Its convoluted plot involves espionage and the smuggling of what is only cursorily described as “government secrets”. These are the main MacGuffin; George Kaplan is a sort of sub-MacGuffin. But although this is behind the whole story, it is the emerging romance, accidental betrayal and frantic rescue involving the lead characters played by Cary Grant and Eva Marie Saint that really engage the audience as the film gathers pace. The MacGuffin is a trigger, but it soon fades into the background as other factors take over.

Whatever it is or is not, the MacGuffin is responsible for kick-starting the plot. It makes the characters embark upon the course of action they take as the tale begins to unfold. This plot device was particularly beloved by Alfred Hitchcock (who was responsible for introducing the word to the film industry). Hitchcock was, however, always at pains to ensure that the MacGuffin never played as important a role in the mind of the audience as it did for the protagonists. As the plot twists and turns – as it usually does in such films – and its own momentum carries the story forward, the importance of the MacGuffin tends to fade, and by the end we have usually forgotten all about it. Hitchcock’s movies rarely bother to explain their MacGuffin(s) in much detail and they often confuse the issue even further by mixing genuine MacGuffins with mere red herrings.

Here is the man himself explaining the concept at the beginning of this clip. (The rest of the interview is also enjoyable, covering such diverse topics as laxatives, ravens and nudity…)


There’s nothing particularly new about the idea of a MacGuffin. I suppose the ultimate example is the Holy Grail, in the tales of King Arthur and the Knights of the Round Table and, much more recently, the Da Vinci Code. The original Grail itself is basically a peg on which to hang a series of otherwise disconnected stories. It is barely mentioned once each individual story has started and, of course, is never found.

Physicists are fond of describing things as “The Holy Grail” of their subject, such as the Higgs Boson or gravitational waves. This always seemed to me to be an unfortunate description, as the Grail quest consumed a huge amount of resources in a predictably fruitless hunt for something whose significance could be seen to be dubious at the outset. The MacGuffin Effect nevertheless continues to reveal itself in science, although in different forms to those found in Hollywood.

The Large Hadron Collider (LHC), switched on to the accompaniment of great fanfares a few years ago, provides a nice example of how the MacGuffin actually works pretty much backwards in the world of Big Science. To the public, the LHC was built to detect the Higgs Boson, a hypothetical beastie introduced to account for the masses of other particles. If it exists, the high-energy collisions engineered by the LHC should reveal its presence. The Higgs Boson is thus the LHC’s own MacGuffin. Or at least it would be if it were really the reason why the LHC was built. In fact there are dozens of experiments at CERN and many of them have very different motivations from the quest for the Higgs, such as the search for evidence of supersymmetry.

Particle physicists are not daft, however, and they have realised that the public and, perhaps more importantly, government funding agencies need to have a really big hook to hang such a big bag of money on. Hence the emergence of the Higgs as a sort of master MacGuffin, concocted specifically for public consumption, which is much more effective politically than the plethora of mini-MacGuffins which, to be honest, would be a fairer description of the real state of affairs.

Even this MacGuffin has its problems, though. The Higgs mechanism is notoriously difficult to explain to the public, so some have resorted to a less specific but more misleading version: “The Big Bang”. As I’ve already griped, the LHC will never generate energies anything like the Big Bang did, so I don’t have any time for the language of the “Big Bang Machine”, even as a MacGuffin.

While particle physicists might pretend to be doing cosmology, we astrophysicists have to contend with MacGuffins of our own. One of the most important discoveries we have made about the Universe in the last decade is that its expansion seems to be accelerating. Since gravity usually tugs on things and makes them slow down, the only explanation that we’ve thought of for this perverse situation is that there is something out there in empty space that pushes rather than pulls. This has various possible names, but Dark Energy is probably the most popular, adding an appropriately noirish edge to this particular MacGuffin. It has even taken over in prominence from its much older relative, Dark Matter, although that one is still very much around.

We have very little idea what Dark Energy is, where it comes from, or how it relates to other forms of energy we are more familiar with, so observational astronomers have jumped in with various grandiose strategies to find out more about it. This has spawned a booming industry in surveys of the distant Universe (such as the Dark Energy Survey or the Euclid mission I mentioned in the preamble) all aimed ostensibly at unravelling the mystery of the Dark Energy. It seems that to get any funding at all for cosmology these days you have to sprinkle the phrase “Dark Energy” liberally throughout your grant applications.

The old-fashioned “observational” way of doing astronomy – by looking at things hard enough until something exciting appears (which it does with surprising regularity) – has been replaced by a more “experimental” approach, more like that of the LHC. We can no longer do deep surveys of galaxies to find out what’s out there. We have to do it “to constrain models of Dark Energy”. This is just one example of the not necessarily positive influence that particle physics has had on astronomy in recent times and it has been criticised very forcefully by Simon White.

Whatever the motivation for doing these projects now, they will undoubtedly lead to new discoveries. But my own view is that there will never be a solution of the Dark Energy problem until it is understood much better at a conceptual level, and that will probably mean major revisions of our theories of both gravity and matter. I venture to speculate that in twenty years or so people will look back on the obsession with Dark Energy with some amusement, as our theoretical language will have moved on sufficiently to make it seem irrelevant.

But that’s how it goes with MacGuffins. Even the Maltese Falcon turned out in the end to be a fake.

A Bump at the Large Hadron Collider

Posted in Bad Statistics, The Universe and Stuff on December 16, 2015 by telescoper

Very busy, so just a quickie today. Yesterday the good folk at the Large Hadron Collider announced their latest batch of results. You can find the complete set from the CMS experiment here and from ATLAS here.

The result that everyone is talking about is shown in the following graph, which shows the number of diphoton events as a function of the invariant mass of the photon pair:

[Figure: ATLAS diphoton invariant-mass spectrum, showing the apparent bump near 750 GeV]

Attention is focussing on the apparent “bump” at around 750 GeV; you can find an expert summary by a proper particle physicist here and another one here.

It is claimed that the “significance level” of this “detection” is 3.6σ. I won’t comment on that precise statement partly because it depends on the background signal being well understood but mainly because I don’t think this is the right language in which to express such a result in the first place. Experimental particle physicists do seem to be averse to doing proper Bayesian analyses of their data.

However, if you take the claim in the way such things are usually presented, it is roughly equivalent to a statement that the odds against this being a real detection are greater than 6000:1. If any particle physicists out there are willing to stake £6000 against £1 of mine that this result will be confirmed by future measurements then I’d happily take them up on that bet!
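For anyone who wants to check that arithmetic, here is a minimal sketch (assuming the usual one-sided Gaussian convention for turning a significance level into a tail probability):

```python
from scipy.stats import norm

sigma = 3.6
p = norm.sf(sigma)  # one-sided tail probability of a 3.6-sigma fluctuation
print(f"p = {p:.2e}, i.e. odds of about {1 / p:,.0f}:1")
```

This gives p of about 1.6 × 10^-4, or odds a little over 6000:1. Of course, reading that number as the odds against the bump being noise is exactly the misinterpretation of a p-value that a proper Bayesian analysis would avoid.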

P.S. Entirely predictably, there are 10 theory papers on today’s arXiv offering explanations of the alleged bump, none of which says that it’s a noise feature…


The Nobel Prize for Neutrino Oscillations

Posted in The Universe and Stuff on October 6, 2015 by telescoper

Well, the Nobel Prize for Physics in 2015 has been announced. It has been awarded jointly to Takaaki Kajita and Arthur B. McDonald for…

the discovery of neutrino oscillations, which prove that neutrinos have mass.

You can read the full citation here. Congratulations to them both. Some physicists around here were caught by surprise because the 2002 Nobel Prize was also awarded for neutrino physics, but the award is fair because it is for a direct measurement of neutrino oscillations, an important breakthrough in its own right; the earlier award was for measurements of solar neutrinos. For a nice description of the background you could do worse than the Grauniad blog post by Jon Butterworth about neutrino physics.

In brief, neutrino oscillation is a process in which neutrinos (which have three distinct flavour states, associated with the electron, mu and tau leptons) can change flavour as they propagate. It’s quite a weird thing to spring on students who previously thought that lepton number (which denotes the flavour) was always conserved. I remember years ago having to explain this phenomenon to third-year students taking my particle physics course. I decided to start with an analogy based on more familiar physics, but it didn’t go to plan.

A charged fermion such as an electron (or in fact anything that has a magnetic moment, which would include, e.g., the neutron) has spin and, according to standard quantum mechanics, the component of this in any direction can be described in terms of two basis states, say |\uparrow> and |\downarrow> for spin in the z direction. In general, however, the spin state will be a superposition of these, e.g.

\frac{1}{\sqrt{2}} \left( |\uparrow> + |\downarrow>\right)

In this example, as long as the particle is travelling through empty space, the probability of finding it with spin “up” is  50%, as is the probability of finding it in the spin “down” state. Once a measurement is made, the state collapses into a definite “up” or “down” wherein it remains until something else is done to it.

If, on the other hand, the particle is travelling through a region where there is a magnetic field the “spin-up” and “spin-down” states can acquire different energies owing to the interaction between the spin and the magnetic field. This is important because it means the bits of the wave function describing the up and down states evolve at different rates, and this has measurable consequences: measurements made at different positions yield different probabilities of finding the spin pointing in different directions. In effect, the spin vector of the particle performs a sort of oscillation, similar to the classical phenomenon called precession.
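As a toy illustration of that last point, here is a minimal numerical sketch (in units where ħ = 1, with a made-up energy splitting) showing how the winding relative phase makes the probability of finding the spin along the x direction oscillate:

```python
import numpy as np

# Equal superposition of |up> and |down>; an energy splitting dE makes the
# relative phase between the two components wind up linearly with time.
dE = 1.0                                  # arbitrary energy splitting (hbar = 1)
t = np.linspace(0.0, 4.0 * np.pi, 9)
amp_up = np.full(t.shape, 1.0 / np.sqrt(2.0), dtype=complex)
amp_dn = (1.0 / np.sqrt(2.0)) * np.exp(-1j * dE * t)

# Probability of finding the spin along +x, where |+x> = (|up> + |down>)/sqrt(2)
p_plus_x = np.abs((amp_up + amp_dn) / np.sqrt(2.0)) ** 2
print(np.round(p_plus_x, 3))   # cos^2(dE t / 2): swings between 1 and 0
```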

The mathematical description of neutrino oscillations is very similar to this, except that it’s not the spin part of the wavefunction being affected by an external field that breaks the symmetry between “up” and “down”. Instead the flavour part of the wavefunction is “precessing”, because the flavour states don’t coincide with the eigenstates of the Hamiltonian that describes the neutrinos’ evolution. However, it does require that different neutrino types have intrinsically different energies, in a way quite similar to the spin-precession example. In the context of neutrinos, however, a difference in energy means a difference in mass, and if there’s a difference in mass then not all flavours of neutrino can be massless.
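In the simplest two-flavour case all of this boils down to the standard textbook formula P = sin^2(2θ) sin^2(1.267 Δm^2 L/E). Here is a minimal sketch; the parameter values are merely illustrative, chosen to be roughly in the atmospheric-neutrino ballpark:

```python
import numpy as np

def oscillation_probability(L_km, E_GeV, sin2_2theta, dm2_eV2):
    """Two-flavour appearance probability P(nu_a -> nu_b).

    Standard formula with the mass-squared splitting in eV^2, the baseline
    L in km and the energy E in GeV; 1.267 collects the factors of hbar and c.
    """
    return sin2_2theta * np.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

# Illustrative values only: a 1 GeV atmospheric neutrino crossing the Earth
print(oscillation_probability(L_km=12700.0, E_GeV=1.0,
                              sin2_2theta=1.0, dm2_eV2=2.4e-3))
```

Note that the probability depends only on the difference of the squared masses, which is why oscillation experiments can prove that neutrinos have mass without measuring the masses themselves.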

Although the analogy I used isn’t perfect, I thought it was a good way of getting across the basic idea. Unfortunately, however, when I subsequently asked an examination question about neutrino oscillations I got a significant number of answers that said “neutrino oscillations happen when a neutrino travels through a magnetic field…”. Sigh. Neutrinos don’t interact with magnetic fields, you see…

Anyway, today’s announcement also prompts me to mention that neutrino physics is one of the main research interests in our Experimental Particle Physics group here at Sussex. You can read a recent post here about an important milestone in the development of the NOvA Experiment, which involves several members of the Department of Physics and Astronomy in the School of Mathematical and Physical Sciences here at the University of Sussex. Here’s the University of Sussex’s press release on the subject. In fact Art McDonald is a current collaborator of our neutrino physicists, who have been celebrating his award today!

Neutrino physics is a fascinating subject even to someone like me, who isn’t really a particle physicist. My impression of the field is that it was fairly moribund until about the turn of the millennium, when the first measurement of atmospheric neutrino oscillations was announced. All of a sudden there was evidence that neutrinos can’t all be massless (as many of us had long assumed, at least as far as lecturing was concerned). Now the humble neutrino is the subject of intense experimental activity, not only in the USA and UK but all around the world, in a way that would have been difficult to predict twenty years ago.

But then, as the physicist Niels Bohr famously observed, “Prediction is very difficult. Especially about the future.”

An Open Letter to the Times Higher World University Rankers

Posted in Education, The Universe and Stuff on October 5, 2015 by telescoper

Dear Rankers,

Having perused your latest set of league tables along with the published methodology, a couple of things puzzle me.

First, I note that you have made significant changes to your methodology for combining metrics this year. How, then, can you justify making statements such as

US continues to lose its grip as institutions in Europe up their game

when it appears that any changes could well be explained not by changes in performance, as gauged by the metrics you use, but by changes in the way they are combined?

I assume, as intelligent and responsible people, that you did the obvious test for this effect, i.e. constructed a parallel set of league tables, with this year’s input data but last year’s methodology, which would make it easy to isolate changes in methodology from changes in the performance indicators. Your failure to publish such a set, to illustrate how seriously your readers should take statements such as that quoted above, must then simply have been an oversight. Had you deliberately withheld evidence of the unreliability of your conclusions you would have left yourselves open to an accusation of gross dishonesty, which I am sure would be unfair.

Happily, however, there is a very easy way to allay the fears of the global university community that the world rankings are being manipulated: all you need to do is publish a set of league tables using the 2014 methodology and the 2015 data. Any difference between this table and the one you published would then simply be an artefact and the new ranking can be ignored. I’m sure you are as anxious as anyone else to prove that the changes this year are not simply artificially-induced “churn”, and I look forward to seeing the results of this straightforward calculation published in the Times Higher as soon as possible.
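To be concrete about how simple this test is, here is a minimal sketch with invented institutions, scores and weights (the real THE indicators and weightings are more numerous, but the principle is identical):

```python
import pandas as pd

# Invented metric scores for three institutions (columns mimic THE-style indicators)
df = pd.DataFrame(
    {"teaching": [80, 70, 90], "research": [85, 75, 65], "citations": [60, 95, 70]},
    index=["Uni A", "Uni B", "Uni C"],
)
w_old = pd.Series({"teaching": 0.3, "research": 0.3, "citations": 0.4})  # made up
w_new = pd.Series({"teaching": 0.4, "research": 0.4, "citations": 0.2})  # made up

ranks = pd.DataFrame({
    "old methodology": (df * w_old).sum(axis=1).rank(ascending=False).astype(int),
    "new methodology": (df * w_new).sum(axis=1).rank(ascending=False).astype(int),
})
print(ranks)  # same input data, different combination rule: the ranking churns
```

With identical input data this toy example produces two quite different orderings, which is precisely the kind of artefact the comparison I propose would expose.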

Second, I notice that one of the changes to your methodology is explained thus

This year we have removed the very small number of papers (649) with more than 1,000 authors from the citations indicator.

You are presumably aware that this primarily affects papers relating to experimental particle physics, which is mostly conducted through large international collaborations (chiefly, but not exclusively, based at CERN). This change at a stroke renders such fundamental scientific breakthroughs as the discovery of the Higgs Boson completely worthless. This is a strange thing to do, because this is exactly the type of research that inspires prospective students to study physics, as well as being a direct measure in itself of the global standing of a University.

My current institution, the University of Sussex, is heavily involved in experiments at CERN. For example, Dr Iacopo Vivarelli has just been appointed coordinator of all supersymmetry searches using the ATLAS experiment on the Large Hadron Collider. This involvement demonstrates the international standing of our excellent Experimental Particle Physics group, but if evidence of supersymmetry is found at the LHC your methodology will simply ignore it. A similar fate will also befall any experiment that requires large international collaborations: searches for dark matter, dark energy, and gravitational waves to name but three, all exciting and inspiring scientific adventures that you regard as unworthy of any recognition at all but which draw students in large numbers into participating departments.

Your decision to downgrade collaborative research to zero is not only strange but also extremely dangerous, for it tells university managers that participating in world-leading collaborative research will jeopardise their rankings. How can you justify such a deliberate and premeditated attack on collaborative science? Surely it is exactly the sort of thing you should be rewarding? Physics departments not participating in such research are the ones that should be downgraded!

Your answer might be that excluding “superpapers” only damages the rankings of smaller universities, because they might owe a larger fraction of their total citation count to collaborative work. Well, so what if this is true? It’s not a reason for excluding them. Perhaps small universities are better anyway, especially when they emphasize small group teaching and provide opportunities for students to engage in learning that’s led by cutting-edge research. Or perhaps you have decided otherwise and have changed your methodology to confirm your prejudice…

I look forward to seeing your answers to the above questions through the comments box or elsewhere – though you have ignored my several attempts to raise these questions via social media. I also look forward to seeing you correct your error of omission by demonstrating – by the means described above – what  changes in league table positions are by your design rather than any change in performance. If it turns out that the former is the case, as I think it will, at least your own journal provides you with a platform from which you can apologize to the global academic community for wasting their time.

Yours sincerely,

Telescoper

“Credit” needn’t mean “Authorship”

Posted in Science Politics, The Universe and Stuff on September 4, 2015 by telescoper

I’ve already posted about the absurdity of scientific papers with ridiculously long author lists but this issue has recently come alive again with the revelation that the compilers of the Times Higher World University Rankings decided to exclude such papers entirely from their analysis of citation statistics.

Large collaborations – involving not only scientists but also engineers, instrument builders, computer programmers and data analysts – are the norm in some fields of science, especially (but not exclusively) experimental particle physics, so the arbitrary decision to omit such works from bibliometric analysis is not only idiotic but also potentially damaging to a number of disciplines. The “logic” behind this decision is that papers with “freakish” author lists might distort analyses of citation impact, even allowing – heaven forbid – small institutions with a strong involvement in world-leading studies such as those associated with the Large Hadron Collider to do well compared with larger institutions that are not involved in such collaborations. If what you do doesn’t fit comfortably within a narrow and simplistic method of evaluating research, then it must be excluded, even if it is the best in the world. A sensible person would realise that if the method doesn’t give proper credit then you need a better method, but the bean counters at the Times Higher have decided to give no credit at all to research conducted in this way. The consequences of putting the bibliometric cart in front of the scientific horse could be disastrous, as institutions find their involvement in international collaborations dragging them down the league tables. I despair of the obsession with league tables, because these rankings involve trying to shoehorn a huge amount of complicated information into a single figure of merit. This is not only pointless, but could also drive behaviours that are destructive to entire disciplines.

That said, there is no denying that particle physics, cosmology and other disciplines that operate through large teams must share part of the blame. Those involved in these collaborations have achieved brilliant successes through the imagination and resourcefulness of the people involved. Where imagination has failed, however, is in the insistence that the only way to give credit to members of a consortium is by making them all authors of scientific papers. In the example I blogged about a few months ago this blinkered approach generated a paper with more than 5000 authors; of the 33 pages in the article, no fewer than 24 were taken up with the list of authors.

Papers just don’t have five thousand “authors”. I suspect that only about 1% of these “authors” have even read the paper. That doesn’t mean that the other 99% didn’t do immensely valuable work. It does mean that pretending they participated in writing the article that describes their work isn’t the right way to acknowledge their contribution. How are young scientists supposed to carve out a reputation if their name is always buried in immensely long author lists? The very system that attempts to give them credit renders that credit worthless. Instead of looking at publication lists, appointment panels have to rely on reference letters, and that means early career researchers have to depend on the power of patronage.

As science evolves it is extremely important that the methods for disseminating scientific results evolve too. The trouble is that they aren’t. We remain obsessed with archaic modes of publication, partly because of innate conservatism and partly because the lucrative publishing industry benefits from the status quo. The system is clearly broken, but the scientific community carries on regardless. When there are so many brilliant minds engaged in this sort of research, why are so few willing to challenge an orthodoxy that has long outlived its usefulness? Change is needed, not to make life simpler for the compilers of league tables, but for the sake of science itself.

I’m not sure what is to be done, but it’s an urgent problem which looks set to develop very rapidly into an emergency. One idea appears in a paper on the arXiv with the abstract:

Science and engineering research increasingly relies on activities that facilitate research but are not currently rewarded or recognized, such as: data sharing; developing common data resources, software and methodologies; and annotating data and publications. To promote and advance these activities, we must develop mechanisms for assigning credit, facilitate the appropriate attribution of research outcomes, devise incentives for activities that facilitate research, and allocate funds to maximize return on investment. In this article, we focus on addressing the issue of assigning credit for both direct and indirect contributions, specifically by using JSON-LD to implement a prototype transitive credit system.

I strongly recommend this piece. I don’t think it offers a complete solution, but certainly contains  many interesting ideas. For the situation to improve, however, we have to accept that there is a problem. As things stand, far too many senior scientists are in denial. This has to change.
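To give a flavour of the idea (the record below is my own illustrative sketch, not the authors’ actual schema: the context URL, names and weights are all invented), a transitive credit record attaches fractional weights to every contribution to a research product, expressed as JSON-LD so that credit can be chained through dependencies such as datasets and software:

```python
import json

# Hypothetical credit map for a paper; the fractional weights must sum to one.
# The entry pointing at an earlier dataset is what makes the credit
# "transitive": its weight flows on to that dataset's own contributors.
credit_record = {
    "@context": "https://example.org/transitive-credit",  # placeholder context
    "@id": "doi:10.9999/example-paper",
    "contributions": [
        {"agent": "A. Author",  "role": "writing",    "weight": 0.40},
        {"agent": "B. Builder", "role": "instrument", "weight": 0.30},
        {"agent": "C. Coder",   "role": "software",   "weight": 0.20},
        {"agent": "doi:10.9999/earlier-dataset", "role": "data", "weight": 0.10},
    ],
}

assert abs(sum(c["weight"] for c in credit_record["contributions"]) - 1.0) < 1e-9
print(json.dumps(credit_record, indent=2))
```

The attraction of such a scheme is that instrument builders and software authors would accumulate visible, quantified credit without having to be listed as “authors” of every paper their work enables.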


The Curious Case of the 3.5 keV “Line” in Cluster Spectra

Posted in Bad Statistics, The Universe and Stuff on July 22, 2015 by telescoper

Earlier this week I went to a seminar. That’s a rare enough event these days given all the other things I have to do. The talk in question was by Katie Mack, who was visiting the Astronomy Centre, and it contained a nice review of the general situation regarding the constraints on astrophysical dark matter from direct and indirect detection experiments. I’m not an expert on experiments – I’m banned from most laboratories on safety grounds – so it was nice to get a review from someone who knows what they’re talking about.

One of the pieces of evidence discussed in the talk was something I’ve never really looked at in detail myself, namely the claimed evidence of an  emission “line” in the spectrum of X-rays emitted by the hot gas in galaxy clusters. I put the word “line” in inverted commas for reasons which will soon become obvious. The primary reference for the claim is a paper by Bulbul et al which is, of course, freely available on the arXiv.

The key graph from that paper is this:

[Figure: stacked XMM-Newton X-ray spectrum from Bulbul et al., with the claimed 3.5 keV feature marked in red]

The claimed feature – it stretches the imagination considerably to call it a “line” – is shown in red. No, I’m not particularly impressed either, but this is what passes for high-quality data in X-ray astronomy!

There’s a nice review of this from about a year ago here which says this feature

 is very significant, at 4-5 astrophysical sigma.

I’m not sure how to convert astrophysical sigma into actual sigma, but then I don’t really like sigma anyway. A proper Bayesian model comparison is really needed here. If it is a real feature then a plausible explanation is that it is produced by the decay of some sort of dark matter particle in a manner that involves the radiation of an energetic photon. An example is the decay of a massive sterile neutrino – a hypothetical particle that does not participate in weak interactions – into a lighter standard-model neutrino and a photon, as discussed here. In this scenario the parent particle would have a mass of about 7 keV, so that the resulting photon carries away an energy of half that. Such a particle would constitute warm dark matter.
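The factor of a half is just two-body decay kinematics: in the rest frame of a parent of mass m decaying into a photon and a daughter of mass m_d, the photon carries away E = (m^2 - m_d^2)/2m, which tends to m/2 as the daughter mass goes to zero. A trivial sketch:

```python
def photon_energy_keV(m_parent_keV, m_daughter_keV=0.0):
    """Photon energy in the two-body decay parent -> daughter + photon,
    evaluated in the parent's rest frame (all masses and energies in keV)."""
    return (m_parent_keV**2 - m_daughter_keV**2) / (2.0 * m_parent_keV)

print(photon_energy_keV(7.0))  # ~3.5 keV for a putative 7 keV sterile neutrino
```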

On the other hand, that all depends on you being convinced that there is anything there at all other than a combination of noise and systematics. I urge you to read the paper and decide. Then perhaps you can try to persuade me, because I’m not at all sure. The X-ray spectrum of hot gas has a number of known emission features that need to be subtracted before any anomalous emission can be isolated. I will remark, however, that there is a known recombination line of Argon at 3.6 keV, and you have to be convinced that this has been subtracted correctly if the red bump is to be interpreted as something extra. Also note that all the spectra showing this feature were obtained using the same instrument – on the XMM-Newton spacecraft – which makes it harder to eliminate the possibility that it is an instrumental artefact.

I’d be interested in comments from X-ray folk about how confident we should be that the 3.5 keV “anomaly” is real…