Neophlogistonianism

What happens when something burns?

Ask a seventeenth-century scientist that question and the chances are the answer would have involved the word phlogiston, a name derived from the Greek φλογιστόν, meaning “burning up”. This “fiery principle” or “element” was supposed to be present in all combustible materials, and the idea was that it was released into the air whenever any such stuff was ignited. The act of burning separated the phlogiston from the dephlogisticated “true” form of the material, also known as calx.

The phlogiston theory held sway until the late 18th Century, when Antoine Lavoisier demonstrated that combustion results in an increase in the weight of the material being burned. This poses a serious problem if burning also involves the loss of phlogiston, unless phlogiston has negative weight. However, many serious scientists of the 18th Century, such as Georg Ernst Stahl, had already suggested that phlogiston might have negative weight or, as he put it, “levity”. Nowadays we would probably say “anti-gravity”.

Eventually, Joseph Priestley discovered what actually combines with materials during combustion: oxygen. Instead of becoming dephlogisticated, things become oxidised by fixing oxygen from the air, which is why their weight increases. It’s worth mentioning, though, that the name Priestley used for oxygen was in fact “dephlogisticated air” (because it was capable of combining more extensively with phlogiston than ordinary air). He remained a phlogistonian long after making the discovery that should have killed the theory.

So why am I rambling on about a scientific theory that has been defunct for more than two centuries?

Well, it’s because there just might be a lesson from history about the state of modern cosmology…

The standard cosmological model involves the hypothesis that about 75% of the energy budget of the Universe is in the form of “dark energy”. We don’t know much about what this is, except that in order to make our current understanding work out it has to act like a source of anti-gravity. It does this by violating the strong energy condition of general relativity.

Dark energy is needed to reconcile three basic measurements: (i) the brightness of distant supernovae, which seems to indicate that the expansion of the Universe is accelerating (which is where the anti-gravity comes in); (ii) the cosmic microwave background, which suggests that the Universe has flat spatial sections; and (iii) direct estimates of the mass associated with galaxy clusters, which account for about 25% of the mass needed to close the Universe.

A universe without dark energy appears not to be able to account for these three observations simultaneously within our current understanding of gravity as obtained from Einstein’s theory of general relativity.
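
To spell out where the anti-gravity comes in, here is the standard acceleration equation of a Friedmann model (nothing beyond the textbook treatment is assumed):

\[ \frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right) + \frac{\Lambda c^2}{3}. \]

With \( \Lambda = 0 \), accelerated expansion (\( \ddot{a} > 0 \)) requires \( \rho c^2 + 3p < 0 \), i.e. an equation of state \( w \equiv p/\rho c^2 < -1/3 \), which is precisely a violation of the strong energy condition mentioned above. The cosmological constant, with \( w = -1 \), is the simplest thing that does the job.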

I’ve blogged before, with some levity of my own, about how uncomfortable this dark energy makes me feel. It makes me even more uncomfortable that such an enormous industry has grown up around it and that its existence is accepted unquestioningly by so many modern cosmologists.

Isn’t there a chance that, with the benefit of hindsight, future generations will look back on dark energy in the same way that we now see the phlogiston theory?

Or maybe the dark energy really is phlogiston. That’s got to be worth a paper! At least I prefer the name to quintessence.


25 Responses to “Neophlogistonianism”

  1. Anton Garrett Says:

    And I thought it was called aether…
    Anton

  2. telescoper Says:

    You say aether and I say either…

  3. Anton Garrett Says:

    Let’s call the whole thing string theory!

  4. Brilliant! I had a transparency (remember talks given on transparencies?) with phlogiston, aether, and epicycles in a sketch of a scientific hypothesis graveyard. That was made in mockery of dark matter, but it translates well enough to dark energy.

  5. I agree that “quintessence” is a bad name.

    There is no evidence that the “dark energy” is anything other than the traditional cosmological constant. We should look for deviations (i.e. determine the equation of state), but as long as there is no evidence to the contrary, we should call it the cosmological constant. If another name is needed, Sean Carroll’s suggestion “smooth tension” is much better. As he points out, essentially everything has energy and lots of things are dark. “Smooth tension” describes the two main properties of the cosmological constant: it is smooth (as far as we know; we should look for deviations but there is no evidence that it is not smooth) and it has negative pressure (tension).
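
    To make “look for deviations” concrete, the usual move (this particular parameterisation is just an illustration on my part, not something the argument requires) is to write the dark-energy equation of state as

    \[ w(a) = w_0 + w_a (1 - a), \]

    where the cosmological constant corresponds to \( w_0 = -1 \), \( w_a = 0 \); any significant departure from those values would mean that the “smooth tension” is something more exotic than \( \Lambda \).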

    Yes, the cosmological constant was a surprise for many people but, like Avogadro’s number a hundred or so years ago, what convinced people was that several independent lines of evidence all pointed in the same direction. This doesn’t mean the idea is true, or the last word, but, as always in science, the burden of proof is on the doubters: they have to come up with a better hypothesis to explain the observations, or show the observations to be wrong.

    In the world of particle physics, the default expectation is that something will happen (i.e. a particle will be produced in a collision if one observes enough collisions) unless it is forbidden. If it is forbidden, then this corresponds to a new quantum number, symmetry, conserved quantity etc, and of course the burden of proof is on the one who postulates something new. Similarly, one could assume that the cosmological constant is non-zero unless there is a reason why it shouldn’t be, not vice-versa.

    No-one should be confused by the fact that Einstein first introduced the cosmological constant for the “wrong” reason and then declared it to be the biggest blunder of his life (an oft-cited quote which, as far as I know, has Gamow’s memory of it as its only source). Einstein could just as easily have postulated the cosmological constant from the beginning, with a value to be determined by observations. Had he had different training as a physicist and mathematician, he might very well have. The point is that the universe is independent of the history of our attempts to discover its true nature.

  6. Anton Garrett Says:

    It is neater to suppose that a parameter, appearing in the equations expressing our theory, is zero than some value undetermined by the theory (so that it must be estimated from observational data). Call the hypothesis that the parameter is zero ‘A’, and the hypothesis that it is nonzero ‘B’. Relative to A, hypothesis B will always fit the data better, but it pays a penalty in placing some of its prior probability for the parameter where the data subsequently indicate that the parameter is very unlikely to be. In other words, there is a trade-off between simplicity of theory and closeness of fit. This realisation of Ockham’s Razor principle has been made precise in Bayesian probability analysis. It would be good to see it applied here, so as to tell whether we really should prefer a nonzero cosmological constant.
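
    For concreteness, here is a toy numerical version of that trade-off, written in Python; the data, noise level and prior width are all invented purely for illustration:

```python
import numpy as np

# Hypothesis A: parameter fixed at zero. Hypothesis B: parameter free, with a
# broad Gaussian prior. B always fits at least as well, but its evidence is
# diluted by the prior volume -- the Ockham penalty described above.
rng = np.random.default_rng(1)
sigma = 1.0                        # assumed (known) noise level
data = rng.normal(0.3, sigma, 20)  # invented measurements of the parameter

def log_likelihood(mu):
    return np.sum(-0.5 * ((data - mu) / sigma) ** 2
                  - 0.5 * np.log(2 * np.pi * sigma ** 2))

# Evidence for A is just the likelihood at the fixed value.
log_Z_A = log_likelihood(0.0)

# Evidence for B: integrate likelihood * prior over the free parameter.
tau = 5.0                          # prior width for hypothesis B
mu = np.linspace(-25.0, 25.0, 20001)
prior = np.exp(-0.5 * (mu / tau) ** 2) / (tau * np.sqrt(2.0 * np.pi))
like = np.exp([log_likelihood(m) - log_Z_A for m in mu])  # rescaled for stability
log_Z_B = log_Z_A + np.log(np.sum(like * prior) * (mu[1] - mu[0]))

print("ln(Z_B/Z_A) =", log_Z_B - log_Z_A)
# Widening the prior (tau) drags the ratio down even though the best fit is
# unchanged: simplicity of theory versus closeness of fit, made quantitative.
```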

    I suspect that flatness of the universe will fall out of a future theory as naturally as spin-1/2 particles fell out of the Dirac equation.

    There is an alternative, however. Winston Churchill used to keep a jotting pad by his bedside to capture the essence of his night thoughts, which he would otherwise forget and believed might be valuable. (I am afraid I do the same.) One night he had a really deep insight, so he jotted down its core and went happily back to sleep. Next morning he had, as usual, forgotten the insight, but knew it was on the jotter. There he found the words: “The entire universe is pervaded by a strong smell of vinegar.”

    So, not ether, but vinegar!

    Anton

    • telescoper Says:

      The cosmological parameter estimation industry does include a sizeable number of Bayesians. There are many free parameters, not just Lambda, but even with the prior penalty the cosmological constant is favoured by the data. That, however, assumes a framework based on GR which may turn out to be replaced. It seems to work as far as we understand things at the moment though. I’m not saying the inference of the existence of dark energy is erroneous per se – it is more-or-less required by the data. I just think it may turn out that the whole framework needs a radical revision rather than a bit of parameter tweaking.

      As for the vinegar idea: is that the Balsamic Cosmological Principle?

  7. Regarding the first paragraph of the last reply: Andrew Liddle wrote a paper about this about 10 years ago. Strangely, I can’t find it at ArXiv. However, what is the null hypothesis? That the cosmological constant is 0, or that the spatial curvature is 0? :-)

    Back to my original statement: if something is allowed by theory, the burden of proof is on those who claim it doesn’t exist, not vice-versa. That seems to work in experimental particle physics; why should cosmology be any different?

    Let’s go back to Copernicus and Kepler. Today, we find it perfectly natural that the eccentricity of a planetary orbit is non-zero; it is essentially a free parameter, to be determined by observations. Someone who claims it is exactly zero needs a reason. It turned out (as became clear with Newton) that there is no scientific reason. Previous reasons based on aesthetics, Aristotle etc proved to be misguided.

    One could say that it was Tycho’s observations which showed it was non-zero. True, but that is no different than the observation of a positive cosmological constant.

    I think it was Martin Rees who first pointed out (on several occasions in print and in talks) that the current obsession with “special values” of the cosmological parameters reminded him of Kepler’s preoccupation with using the platonic solids to determine, from theory as it were, the “best” distances of the planets from the sun. It turned out that his approach was wrong and today we think of planetary distances as essentially free parameters, to be determined by observation, though of course they are not completely free in that orbital resonances etc play a role. Newton’s celestial mechanics is so much better, and doesn’t have any “special values”.

  8. Peter posted a comment after I had started but before I had finished, so that should read “reply before last” instead of “last reply”.

    Some blogs, e.g. http://ideas.4brad.com/, allow comments to be posted to other comments, rather than just to the main post. This might a) make the discussion clearer and b) since it is then clear who is replying to whom, it wouldn’t matter if someone starts and finishes one comment while someone else is working on another. (And c: since not all comments will address the same previous comment (or the original post), it is less likely that the flow will be interrupted, though according to b) this doesn’t matter if the comments are structured.)

  9. “That, however, assumes a framework based on GR which may turn out to be replaced. It seems to work as far as we understand things at the moment though. I’m not saying the inference of the existence of dark energy is erroneous per se – it is more-or-less required by the data. I just think it may turn out that the whole framework needs a radical revision rather than a bit of parameter tweaking.”

    Of course, the burden of proof is on those who claim that GR isn’t correct. It is probably not correct in the sense that it is “just” a limiting case of the quantum theory of gravity, which no-one yet has, but there is no reason to believe that quantum effects have anything to do with dark matter, dark energy etc., so again the burden of proof is on those who doubt GR.

    I see that one member of the MOND trinity (McGaugh, Milgrom & Sanders) has posted a comment above. Is that what you’re referring to? Since the venerable James Binney “came out” as a MOND supporter, other British astronomers need not be ashamed! :-)

  10. This (from Stacy’s pages—see above) is too good not to keep reposting indefinitely: http://www.astro.umd.edu/~ssm/mond/flowchart.html

    For the record, I was never convinced that Omega_matter was 1, was agnostic on curvature (observations now show that the curvature radius is probably quite large) and was an early champion of a non-zero cosmological constant, at least in the sense of not ruling it out without any observational evidence that it is zero (i.e. I kept it as a free parameter in my own data analyses, whether those data provided interesting constraints on it or not).

  11. Anton Garrett Says:

    Philip: The Bayesian realisation of Ockham’s Razor has quite a long history, although I don’t know its timeline in astrophysics. Peter himself, or Tom Loredo, might know that.

    I agree, obviously, that “the universe is independent of the history of our attempts to discover its true nature”. That means it is also independent of where the rhetorical burden of proof lies.

    Aesthetics is not such a bad guide, but it needs to be applied at progressively deeper levels. Circular orbits were a good bet when the underlying theory of the heavens was Platonic and the errors in the data for the innermost planets were large. Newton then overthrew Plato; his theory is more aesthetically pleasing, and it predicts elliptical not circular orbits, with arbitrary eccentricities. Observational technology improved too, so that nonzero eccentricities began to be taken seriously. Then came Einstein, with a still more aesthetically pleasing theory that turned out to explain Mercury’s perihelion precession. Don’t knock aesthetics – it is really only another word for expert judgement.

    Also, some explanation is surely in order of why the planets in our solar system have virtually circular orbits; their eccentricities are certainly not distributed uniformly. Collisions would eventually destroy anything too eccentric, and orbital resonance leads to planet-free regions. Whether that is the full explanation, I don’t know.

    Anton

  12. Minor quibble: one didn’t need better observational technology, just Tycho’s good eyesight to detect eccentricity (it was his observations of the orbit of Mars which set Kepler on the correct path).

    If there hadn’t been the preoccupation with the mystical circle, folks might have used circles as orbital approximations in the same sense that the Earth was approximated as a sphere when that was useful. Not even Aristotle claimed the Earth is a perfect sphere, but he maintained that the orbits of the planets are perfect circles (carried along by crystalline spheres). Even when it WAS clear that this didn’t work, epicycles came along, which were much more complicated but were still circular in some sense.

    One can think of epicycles as Fourier synthesis. As long as one doesn’t know or care about the distances of the planets, enough epicycles can fit the observations arbitrarily accurately. They are ruled out on aesthetic grounds (if one has the right aesthetics) and by measuring the distances to the planets.
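
    To make the Fourier point concrete, here is a toy sketch with a completely invented closed track on the sky standing in for a planetary path: each Fourier term is a circle traversed at uniform speed, i.e. an epicycle, and adding circles drives the misfit to zero.

```python
import numpy as np

# Epicycles as Fourier synthesis: write a closed track on the sky as a complex
# position z(t) = x(t) + i*y(t); its Fourier coefficients are circles traversed
# at uniform rates (deferent plus epicycles). The track below is invented.
t = np.linspace(0, 2 * np.pi, 4096, endpoint=False)
z = (np.cos(t) + 0.3 * np.cos(2 * t)) + 1j * (0.8 * np.sin(t) + 0.1 * np.sin(3 * t))

coeffs = np.fft.fft(z) / z.size          # c_k: radius and phase of the k-th circle

def reconstruct(n_circles):
    """Keep only the n largest circles and rebuild the track."""
    keep = np.argsort(np.abs(coeffs))[::-1][:n_circles]
    kept = np.zeros_like(coeffs)
    kept[keep] = coeffs[keep]
    return np.fft.ifft(kept * z.size)

for n in (1, 2, 4, 8):
    err = np.max(np.abs(z - reconstruct(n)))
    print(f"{n} circles: worst-case error {err:.2e}")
# The error shrinks as circles are added; with enough of them any observed
# angular path can be matched, which is why epicycles could not be ruled out
# by positions on the sky alone.
```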

    As a side note: one often reads that Tycho’s system was a compromise because he couldn’t fully break from tradition. Leaving aside the fact that someone who in his day and age didn’t marry the mother of his many children probably didn’t care much for tradition, the real reason was that he expected to observe parallax in the case of a moving Earth. He didn’t. The reason was that he didn’t know wave optics, and we can’t fault him for that. (Based on the apparent diameter of about an arc minute, he assumed that stars are about 30 times as far away as the sun.) Actually, his system is quite radical, since it won’t work if the planets are attached to crystalline spheres.

  13. Anton Garrett Says:

    Agreed Philip, although too many people suppose that only the State can declare people married, whereas originally it was the couple who declared themselves married (understood to mean a relationship that was permanent, intimate and exclusive) and informed the authorities. So I’d like to know whether Tycho and his lady considered themselves married. (Ditto George Green and lady.)
    Anton

  14. Tycho was a nobleman; his girlfriend was not. Thus, marrying a commoner was a no-no. This is still an issue in the UK today, though other monarchies have become more modern recently.

    As a nobleman, in those days even being associated with a commoner was rather radical, whatever the legal status of their relationship.

    Wikipedia has a good summary:

    Family life

    In 1572, in Knudstrup, Tycho fell in love with Kirsten, daughter of Jørgen Hansen, the Lutheran priest in Knudstrup. She was a commoner, and Tycho never formally married her. However, under Danish law, when a nobleman and a common woman lived together openly as husband and wife, and she wore the keys to the household at her belt like any true wife, their alliance became a binding morganatic marriage after three years. The husband retained his noble status and privileges; the wife remained a commoner. Their children were legitimate in the eyes of the law, but they were commoners like their mother and could not inherit their father’s name, coat of arms, or landholdings. (Skautrup 1941, pp. 24-5)

    Kirsten Jørgensdatter gave birth to their first daughter, Kirstine (named after Tycho’s late sister, who died at 13) on 12 October 1573. Together they had eight children, six of whom lived to adulthood. In 1574, they moved to Copenhagen where their daughter Magdalene was born. Kirsten and Tycho lived together for almost thirty years until Tycho’s death.

    • telescoper Says:

      Phil: His girlfriend wasn’t a nobleman?

      Anton: I was given to understand that it was a requirement of Cambridge fellows to be “celibate”, meaning that they couldn’t be married but not that they had to abstain from “it”. Green didn’t go to Cambridge until 1833 and the first of the seven children he had with Jane Smith was born in 1824 (the last in 1840). Mary Cannell told me that she thought the couple agreed not to wed early on because of George’s academic ambitions, but I don’t know enough about social history to know how unusual it was for folk of their ilk to have families without formally tying the knot.

  15. Thomas D Says:

    This post brings the Chaplygin gas to my mind (metaphorically..) – how many people who invoke it for dark energy actually know its original physical motivation? Which is about as far from empty space as it is possible to get.

    • telescoper Says:

      It was something to do with transonic flows wasn’t it?
      Not that I’m an expert on Charlie Chaplygin.

  16. Anton Garrett Says:

    Peter,

    I don’t know what the Cambridge statute actually said, but it would be presumed (rightly or wrongly) that single men were celibate and married men were not. When it was revoked in the late 19th century a large number of dons undertook Anglican wedding ceremonies with their women, many of whom lived at Ely – just far enough from Great St Mary’s to be out of reach of the proctors, who actually had powers to arbitrarily lock up women they suspected were liable to bring the gentlemen of the university low. By the late 19th century this was a legal anomaly, and the locking up on suspicion of a respectably married woman triggered overdue reform.

    I decline to call these women ‘mistresses’ or say that they ‘got married’ once the statutes changed, because I dispute that only the State (or the State-recognised church) can declare a couple married. A couple can declare themselves married – even if the authorities do not recognise it. Their mutual pledges should include exclusivity, intimacy and permanency, and they should inform the authorities (whatever the authorities make of it). Only after 1563 did the Roman Catholic church decline to recognise such declarations, for instance, and recognise only Catholic church weddings.

    I expect many of those Cambridge couples had thought this issue out, and were fully prepared to argue it from the Bible, since nowhere in either Testament is a third party needed to legitimise a marriage.

    Anton

  17. Andrew Liddle Says:

    Picking up from way up the thread, what Anton describes is usually referred to by cosmologists as Bayesian model selection or comparison, meaning that the data gets to choose not just the values of model parameters, but which set of parameters ought to be varied in fitting the data. Indeed this automatically imposes Ockham’s razor (specifically, it rewards predictive models, provided they were successful in predicting future data, rather than pure simplicity, but of course simple models tend to be the most predictive).
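
    (In symbols, and just as a reminder of the standard definitions rather than anything specific to this thread: each model \( M_i \) is scored by its evidence \( Z_i = \int \mathcal{L}(D\,|\,\theta, M_i)\, \pi(\theta\,|\,M_i)\, d\theta \), and two models are compared through the ratio \( Z_1/Z_2 \) times whatever prior odds one assigns; the Ockham effect comes from the prior \( \pi \) spreading probability over parameter values the data end up disfavouring.)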

    To my knowledge, the first application of Bayesian model comparison in cosmology was by Andrew Jaffe in 1996. The Cavendish guys have been at it for a long time; I and my Sussex chums have spent quite a bit of time on it more recently.

    The Lambda=0 model does get ruled out convincingly, for any reasonable prior, due to the strength of the current data. In fact almost every model comparison calculation we have ever done, many involving months of CPU time, has ended up giving an inconclusive result and this is the rare exception.

    In a strict Bayesian approach there is no such thing as a `null hypothesis’ to be ruled out; instead there is a collection of models treated on an equal basis. Lambda=0 does badly and pure cosmological constant does best, but many other dark energy or modified gravity models do not do badly enough to be ruled out.

    Nevertheless Bayesianism doesn’t really lead to a definitive conclusion, because all it can do is select the best model amongst those you have, whereas Peter’s discomfort arises because he suspects there may be a much better model that we haven’t thought of. At least not yet.

    Andrew

  18. Anton Garrett Says:

    Hi Andrew, long time no see! There are two realisations of Ockham’s qualitative razor principle in Bayesian analysis. One is referred to in my 0903 post on May 19th; the other was introduced, to my knowledge, by Dave MacKay. Generally it is the latter which is referred to as Bayesian model comparison. I *think* you use the phrase in the same way as Dave, although I’d need to see some equations to be sure.
    All the best
    Anton

  19. Thomas D Says:

    The recent ‘interest’ in ‘Chaplygin gas’ seems to stem from Kamenshchik et al and Jackiw and Polychronakos who were looking at D-branes and supersymmetric fluid mechanics around 2000. Jackiw doesn’t seem to have given any particular reference to justify the nickname.

    Then someone thought that, no matter what the physical derivation and meaning – or lack of it – it would be fun to solve the FRW equations with an added ‘Chaplygin gas’. The rest is history, as they say. (They also say that ‘history is bunk’.)
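
    (For anyone who hasn’t met it: in its simplest form the equation of state being borrowed is \( p = -A/\rho \) with \( A > 0 \), which in the FRW equations gives \( \rho(a) = \sqrt{A + B a^{-6}} \), dust-like at early times and an effective cosmological constant at late times; the “generalized” version in much of the literature has \( p = -A/\rho^{\alpha} \). That interpolation is what makes it tempting as a unified dark-matter/dark-energy fluid, whatever its original aerodynamic motivation.)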

    • telescoper Says:

      We had a look at the Chaplygin equation of state in one of the cosmology discussion group meetings where we discussed arXiv:0811.2797.

      I think it’s not as good as the phlogiston theory.

  20. Typo: Yes, it should have said “noblewoman” not “nobleman”.

    Wasn’t it only in the 19th century that Oxford dons were allowed to marry?

    IIRC, this tradition stems from the fact that originally most or all university graduates were priests and priests couldn’t marry.

  21. Anton Garrett Says:

    Philip: see my previous posts on this thread for what happened at Cambridge. I would expect Oxford to be similar; perhaps they saw the problem that arose in Cambridge and changed in time to prevent a repeat.

    In protestant England ordained people COULD marry. It would be an interesting historical exercise to see how this facet of the Reformation failed to reach Oxbridge colleges.

    Anton
