Archive for accelerating universe

A Dark Energy Mission

Posted in The Universe and Stuff on November 16, 2013 by telescoper

Here’s a challenge for cosmologists and aspiring science communicators out there. Most of you will know that the standard cosmological model involves a thing called Dark Energy, whose existence is inferred from observations suggesting that the expansion of the Universe is accelerating.

That these observations require something a bit weird can be quickly seen by looking at the equation that governs the dynamics of the cosmic scale factor R for a simple model involving matter in the form of a perfect fluid:

\ddot{R}=-\frac{4\pi G}{3} \left( \rho + \frac{3p}{c^2}\right) R

The terms in brackets relate to the density and pressure of the fluid, respectively. If the pressure is negligible (as is the case for “dust”), then the expansion is always decelerating because the density of matter is always a positive quantity; we don’t know of anything that has a negative mass.

The only way to make the expansion of such a universe actually accelerate is to fill it with some sort of stuff that has

\left( \rho + \frac{3p}{c^2} \right) < 0.

In the lingo this means that the strong energy condition must be violated; this is what the hypothetical dark energy component is introduced to do. Note that this requires the dark energy to exert negative pressure, i.e. it has to be, in some sense, in tension.
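
In the standard notation this can be packaged into an equation-of-state parameter w, defined by p = wρc^2; for positive density the condition above is then equivalent to

w < -\frac{1}{3},

so ordinary matter (w = 0) and radiation (w = 1/3) always decelerate the expansion, while a cosmological constant (w = -1, i.e. p = -ρc^2) makes it accelerate.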

However, there’s something about this that seems very paradoxical. Pressure generates a force that pushes; tension corresponds to a force that pulls. In the cosmological setting, though, increasing positive pressure causes a greater deceleration, while making the universe accelerate requires tension. Why should a bigger pushing force cause the universe to slow down, while a pull causes it to speed up?

The lazy answer is to point at the equation and say “that’s what the mathematics says”, but that’s no use at all when you want to explain this to Joe Public.

Your mission, should you choose to accept it, is to explain in language appropriate to a non-expert, why a pull seems to cause a push…

Your attempts through the comments box please!

Another Nobel Prize for Cosmology!

Posted in The Universe and Stuff on October 4, 2011 by telescoper

Just time in between teaching and meetings for a quick post on today’s announcement that the 2011 Nobel Prize for Physics has gone to Saul Perlmutter, Brian P. Schmidt and Adam G. Riess “for the discovery of the accelerating expansion of the Universe through observations of distant supernovae.”

I’ve taken the liberty of copying the following text from the press release on the Nobel Foundation website:

In 1998, cosmology was shaken at its foundations as two research teams presented their findings. Headed by Saul Perlmutter, one of the teams had set to work in 1988. Brian Schmidt headed another team, launched at the end of 1994, where Adam Riess was to play a crucial role.

The research teams raced to map the Universe by locating the most distant supernovae. More sophisticated telescopes on the ground and in space, as well as more powerful computers and new digital imaging sensors (CCD, Nobel Prize in Physics in 2009), opened the possibility in the 1990s to add more pieces to the cosmological puzzle.

The teams used a particular kind of supernova, called type Ia supernova. It is an explosion of an old compact star that is as heavy as the Sun but as small as the Earth. A single such supernova can emit as much light as a whole galaxy. All in all, the two research teams found over 50 distant supernovae whose light was weaker than expected – this was a sign that the expansion of the Universe was accelerating. The potential pitfalls had been numerous, and the scientists found reassurance in the fact that both groups had reached the same astonishing conclusion.

For almost a century, the Universe has been known to be expanding as a consequence of the Big Bang about 14 billion years ago. However, the discovery that this expansion is accelerating is astounding. If the expansion will continue to speed up the Universe will end in ice.

The acceleration is thought to be driven by dark energy, but what that dark energy is remains an enigma – perhaps the greatest in physics today. What is known is that dark energy constitutes about three quarters of the Universe. Therefore the findings of the 2011 Nobel Laureates in Physics have helped to unveil a Universe that to a large extent is unknown to science. And everything is possible again.

I’m definitely among the skeptics when it comes to the standard interpretation of the supernova measurements, and more recent complementary data, in terms of dark energy. However this doesn’t diminish in any way my delight that these three scientists have been rewarded for their sterling observational efforts. The two groups involved in the Supernova Cosmology Project on the one hand, and the High Z Supernova Search, on the other, are both supreme examples of excellence in observational astronomy, taking on and overcoming what were previously thought to be insurmountable observational challenges. This award has been in the air for a few years now, and I’m delighted for all three scientists that their time has come at last. To my mind their discovery is all the more exciting because nobody really knows precisely what it is that they have discovered!

I know that Brian Schmidt is an occasional reader and commenter on this blog. I suspect he might be a little too busy right now with the rest of the world’s media to read this, let alone comment on here, but that won’t stop me congratulating him and the other winners on their achievement. I’m sure they’ll enjoy their visit to Stockholm!

Meanwhile the rest of us can bask in their reflected glory. There’s also been a huge amount of press interest in this announcement which has kept my phone ringing this morning. It’s only been five years since a Nobel Prize in physics went to cosmology, which says something for how exciting a field this is to work in!

UPDATE: There’s an interesting collection of quotes and reactions on the Guardian website, updated live.

UPDATE on the UPDATE: Yours truly gets a quote on the Nature News article about this!

A Little Bit of Quantum

Posted in The Universe and Stuff on January 16, 2010 by telescoper

I’m trying to avoid getting too depressed by writing about the ongoing funding crisis for physics in the United Kingdom, so by way of a distraction I thought I’d post something about physics itself rather than the way it is being torn apart by short-sighted bureaucrats. A number of Cardiff physics students are currently looking forward (?) to their Quantum Mechanics examinations next week, so I thought I’d try to remind them of what a fascinating subject it really is…

The development of the kinetic theory of gases in the latter part of the 19th Century represented the culmination of a mechanistic approach to Natural Philosophy that had begun with Isaac Newton two centuries earlier. So successful had this programme been by the turn of the 20th century that it was a fairly common view among scientists of the time that there was virtually nothing important left to be “discovered” in the realm of natural philosophy. All that remained were a few bits and pieces to be tidied up, but nothing could possibly shake the foundations of Newtonian mechanics.

But shake they certainly did. In 1905 the young Albert Einstein – surely the greatest physicist of the 20th century, if not of all time – single-handedly overthrew the underlying basis of Newton’s world with the introduction of his special theory of relativity. Although it took some time before this theory was tested experimentally and gained widespread acceptance, it blew an enormous hole in the mechanistic conception of the Universe by drastically changing the conceptual underpinning of Newtonian physics. Out were the “commonsense” notions of absolute space and absolute time, and in was a more complex “space-time” whose measurable aspects depended on the frame of reference of the observer.

Relativity, however, was only half the story. Another, perhaps even more radical shake-up was also in train at the same time. Although Einstein played an important role in this advance too, it led to a theory he was never comfortable with: quantum mechanics. A hundred years on, the full implications of this view of nature are still far from understood, so maybe Einstein was correct to be uneasy.

The birth of quantum mechanics partly arose from the developments of kinetic theory and statistical mechanics that I discussed briefly in a previous post. Inspired by such luminaries as James Clerk Maxwell and Ludwig Boltzmann, physicists had inexorably increased the range of phenomena that could be brought within the descriptive framework furnished by Newtonian mechanics and the new modes of statistical analysis that they had founded. Maxwell had also been responsible for another major development in theoretical physics: the unification of electricity and magnetism into a single system known as electromagnetism. Out of this mathematical tour de force came the realisation that light was a form of electromagnetic wave, an oscillation of electric and magnetic fields through apparently empty space.  Optical light forms just part of the possible spectrum of electromagnetic radiation, which ranges from very long wavelength radio waves at one end to extremely short wave gamma rays at the other.

With Maxwell’s theory in hand, it became possible to think about how atoms and molecules might exchange energy and reach equilibrium states not just with each other, but with light. Everyday experience tells us that hot things tend to give off radiation, and a number of experiments – by Wilhelm Wien and others – had shown that there were well-defined rules that determined what type of radiation (i.e. what wavelength) and how much of it were given off by a body held at a certain temperature. In a nutshell, hotter bodies give off more radiation (proportional to the fourth power of their temperature), and the peak wavelength is shorter for hotter bodies. At room temperature, bodies give off infra-red radiation; stars have surface temperatures measured in thousands of degrees, so they give off predominantly optical and ultraviolet light. Our Universe is suffused with microwave radiation corresponding to just a few degrees above absolute zero.
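
As a rough numerical check of those statements, here is a minimal Python sketch (the variable names are just illustrative) using Wien's displacement law, peak wavelength = b/T, together with the Stefan-Boltzmann law for the total emitted power per unit area:

WIEN_B = 2.898e-3   # Wien's displacement constant, in metre-kelvin
SIGMA = 5.670e-8    # Stefan-Boltzmann constant, in W m^-2 K^-4

for name, T in [("room temperature", 300.0),
                ("solar surface", 5800.0),
                ("cosmic microwave background", 2.7)]:
    peak_microns = WIEN_B / T * 1e6     # peak wavelength in microns
    flux = SIGMA * T**4                 # emitted power per unit area, W m^-2
    print(f"{name}: T = {T} K, peak ~ {peak_microns:.2f} um, flux ~ {flux:.3g} W/m^2")

The output bears out the trend described above: the peak sits near 10 microns (infra-red) at room temperature, near 0.5 microns (visible) for the solar surface, and near 1 mm (microwave) for the 2.7 K background, with the emitted power growing steeply as the fourth power of the temperature.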

The name given to a body in thermal equilibrium with a bath of radiation is a “black body”, not because it is black – the Sun is quite a good example of a black body and it is not black at all – but because it is simultaneously a perfect absorber and perfect emitter of radiation. In other words, it is a body which is in perfect thermal contact with the light it emits. Surely it would be straightforward to apply classical Maxwell-style statistical reasoning to a black body at some temperature?

It did indeed turn out to be straightforward, but the result was a catastrophe. One can see the nature of the disaster quite easily by taking a simple idea from classical kinetic theory. In many circumstances there is a “rule of thumb” that applies to systems in thermal equilibrium. Roughly speaking, the idea is that energy becomes divided equally between every possible “degree of freedom” the system possesses. For example, if a box of gas consists of particles that can move in three dimensions then, on average, each component of the velocity of a particle will carry the same amount of kinetic energy. Molecules are able to rotate and vibrate as well as move about inside the box, and the equipartition rule can apply to these modes too.
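
Quantitatively, the equipartition rule assigns an average energy of

\langle E \rangle = \frac{1}{2}k_{B}T

to each such quadratic degree of freedom, where kB is Boltzmann's constant and T the temperature; a particle of a monatomic gas, free to move in three dimensions, therefore carries an average kinetic energy of (3/2)kBT.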

Maxwell had shown that light was essentially a kind of vibration, so it appeared obvious that what one had to do was to assign the same amount of energy to each possible vibrational degree of freedom of the ambient electromagnetic field. Lord Rayleigh and Sir James Jeans did this calculation and found that the amount of energy radiated by a black body as a function of wavelength should vary in proportion to the temperature T and inversely as the fourth power of the wavelength λ, as shown in the diagram for an example temperature of 5000K.
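
Written out explicitly (in modern notation), the Rayleigh-Jeans result for the spectral radiance is

B_{\lambda}(T) = \frac{2ck_{B}T}{\lambda^{4}},

which grows without bound as the wavelength gets shorter.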

Even without doing any detailed experiments it is clear that this result just has to be nonsense. The Rayleigh-Jeans law predicts that even very cold bodies should produce infinite amounts of radiation at infinitely short wavelengths, i.e. in the ultraviolet. It also predicts that the total amount of radiation – the area under the curve in the above figure – is infinite. Even a very cold body should emit infinitely intense electromagnetic radiation. Infinity is bad.

Experiments show that the Rayleigh-Jeans law does work at very long wavelengths but in reality the radiation reaches a maximum (at a wavelength that depends on the temperature) and then declines at short wavelengths, as shown also in the above Figure. Clearly something is very badly wrong with the reasoning here, although it works so well for atoms and molecules.

It wouldn’t be accurate to say that physicists all stopped in their tracks because of this difficulty. It is amazing the extent to which people are able to carry on despite the presence of obvious flaws in their theory. It takes a great mind to realise when everyone else is on the wrong track, and a considerable time for revolutionary changes to become accepted. In the meantime, the run-of-the-mill scientist tends to carry on regardless.

The resolution of this particular fundamental conundrum is credited to Karl Ernst Ludwig “Max” Planck, who was born in 1858. He was the son of a law professor, and himself went to university at Berlin and Munich, receiving his doctorate in 1880. He became professor at Kiel in 1885, and moved to Berlin in 1888. In 1930 he became president of the Kaiser Wilhelm Institute, but resigned in 1937 in protest at the behaviour of the Nazis towards Jewish scientists. His life was blighted by family tragedies: one of his sons died in the First World War; both daughters died in childbirth; and his son Erwin was executed in 1944 for his part in a plot to assassinate Adolf Hitler. After the Second World War the institute was named the Max Planck Institute, and Planck was reappointed director. He died in 1947, by then so famous a scientist that his likeness appeared on the two-Deutschmark coin issued in 1958.

Planck had taken some ideas from Boltzmann’s work but applied them in a radically new way. The essence of his reasoning was that the ultraviolet catastrophe basically arises because Maxwell’s electromagnetic field is a continuous thing and, as such, appears to have an infinite variety of ways in which it can absorb energy. When you are allowed to store energy in whatever way you like in all these modes, and add them all together you get an infinite power output. But what if there was some fundamental limitation in the way that an atom could exchange energy with the radiation field? If such a transfer can only occur in discrete lumps or quanta – rather like “atoms” of radiation – then one could eliminate the ultraviolet catastrophe at a stroke. Planck’s genius was to realize this, and the formula he proposed contains a constant that still bears his name. The energy of a light quantum E is related to its frequency ν via E=hν, where h is Planck’s constant, one of the fundamental constants that occur throughout theoretical physics.
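
To put a number on this: with h ≈ 6.63 × 10^-34 J s, a quantum of green light (wavelength about 500 nm, so frequency ν = c/λ ≈ 6 × 10^14 Hz) carries an energy of

E = h\nu \approx 4\times 10^{-19}\,{\rm J} \approx 2.5\,{\rm eV},

a tiny amount by everyday standards, which is why the graininess of light is so easy to overlook.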

Boltzmann had shown that if a system possesses discrete energy states, labelled by j and with energies Ej, then at a given temperature the relative occupation of those states is determined by a “Boltzmann factor” of the form:

n_{j} \propto \exp\left(-\frac{E_{j}}{k_BT}\right),

so that the higher energy state is exponentially less probable than the lower energy state if the energy difference is much larger than the typical thermal energy kBT; the quantity kB is Boltzmann’s constant, another fundamental constant. On the other hand, if the states are very close in energy compared to the thermal level then they will be roughly equally populated in accordance with the “equipartition” idea I mentioned above.

The trouble with the classical treatment of an electromagnetic field is that it makes it too easy for the field to store infinite energy in short wavelength oscillations: it can put a little bit of energy in each of a lot of modes in an unlimited way. Planck realised that his idea would mean that ultra-violet radiation could only be emitted in very energetic quanta, rather than in lots of little bits. Building on Boltzmann’s reasoning, he deduced that the probability of exciting a quantum with very high energy is exponentially suppressed. This in turn leads to an exponential cut-off in the black-body curve at short wavelengths. Triumphantly, he was able to calculate the exact form of the black-body curve expected in his theory: it matches the Rayleigh-Jeans form at long wavelengths, but turns over and decreases at short wavelengths just as the measurements require. The theoretical Planck curve matches measurements perfectly over the entire range of wavelengths that experiments have been able to probe.
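
In modern notation, the black-body curve Planck obtained for the spectral radiance is

B_{\lambda}(T) = \frac{2hc^{2}}{\lambda^{5}}\left[\exp\left(\frac{hc}{\lambda k_{B}T}\right)-1\right]^{-1},

which reduces to the Rayleigh-Jeans form at long wavelengths (where hc/λ is much smaller than kBT) and is exponentially suppressed at short wavelengths, exactly as described above.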

Curiously perhaps, Planck stopped short of the modern interpretation of this: that light (and other electromagnetic radiation) is composed of particles which we now call photons. He was still wedded to Maxwell’s description of light as a wave phenomenon, so he preferred to think of the exchange of energy as being quantised rather than the radiation itself. Einstein’s work on the photoelectric effect in 1905 further vindicated Planck, but also demonstrated that light travelled in packets. After Planck’s work, and the development of the quantum theory of the atom pioneered by Niels Bohr, quantum theory really began to take hold of the physics community and eventually it became acceptable to conceive of not just photons but all matter as being part particle and part wave. Photons are examples of a kind of particle known as a boson, and the atomic constituents such as electrons and protons are fermions. (This classification arises from their spin: bosons have spin which is an integer multiple of the reduced Planck constant ħ, whereas fermions have half-integral spin.)

You might have expected that the radical step made by Planck would immediately have led to a drastic overhaul of the system of thermodynamics put in place in the preceding half-a-century, but you would be wrong. In many ways the realization that discrete energy levels were involved in the microscopic description of matter if anything made thermodynamics easier to understand and apply. Statistical reasoning is usually most difficult when the space of possibilities is complicated. In quantum theory one always deals fundamentally with a discrete space of possible outcomes. Counting discrete things is not always easy, but it’s usually easier than counting continuous things. Even when they’re infinite.

Much of modern physics research lies in the arena of condensed matter physics, which deals with the properties of solids and gases, often at the very low temperatures where quantum effects become important. The statistical thermodynamics of these systems is based on a very slight modification of Boltzmann’s result:

n_{j} \propto \left[\exp\left(\frac{E_{j}}{k_BT}\right)\pm 1\right]^{-1},

which gives the equilibrium occupation of states at an energy level Ej; the difference between bosons and fermions manifests itself as the sign in the denominator. Fermions take the upper “plus” sign, and the resulting statistical framework is based on the so-called Fermi-Dirac distribution; bosons have the minus sign and obey Bose-Einstein statistics. This modification of the classical theory of Maxwell and Boltzmann is simple, but leads to a range of fascinating phenomena, from neutron stars to superconductivity.
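
As a minimal numerical illustration of that formula (a Python sketch, neglecting the chemical potential just as the expression above does), one can compare the two quantum occupations with the classical Boltzmann factor at a few values of Ej/kBT:

import math

def occupation(x, kind):
    # x is the ratio E_j / (k_B * T); kind selects the statistics.
    if kind == "BE":                  # Bose-Einstein: minus sign in the denominator
        return 1.0 / (math.exp(x) - 1.0)
    if kind == "FD":                  # Fermi-Dirac: plus sign in the denominator
        return 1.0 / (math.exp(x) + 1.0)
    return math.exp(-x)               # classical Boltzmann factor, for comparison

for x in (0.1, 1.0, 5.0):
    print(f"E/kT = {x}: BE = {occupation(x, 'BE'):.3f}, "
          f"FD = {occupation(x, 'FD'):.3f}, classical = {occupation(x, 'MB'):.3f}")

At energies well above the thermal level all three agree, because the ±1 hardly matters; at low energies the Bose-Einstein occupation grows without limit, the seed of phenomena such as Bose-Einstein condensation, while the Fermi-Dirac occupation can never exceed one, which is the Pauli exclusion principle at work.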

Moreover, the nature of the ultraviolet catastrophe for black-body radiation at the start of the 20th Century perhaps also holds lessons for modern physics. One of the fundamental problems we have in theoretical cosmology is how to calculate the energy density of the vacuum using quantum field theory. This is a more complicated thing to do than working out the energy in an electromagnetic field, but the net result is a catastrophe of the same sort. All straightforward ways of computing this quantity produce a divergent answer unless a high-energy cut off is introduced. Although cosmological observations of the accelerating universe suggest that vacuum energy is there, its actual energy density is way too small for any plausible cutoff.
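
To give a rough sense of the scale of the problem, using standard ballpark figures: a cutoff at the Planck scale corresponds to a vacuum energy density of order

\rho_{\rm Planck} \sim \frac{c^{5}}{\hbar G^{2}} \sim 10^{96}\,{\rm kg}\,{\rm m}^{-3},

whereas the value inferred from the observed acceleration is of order 10^-26 kg m^-3, a mismatch of roughly 120 orders of magnitude.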

So there we are. A hundred years on, we have another nasty infinity. It’s a fundamental problem, but its answer will probably open up a new way of understanding the Universe.



(Guest Post) Letter from America

Posted in Science Politics, The Universe and Stuff on January 10, 2010 by telescoper

Synchronicity can be a wonderful thing. Yesterday I mentioned the meeting of the Royal Astronomical Society that took place on January 10th 1930. The importance of this event was that it prompted Lemaître to write to Eddington pointing out that he had already (in 1927) worked out a solution of Einstein’s equations describing an expanding space-time; eventually this led to the widespread acceptance of the idea that Hubble’s observational measurements of redshifts and distances of extragalactic nebulae were evidence that the Universe was expanding.

Meanwhile, triggered by a recent article in Physics World, I have been having an entertaining electronic exchange with Bob Kirshner concerning a much more recent development about the expanding universe, namely that its expansion is accelerating. Since he’s one of the top experts on this, I thought “What better time  to have my first ever guest post?” and asked Bob if he would like to write something about that. He accepted the invitation, and here is his piece. 

 -0-

Twenty-first century astrophysicists (like Telescoper) are the wrong people to ask to cast your horoscope or maximize your feng-shui.  But even people who spend time in warm, well-lighted buildings staring at computer screens notice the changing seasons.  (This refers to conditions before the recent budget exercise.)  

For me, the pivot of the year comes right after the solstice, while the Christmas wrapping paper is still in the trash can.  Our house in Maine has a window facing south of east.  When the winter sun rises as far south as it ever does, a clear morning lets a blast of light come in one side, straight down the hallway and out the bathroom window. Househenge!  What does it mean? 

It means it is time for the American Astronomical Society’s big meeting.  This rotates its location from Washington DC, this year’s site, to other more-or-less tolerable climates.  Our tribe can mark the passage of the seasons and of the decades by this rhythm.  Never mind all that highfalutin’ stuff about the earth going around the Sun.  Remember that AAS in Austin? What year was that? 

In January of 1998, the cycle of the seasons and of available convention centers of suitable size put the AAS in Washington.  It was an exciting time for me, because we were hot on the trail of the accelerating universe.  We had some great new data from the Hubble Space Telescope (HST), a paper in the press, and Peter Garnavich, my postdoc, was going to give a talk and be part of a press briefing.  This was a big deal and we prepared carefully.  

Adam Riess, who had been my graduate student, was then a Miller Fellow at Berkeley doing the calibration and analysis on our data.  Adam’s notebooks were beginning to show troubling hints of cosmic acceleration.  I thought it would go away. Brian Schmidt, who had also been my student, was then in Australia,  calling the shots on this project.  He didn’t want to get out on a  limb over unpublished hints.  The idea of a cosmological constant was already making him sick to his stomach.  We agreed that in January of 1998, Peter got to say that the supernova data showed the universe was not decelerating very much and would expand forever.  That’s it.  Nothing about acceleration. 

Saul Perlmutter’s Supernova Cosmology Project also prepared a careful press release that reported a low density and predicted eternal cosmic expansion.  A report the next day in the New York Times was pretty tame, except for Ruth Daly speculating on the possibility of a low-density universe coming out of inflation models. Saul was quoted as saying, “I never underestimate the power of a theorist to come up with a new model.”  I have gathered up all the clippings I could find about who said what in Washington. (We used to call them “clippings”.) 

While a few reporters sniffed out the hints of cosmic acceleration in the raw data, in January 1998 nobody was claiming this was a solid result.  The paper from our team with the title Observational Evidence from Supernovae for an Accelerating Universe and a Cosmological Constant didn’t get submitted until March 13, 1998.  The comparable paper from the SCP was submitted September 8, 1998.  These are fine dates in the history of cosmology, but they are not in January.  It’s not for me to say when savants like the Telescoper were convinced we live in an accelerating universe, but I am pretty sure it wasn’t in January 1998.

In January 2009, the sun was once again shining right through our house.  It illuminated the American Physical Society newsletter kept in the upstairs bathroom. One of the features is This Month in Physics History.  If you want to find out about Bubble Chamber progress in January 1955, this is the place. Flipping through the January 2009 issue I was gobsmacked (American slang for “blown away”) to learn we were supposed to celebrate the anniversary of the discovery of cosmic acceleration.  Say what?  In January?  Because of the press releases that said the universe was not going to turn around? 

Being a dutiful type, a Fellow of the APS, and the oldest of the High-Z Team, I thought it was my job to help improve the accuracy of this journal. I wrote them a cheerful (on the third draft) letter explaining that this wasn’t precisely right, and, if they liked real publications as evidence for scientific progress, they might want to wait until March.  A volley of letters ensued, but not at internet speed.  The editor of APS News decided he had had enough education and closed the discussion in July.  The letters column moved on to less controversial matters concerning science and religion and nuclear reactors. 

The rising point of the sun came north, and then marched south again. Just after the solstice, a beam of light flashed right through our happy home. 2010!  Google alerts flashed the news. More brouhaha about the discovery of cosmic acceleration. Now in Physics World. I am depicted as a surly bull terrier in a crimson tenured chair, clinging desperately to self-aggrandizing notions that actual publications in real journals are a way to see the order of events. The philosopher, Robert P. Crease, who wrote this meditation, says he loves priority disputes. He is making a serious point, that “Eureka!” does not come at exactly one moment when you have an international collaboration, improving data sets, and the powerful tools of Bayesian inference at your command.

But, even in the world of preprint servers, press releases, and blogs without restraint (I am talking about other blogs!), a higher standard of evidence is demanded for a real paper in a real journal.   A page in a notebook, an email, a group meeting, a comment after a colloquium or even an abstract in the AAS Bulletin (whipped up an hour before the deadline and months before the actual talk) is not quite what we mean by “having a result”.  I’m not saying that referees are always helpful, but they make the author anticipate a skeptical reader, so you really want to present a well-crafted  case.

If that’s not so, I would like to have my lifetime’s page charges refunded forthwith: that’s 250 papers x 10 pages/paper x $100/ApJ page = $250 000. Send the check to my office.

So, Telescoper, how is your house aligned?  And why do the Brits put the drains on the outside when you live in such a cold climate?