Archive for Inflation

A Challenge for Inflationary Cosmologists

Posted in The Universe and Stuff on February 6, 2017 by telescoper

A few days ago I wrote a very sceptical post about an alternative to the present standard cosmological model called the holographic universe. After an interesting discussion thread on that post I thought I’d pose a challenge here. It might be a bit specialist, as it is aimed at inflationary theorists and model-builders (a club to which I do not belong), but I thought I’d try it as it might prove as educational for me as for other readers.

Anyway, the point is that in the inflationary paradigm there is a fairly generic prediction that the primordial scalar power spectrum (related to the spectrum of density fluctuations) takes the form of a power law:

\Delta^2_S(q) = \Delta^2_S(q_*)\left(\frac{q}{q_*}\right)^{n_s - 1} \qquad (1)

The wavenumber is denoted q. There are two free parameters here: the spectral index n_s (which is usually close to unity); and an overall normalization amplitude, parametrised here at an arbitrary “pivot” scale q_*.

In the holographic model the functional form of the spectrum is quite different:
\Delta^2_S(q) = \frac{\Delta^2_0}{1 + \frac{g q_*}{q}\,\ln\left|\frac{q}{\beta g q_*}\right|} \qquad (2)

This has two different free parameters: g and β, both of which relate to properties of a dual Quantum Field Theory which appears in the model.

The second model is motivated by very different considerations from those behind the inflationary model, but my suspicion is that one could in fact construct a version of inflation that produces a spectrum of the form (2) rather than (1). There is an intimate relationship between the scalar perturbation spectrum and the inflationary dynamics, which means that there is considerable freedom to “design” the perturbation spectrum by building features into the potential.

Anyway, that’s the challenge. Would any cosmologists out there with time on their hands please construct an inflationary model that produces the spectrum (2)? Alternatively, if this can’t be done, give me a proof that it can’t!
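For anyone who wants to play with this numerically, here’s a quick sketch (mine, not part of the original challenge) comparing the effective spectral index, n_eff(q) = 1 + d ln Δ²/d ln q, of a pure power law with that of a scale-dependent spectrum of the holographic type. The “holographic-like” form and all its parameter values below are purely illustrative stand-ins, not best-fit numbers from any paper:

```python
import numpy as np

def neff(power_spectrum, q):
    """Effective spectral index n_eff(q) = 1 + d ln P / d ln q,
    computed by finite differences on a logarithmic grid."""
    lnq = np.log(q)
    lnP = np.log(power_spectrum(q))
    return 1.0 + np.gradient(lnP, lnq)

# Spectrum (1): pure power law, with Planck-like parameters
A_s, n_s, q_star = 2.1e-9, 0.965, 0.05   # amplitude, tilt, pivot (Mpc^-1)
power_law = lambda q: A_s * (q / q_star) ** (n_s - 1.0)

# A toy scale-dependent spectrum standing in for form (2):
# a slow logarithmic modulation, with made-up parameters g and beta
g, beta = 1e-3, 1.0
holo_like = lambda q: A_s / (1.0 + (g * q_star / q)
                             * np.log(np.abs(q / (beta * g * q_star))))

q = np.logspace(-4, 0, 200)
print("power law n_eff: %.4f .. %.4f" % (neff(power_law, q).min(),
                                         neff(power_law, q).max()))
print("toy form  n_eff: %.4f .. %.4f" % (neff(holo_like, q).min(),
                                         neff(holo_like, q).max()))
```

The power law gives a constant n_eff = 0.965 by construction, while the second form gives a tilt that runs with scale, which is the qualitative signature that distinguishes the two.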

Nailing Cosmological Jelly to the Wall

Posted in The Universe and Stuff on November 28, 2016 by telescoper

When asked to provide comments for a recent piece about cosmology in New Scientist, all I could come up with was the quote in the following excerpt:

But no measurement will rule out inflation entirely, because it doesn’t make specific predictions. “There is a huge space of possible inflationary theories, which makes testing the basic idea very difficult,” says Peter Coles at Cardiff University, UK. “It’s like nailing jelly to the wall.”

Certain of my colleagues have cast doubt on whether I am qualified to comment on the nailing of jelly to the wall, so I feel obliged to share the results of my highly successful research into this in the form of the following photograph:
[Image: orange jelly nailed to the wall]

I regret that I was unable to find any Dark Jelly, so had to settle for the more familiar baryonic type. Also, for the record, I should point out that what is shown is actually jelly concentrate. A similar experiment with the more normal diluted form of jelly was somewhat less successful.

I hope this clarifies the situation.

MaxEnt 2016: Norton’s Dome and the Cosmological Density Parameter

Posted in The Universe and Stuff on July 11, 2016 by telescoper

The second in my sequence of posts tangentially related to talks at this meeting on Maximum Entropy and Bayesian Methods in Science and Engineering is inspired by a presentation this morning by Sylvia Wenmackers. The talk featured an example which was quite new to me called Norton’s Dome. There’s a full discussion of the implications of this example at John D. Norton’s own website, from which I have taken the following picture:

[Image: Norton’s dome, with its defining equation]

This is basically a problem in Newtonian mechanics, in which a particle rolls down from the apex of a dome with a particular shape in response to a vertical gravitational field. The solution is well-determined and shown in the diagram.

An issue arises, however, when you consider the case in which the particle starts at the apex of the dome with zero velocity. One solution in this case is that the particle stays put forever. However, it can be shown that there are other solutions in which the particle sits at the top for an arbitrary (finite) time before rolling down. For example, the particle might have been launched up the dome from some point with just enough kinetic energy to reach the top, where it is momentarily at rest before rolling down again.

Norton argues that this problem demonstrates a certain kind of indeterminism in Newtonian Mechanics. The mathematical problem with the specified initial conditions clearly has a solution in which the ball stays at the top forever. This solution is unstable, which is a familiar situation in mechanics, but this equilibrium has an unusual property related to the absence of Lipschitz continuity. One might expect that an infinitesimal asymmetric perturbation of the particle, or of the shape of the surface, would be needed to send the particle rolling down the slope, but in this case no such perturbation is required. This is because there isn’t just one solution with zero velocity at the equilibrium, but an entire family, as described above. This is both curious and interesting, and it does raise the question of how to define a probability measure over these solutions.
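Norton’s example is easy to check explicitly. In his units the equation of motion along the surface reduces to d²r/dt² = √r, and besides the trivial solution r = 0 there is a family of solutions r(t) = (t − T)⁴/144 that leave the apex at an arbitrary time T. A quick sympy verification (my own sketch, using Norton’s published form of the solution):

```python
import sympy as sp

# Time elapsed since the particle leaves the apex: s = t - T >= 0
s = sp.symbols('s', nonnegative=True)
r = s**4 / 144                       # candidate non-trivial solution

# Norton's equation of motion (in his units): d^2 r / dt^2 = sqrt(r)
residual = sp.diff(r, s, 2) - sp.sqrt(r)
print(sp.simplify(residual))         # -> 0, so the quartic really is a solution

# At s = 0 the particle is at the apex with zero velocity, exactly the
# same initial data as the trivial solution r(t) = 0: indeterminism.
print(r.subs(s, 0), sp.diff(r, s).subs(s, 0))   # -> 0 0
```

Since T is arbitrary, every choice of departure time gives a distinct solution satisfying the same initial conditions, which is the whole point of the example.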

I don’t really want to go into the philosophical implications of this cute example, but it did strike me that there’s a similarity with an interesting issue in cosmology that I’ve blogged about before (in different terms).

This probably seems to have very little to do with physical cosmology, but now forget about domes and think instead about the behaviour of the mathematical models that describe the Big Bang. To keep things simple, I’m going to ignore the cosmological constant and just consider how things depend on one parameter, the density parameter Ω0. This is basically the ratio of the present density of matter in the Universe to the value it would need to have for the expansion of the Universe eventually to halt. To put it a slightly different way, it measures the total energy of the Universe. If Ω0>1 then the total energy of the Universe is negative: its (negative) gravitational potential energy dominates over the (positive) kinetic energy. If Ω0<1 then the total energy is positive: kinetic trumps potential. If Ω0=1 exactly then the Universe has zero total energy: the two contributions are precisely balanced, like a man on a tightrope.

A key point, however, is that the trade-off between positive and negative energy contributions changes with time. The result of this is that Ω is not fixed at the same value forever, but changes with cosmic epoch; we use Ω0 to denote the value that it takes now, at cosmic time t0, but it changes with time.

At the beginning, i.e. at the Big Bang itself,  all the Friedmann models begin with Ω arbitrarily close to unity at arbitrarily early times, i.e. the limit as t tends to zero is Ω=1.

In the case in which the Universe emerges from the Big Bang with a value of Ω just a tiny bit greater than one, it expands to a maximum size, at which point the expansion stops. During this process Ω grows without bound: gravitational energy wins out over its kinetic opponent.

If, on the other hand, Ω sets out slightly less than unity – and I mean slightly, one part in 10^60 will do – the Universe evolves to a state where Ω is very close to zero. In this case kinetic energy is the winner and Ω ends up on the ground, mathematically speaking.

In the compromise situation with total energy zero, this exact balance always applies. The universe is always described by Ω=1. It walks the cosmic tightrope. But any small deviation early on results in runaway expansion or catastrophic recollapse. To get anywhere close to Ω=1 now – I mean even within a factor ten either way – the Universe has to be finely tuned.

The evolution of Ω is neatly illustrated by the following phase-plane diagram (taken from an old paper by Madsen & Ellis) describing a cosmological model involving a perfect fluid with an equation of state p=(γ-1)ρc². This is what happens for γ>2/3 (which includes dust, relativistic particles, etc.):

[Image: phase-plane diagram for the case γ>2/3, from Madsen & Ellis]

The top panel shows how the density parameter evolves with scale factor S; the bottom panel shows a completion of this portrait obtained using a transformation that allows the point at infinity to be plotted on a finite piece of paper (or computer screen).

As discussed above this picture shows that all these Friedmann models begin at S=0 with Ω arbitrarily close to unity and that the value of Ω=1 is an unstable fixed point, just like the situation of the particle at the top of the dome. If the universe has Ω=1 exactly at some time then it will stay that way forever. If it is perturbed, however, then it will eventually diverge and end up collapsing (Ω>1) or going into free expansion (Ω<1).  The smaller the initial perturbation,  the longer the system stays close to Ω=1.
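The instability of the Ω=1 fixed point is easy to see in a toy numerical experiment. For a perfect fluid with p=(γ-1)ρc², the Friedmann equations give dΩ/d ln a = (3γ-2)Ω(Ω-1). This sketch (my own, not from the Madsen & Ellis paper) perturbs Ω away from unity by one part in 10⁹ in either direction and integrates forward:

```python
from scipy.integrate import solve_ivp

# dOmega/d(ln a) = (3*gamma - 2) * Omega * (Omega - 1) for a perfect
# fluid with p = (gamma - 1) rho c^2.  Pressureless matter: gamma = 1 > 2/3.
GAMMA = 1.0

def domega(lna, omega):
    return (3.0 * GAMMA - 2.0) * omega[0] * (omega[0] - 1.0)

def runaway(lna, omega):          # stop the integration if Omega blows up
    return omega[0] - 100.0
runaway.terminal = True

# Perturb Omega away from unity by one part in 10^9, either way
for eps in (1e-9, -1e-9):
    sol = solve_ivp(domega, (0.0, 30.0), [1.0 + eps],
                    events=runaway, rtol=1e-10, atol=1e-14)
    print(f"Omega(0) = 1 {eps:+.0e}  ->  Omega(final) = {sol.y[0, -1]:.3g}")
```

The positive perturbation runs away towards recollapse after a few tens of e-folds, while the negative one drives Ω towards zero: exactly the two families of trajectories in the phase-plane diagram.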

The fact that all trajectories start at Ω(S=0)=1 means that one has to be very careful in assigning some sort of probability measure on this parameter, just as is the case with the Norton’s Dome problem I started with. About twenty years ago, Guillaume Evrard and I tried to put this argument on firmer mathematical grounds by assigning a sensible prior probability to Ω based on nothing other than the assumption that our Universe is described by a Friedmann model.

The result we got was that it should be of the form

P(\Omega) \propto \Omega^{-1}(\Omega-1)^{-1}.

I was very pleased with this result, which is based on a principle advanced by physicist Ed Jaynes, but I have no space to go through the mathematics here. Note, however, that this prior has three interesting properties: it is infinite at Ω=0 and Ω=1, and it has a very long “tail” for very large values of Ω. It’s not a very well-behaved measure, in the sense that it can’t be integrated over, but that’s not an unusual state of affairs in this game. In fact it is what is called an improper prior.
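The improper nature of this prior is easy to confirm with a computer algebra system. The following check (mine, not from the original paper) shows that the antiderivative diverges at the Ω=1 endpoint, while the long tail, which falls off like Ω⁻², does integrate to something finite:

```python
import sympy as sp

Omega = sp.symbols('Omega', positive=True)
prior = 1 / (Omega * (Omega - 1))              # P(Omega), up to normalisation

# Antiderivative: log((Omega - 1)/Omega), up to branch choices
F = sp.integrate(prior, Omega)
print(F)

# Logarithmic divergence approaching Omega = 1 from above...
print(sp.limit(F, Omega, 1, '+'))              # -> -oo

# ...but the large-Omega tail behaves like Omega^-2 and is integrable
print(sp.integrate(prior, (Omega, 2, sp.oo)))  # -> log(2)
```

So the measure cannot be normalised because of its behaviour at the special points Ω=0 and Ω=1, not because of the tail.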

I think of this prior as being the probabilistic equivalent of Mark Twain’s description of a horse:

dangerous at both ends, and uncomfortable in the middle.

Of course the prior probability doesn’t tell us all that much. To make further progress we have to make measurements, form a likelihood and then, like good Bayesians, work out the posterior probability. In fields where there is a lot of reliable data the prior becomes irrelevant and the likelihood rules the roost. We weren’t in that situation in 1995 – and we’re arguably still not – so we should still be guided, to some extent, by what the prior tells us.

The form we found suggests that we can indeed reasonably assign most of our prior probability to the three special cases I have described. Since we also know that the Universe is neither totally empty nor ready to collapse, it does indicate that, in the absence of compelling evidence to the contrary, it is quite reasonable to have a prior preference for the case Ω=1.  Until the late 1980s there was indeed a strong ideological preference for models with Ω=1 exactly, but not because of the rather simple argument given above but because of the idea of cosmic inflation.

From recent observations we now know, or think we know, that Ω is roughly 0.26. To put it another way, this means that the Universe has roughly 26% of the density it would need to have to halt the cosmic expansion at some point in the future. Curiously, this corresponds precisely to the unlikely or “fine-tuned” case where our Universe is in between  two states in which we might have expected it to lie.

Even if you accept my argument that Ω=1 is a special case that is in principle possible, it is still the case that it requires the Universe to have been set up with very precisely defined initial conditions. Cosmology can always appeal to special initial conditions to get itself out of trouble because we don’t know how to describe the beginning properly, but it is much more satisfactory if properties of our Universe are explained by understanding the physical processes involved rather than by simply saying that “things are the way they are because they were the way they were.” The latter statement remains true, but it does not enhance our understanding significantly. It’s better to look for a more fundamental explanation because, even if the search is ultimately fruitless, we might turn over a few interesting stones along the way.

The reasoning behind cosmic inflation admits the possibility that, for a very short period in its very early stages, the Universe went through a phase where it was dominated by a third form of energy, vacuum energy. This forces the cosmic expansion to accelerate; this means basically that the equation of state of the contents of the universe is described by γ<2/3 rather than the case γ>2/3 described above. This drastically changes the arguments I gave above.

Without inflation the case Ω=1 is unstable: a slight perturbation sends the Universe diverging towards a Big Crunch or a Big Freeze. While inflationary dynamics dominate, however, this case behaves very differently: not only is it stable, it becomes an attractor to which all possible universes converge. Here’s what the phase plane looks like in this case:

[Image: phase-plane diagram for the inflationary case, with Ω=1 as an attractor]

 

Whatever the pre-inflationary initial conditions, the Universe will emerge from inflation with Ω very close to unity.
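The attractor behaviour can be demonstrated with the same toy evolution equation as in the non-inflationary case, dΩ/d ln a = (3γ-2)Ω(Ω-1), which follows from the Friedmann equations for a perfect fluid with p=(γ-1)ρc². Setting γ=0 for vacuum energy makes the coefficient negative, and this sketch (my own) shows wildly different starting values being funnelled towards Ω=1:

```python
from scipy.integrate import solve_ivp

# dOmega/d(ln a) = (3*gamma - 2) * Omega * (Omega - 1) for a perfect fluid
# with p = (gamma - 1) rho c^2.  Vacuum energy has gamma = 0, so the
# coefficient is negative and Omega = 1 switches to being an attractor.
GAMMA = 0.0

def domega(lna, omega):
    return (3.0 * GAMMA - 2.0) * omega[0] * (omega[0] - 1.0)

# Very different pre-inflationary values all converge on Omega = 1
for omega0 in (0.01, 0.5, 5.0, 100.0):
    sol = solve_ivp(domega, (0.0, 60.0), [omega0], rtol=1e-10, atol=1e-14)
    print(f"Omega(start) = {omega0:6.2f}  ->  "
          f"Omega(after 60 e-folds) = {sol.y[0, -1]:.9f}")
```

After sixty e-folds of vacuum-dominated expansion, all four trajectories are indistinguishable from Ω=1 to many decimal places.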

So how can we reconcile inflation with current observations that suggest a low matter density? The key to this question is that what inflation really does is expand the Universe by such a large factor that the curvature radius becomes infinitesimally small. If there is only “ordinary” matter in the Universe then this requires that the universe have the critical density. However, in Einstein’s theory the curvature is zero only if the total energy is zero. If there are other contributions to the global energy budget besides that associated with familiar material then one can have a low value of the matter density as well as zero curvature. The missing link is dark energy, and the independent evidence we now have for it provides a neat resolution of this problem.

Or does it? Although spatial curvature doesn’t really care about what form of energy causes it, it is surprising to some extent that the dark matter and dark energy densities are similar. To many minds this unexplained coincidence is a blemish on the face of an otherwise rather attractive structure.

It can be argued that there are initial conditions for non-inflationary models that lead to a Universe like ours. This is true. It is not logically necessary to have inflation in order for the Friedmann models to describe a Universe like the one we live in. On the other hand, it does seem to be a reasonable argument that the set of initial data that is consistent with observations is larger in models with inflation than in those without it. It is rational therefore to say that inflation is more probable to have happened than the alternative.

I am not totally convinced by this reasoning myself, because we still do not know how to put a reasonable measure on the space of possibilities existing prior to inflation. This would have to emerge from a theory of quantum gravity which we don’t have. Nevertheless, inflation is a truly beautiful idea that provides a framework for understanding the early Universe that is both elegant and compelling. So much so, in fact, that I almost believe it.

 

Do Primordial Fluctuations have a Quantum Origin?

Posted in The Universe and Stuff on October 21, 2015 by telescoper

A quick lunchtime post containing a confession and a question, both inspired by an interesting paper I found recently on the arXiv with the abstract:

We investigate the quantumness of primordial cosmological fluctuations and its detectability. The quantum discord of inflationary perturbations is calculated for an arbitrary splitting of the system, and shown to be very large on super-Hubble scales. This entails the presence of large quantum correlations, due to the entangled production of particles with opposite momentums during inflation. To determine how this is reflected at the observational level, we study whether quantum correlators can be reproduced by a non-discordant state, i.e. a state with vanishing discord that contains classical correlations only. We demonstrate that this can be done for the power spectrum, the price to pay being twofold: first, large errors in other two-point correlation functions, that cannot however be detected since hidden in the decaying mode; second, the presence of intrinsic non-Gaussianity the detectability of which remains to be determined but which could possibly rule out a non-discordant description of the Cosmic Microwave Background. If one abandons the idea that perturbations should be modeled by Quantum Mechanics and wants to use a classical stochastic formalism instead, we show that any two-point correlators on super-Hubble scales can exactly be reproduced regardless of the squeezing of the system. The later becomes important only for higher order correlation functions, that can be accurately reproduced only in the strong squeezing regime.

I won’t comment on the use of the word “quantumness” nor the plural “momentums”….

My confession is that I’ve never really followed the logic that connects the appearance of classical fluctuations to the quantum description of fields in models of the early Universe. People have pointed me to papers that claim to spell this out, but they all seem to miss the important business of what it means to “become classical” in the cosmological setting. My question, therefore, is can anyone please point me to a book or a paper that addresses this issue rigorously?

Please let me know through the comments box, which you can also use to comment on the paper itself…

Planck Update

Posted in The Universe and Stuff on February 5, 2015 by telescoper

Just time for a very quick post today to pass on the news that most of the 2015 crop of papers from the Planck mission have now been released and are available to download here. You can also find some related data products here.

I haven’t had time to look at these in any detail myself, but my attention was drawn (in the light of the recently-released combined analysis of Planck and BICEP2/Keck data) to the constraints on inflationary cosmological models shown in this figure:

[Image: Planck 2015 constraints on inflationary models]

It seems that the once-popular (because it is simple) m^2 \phi^2 model of inflation is excluded at greater than 99% confidence…
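For context, the textbook leading-order slow-roll predictions for this model are easy to write down, and the culprit is the tensor-to-scalar ratio. The numbers below are the standard lowest-order approximations, not taken from the Planck papers themselves:

```python
# Leading-order slow-roll predictions for V = (1/2) m^2 phi^2:
# in reduced Planck units epsilon = eta = 2/phi^2, and phi^2 ~ 4N with
# N e-folds of inflation remaining, giving n_s ~ 1 - 2/N and r ~ 8/N.
for N in (50, 60):
    n_s = 1.0 - 2.0 / N       # scalar spectral index
    r = 8.0 / N               # tensor-to-scalar ratio
    print(f"N = {N}: n_s = {n_s:.3f}, r = {r:.3f}")
```

Even the optimistic N = 60 case gives r ≈ 0.13, which sits above the region favoured by the combined Planck/BICEP2/Keck analysis, and that is why the model falls outside the contours in the figure.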

Feel free to add reactions to any of the papers in the new release via the comments box!

BICEP2 bites the dust.. or does it?

Posted in Bad Statistics, Open Access, Science Politics, The Universe and Stuff on September 22, 2014 by telescoper

Well, it’s come about three weeks later than I suggested – you should know that you can never trust anything you read in a blog – but the long-awaited Planck analysis of polarized dust emission from our Galaxy has now hit the arXiv. Here is the abstract, which you can click on to make it larger:

[Image: abstract of the Planck paper on polarized dust emission]

My twitter feed was already alive with reactions to the paper when I woke up at 6am, so I’m already a bit late on the story, but I couldn’t resist a quick comment or two.

The bottom line is of course that the polarized emission from Galactic dust is much larger in the BICEP2 field than had been anticipated in the BICEP2 analysis of their data (now published  in Physical Review Letters after being refereed). Indeed, as the abstract states, the actual dust contamination in the BICEP2 field is subject to considerable statistical and systematic uncertainties, but seems to be around the same level as BICEP2’s claimed detection. In other words the Planck analysis shows that the BICEP2 result is completely consistent with what is now known about polarized dust emission.  To put it bluntly, the Planck analysis shows that the claim that primordial gravitational waves had been detected was premature, to say the least. I remind you that the original  BICEP2 result was spun as a ‘7σ’ detection of a primordial polarization signal associated with gravitational waves. This level of confidence is now known to have been false.  I’m going to resist (for the time being) another rant about p-values

Although it is consistent with being entirely dust, the Planck analysis does not entirely kill off the idea that there might be a primordial contribution to the BICEP2 measurement, which could be of similar amplitude to the dust signal. However, identifying and extracting that signal will require the much more sophisticated joint analysis alluded to in the final sentence of the abstract above. Planck and BICEP2 have differing strengths and weaknesses, and a joint analysis will benefit from considerable complementarity. Planck has wider spectral coverage and has mapped the entire sky; BICEP2 is more sensitive, but works at only one frequency and covers only a relatively small field of view. Between them they may be able to identify an excess source of polarization over and above the foreground, so it is not impossible that a gravitational-wave component may be isolated. That will be a tough job, however, and there’s by no means any guarantee that it will work. We will just have to wait and see.

In the mean time let’s see how big an effect this paper has on my poll:

 

 

Note also that the abstract states:

We show that even in the faintest dust-emitting regions there are no “clean” windows where primordial CMB B-mode polarization could be measured without subtraction of dust emission.

It is as I always thought. Our Galaxy is a rather grubby place to live. Even the windows are filthy. It’s far too dusty for fussy cosmologists, who need to have everything just so, but probably fine for astrophysicists who generally like mucking about and getting their hands dirty…

This discussion suggests that a confident detection of B-modes from primordial gravitational waves (if there is one to detect) may have to wait for a sensitive all-sky experiment, which would have to be done in space. On the other hand, Planck has identified some regions which appear to be significantly less contaminated than the BICEP2 field (which is outlined in black):

[Image: Planck map of less dust-contaminated regions, with the BICEP2 field outlined in black]

Could it be possible to direct some of the ongoing ground- or balloon-based CMB polarization experiments towards the cleaner region (the dark blue area in the right-hand panel) just south of the BICEP2 field?

From a theorist’s perspective, I think this result means that all the models of the early Universe we thought were dead because they couldn’t produce the high level of primordial gravitational waves detected by BICEP2 have now come back to life, while those that came to life to explain the BICEP2 result may soon be read the last rites if the signal turns out to be predominantly dust.

Another important thing that remains to be seen is the extent to which the extraordinary media hype surrounding the announcement back in March will affect the credibility of the BICEP2 team itself and indeed the cosmological community as a whole. On the one hand, there’s nothing wrong with what has happened from a scientific point of view: results get scrutinized, tested, and sometimes refuted.  To that extent all this episode demonstrates is that science works.  On the other hand most of this stuff usually goes on behind the scenes as far as the public are concerned. The BICEP2 team decided to announce their results by press conference before they had been subjected to proper peer review. I’m sure they made that decision because they were confident in their results, but it now looks like it may have backfired rather badly. I think the public needs to understand more about how science functions as a process, often very messily, but how much of this mess should be out in the open?

 

UPDATE: Here’s a piece by Jonathan Amos on the BBC Website about the story.

ANOTHER UPDATE: Here’s the Physics World take on the story.

ANOTHER OTHER UPDATE: A National Geographic story

BICEP2 Redux: How the Sausage is Made

Posted in The Universe and Stuff on July 6, 2014 by telescoper

I came across this (rather lengthy, but very good) discussion of the BICEP2 story so far so thought I would share it here. There’s a particularly useful collection of articles at the end for those who would like to read more.

I’ll also take this opportunity to refer you to a recent BBC News story which states that the BICEP2 and Planck teams are now in discussions about sharing data. About time, if you ask me. Still, it will take a considerable time to work out the ordering of the authors if they ever do write a paper!

Whiskey...Tango...Foxtrot?

An ongoing problem with communicating science to the general public is the existence of widely-held misconceptions among the public regarding how science actually works. A case in point is the March 17 announcement by the BICEP2 Collaboration regarding the detection of B-Mode polarization in the Cosmic Microwave Background and the events which have unfolded since then.

All too often, news stories and blog posts will trumpet some announcement with sensational headlines like “Scientists Say Cheap, Efficient Solar Cells Just Around the Corner”, or “Scientists Close in on Cure for Cancer.” Many people take such announcements at face value and consider the case closed. The work has been done.  The reality of the situation, however, is that the initial announcement of a discovery or breakthrough is just the beginning of the hard work, breathlessly hyped headlines notwithstanding.

How Science Actually Works (or at least how it is supposed to work)

Once…
