Archive for September, 2009

Astrostats

Posted in Bad Statistics, The Universe and Stuff on September 20, 2009 by telescoper

A few weeks ago I posted an item on the theme of how gambling games were good for the development of probability theory. That piece  contained a mention of one astronomer (Christiaan Huygens), but I wanted to take the story on a little bit to make the historical connection between astronomy and statistics more explicit.

Once the basics of mathematical probability had been worked out, it became possible to think about applying probabilistic notions to problems in natural philosophy. Not surprisingly, many of these problems were of astronomical origin but, on the way, the astronomers that tackled them also derived some of the basic concepts of statistical theory and practice. Statistics wasn’t just something that astronomers took off the shelf and used; they made fundamental contributions to the development of the subject itself.

The modern subject we now know as physics really began in the 16th and 17th centuries, although at that time it was usually called Natural Philosophy. The greatest early work in theoretical physics was undoubtedly Newton’s great Principia, published in 1687, which presented his theory of universal gravitation and, together with his famous three laws of motion, enabled him to account for the orbits of the planets around the Sun. But majestic though Newton’s achievements undoubtedly were, I think it is fair to say that the originator of modern physics was Galileo Galilei.

Galileo wasn’t as much of a mathematical genius as Newton, but he was highly imaginative, versatile and (very much unlike Newton) had an outgoing personality. He was also an able musician, fine artist and talented writer: in other words a true Renaissance man.  His fame as a scientist largely depends on discoveries he made with the telescope. In particular, in 1610 he observed the four largest satellites of Jupiter, the phases of Venus and sunspots. He immediately leapt to the conclusion that not everything in the sky could be orbiting the Earth and openly promoted the Copernican view that the Sun was at the centre of the solar system with the planets orbiting around it. The Catholic Church was resistant to these ideas. He was hauled up in front of the Inquisition and placed under house arrest. He died in the year Newton was born (1642).

These aspects of Galileo’s life are probably familiar to most readers, but hidden away among his scientific manuscripts and notebooks is an important first step towards a systematic method of statistical data analysis. Galileo performed numerous experiments, though he certainly did not carry out the one with which he is most commonly credited. He did establish that the speed at which bodies fall is independent of their weight, but not by dropping things off the Leaning Tower of Pisa: he did it by rolling balls down inclined slopes. In the course of his numerous forays into experimental physics Galileo realised that, however carefully he took his measurements, the simplicity of the equipment available to him left him with quite large uncertainties in some of the results. He was able to estimate the accuracy of his measurements using repeated trials, and sometimes ended up with a situation in which some measurements had larger estimated errors than others. This is a common occurrence in many kinds of experiment to this day.
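The procedure Galileo hit upon is essentially what we still do. As a minimal sketch (the measurement values here are made up for illustration), repeated trials of the same quantity give both an estimate of the quantity and an estimate of the uncertainty on it:

```python
import math

# Hypothetical repeated measurements of the same quantity
# (units arbitrary); the spread tells us the accuracy.
trials = [9.6, 10.1, 9.8, 10.4, 9.9, 10.2]

n = len(trials)
mean = sum(trials) / n

# Sample standard deviation (n - 1 in the denominator)
variance = sum((t - mean) ** 2 for t in trials) / (n - 1)
std_dev = math.sqrt(variance)

# The standard error of the mean shrinks as 1/sqrt(n):
# more repeated trials give a more accurate estimate.
std_err = std_dev / math.sqrt(n)

print(f"mean = {mean:.3f} +/- {std_err:.3f}")
```

The 1/√n behaviour is exactly why repeating a trial is worth the tedium: quadrupling the number of measurements halves the uncertainty.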

Very often the problem we have in front of us is to measure two variables in an experiment, say X and Y. It doesn’t really matter what these two things are, except that X is assumed to be something one can control or measure easily and Y is whatever it is the experiment is supposed to yield information about. In order to establish whether there is a relationship between X and Y one can imagine a series of experiments where X is systematically varied and the resulting Y measured.  The pairs of (X,Y) values can then be plotted on a graph like the example shown in the Figure.

[Figure: measured (X,Y) pairs scattered about a straight line]

In this example it certainly looks like there is a straight line linking Y and X, but with small deviations above and below the line caused by the errors in measurement of Y. You could quite easily take a ruler and draw a line of “best fit” by eye through these measurements. I spent many a tedious afternoon in the physics labs doing this sort of thing when I was at school. Ideally, though, what one wants is some procedure for fitting a mathematical function to a set of data automatically, without requiring any subjective intervention or artistic skill. Galileo found a way to do this. Imagine you have a set of pairs of measurements (xi, yi) to which you would like to fit a straight line of the form y = mx + c. One way to do it is to find the line that minimizes some measure of the spread of the measured values around the theoretical line. The way Galileo did this was to work out the sum of the differences between the measured yi and the values mxi + c predicted at the measured values x = xi. He used the absolute difference |yi − (mxi + c)| so that the resulting optimal line would, roughly speaking, have as many of the measured points above it as below it. This general idea is now part of the standard practice of data analysis and, as far as I am aware, Galileo was the first scientist to grapple with the problem of dealing properly with experimental error.

[Figure: the fitted line and the measurement errors about it]
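Galileo’s criterion of minimising the total absolute deviation is easy to try numerically. Here is a rough sketch with invented data; a crude grid search stands in for the linear programming a modern treatment would use:

```python
# Fit y = m*x + c by minimising the sum of absolute deviations
# |y_i - (m*x_i + c)|, the criterion described above.
# The data and grid ranges are invented for illustration.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]  # roughly y = 2x with noise

def abs_loss(m, c):
    """Total absolute deviation of the data from the line y = m*x + c."""
    return sum(abs(y - (m * x + c)) for x, y in zip(xs, ys))

# Crude grid search over m in [1, 3] and c in [-1, 1], step 0.01.
best = min(
    ((m / 100, c / 100) for m in range(100, 301) for c in range(-100, 101)),
    key=lambda p: abs_loss(*p),
)
print(f"m ~ {best[0]:.2f}, c ~ {best[1]:.2f}")
```

One characteristic of this criterion is robustness: a single wildly wrong measurement pulls the absolute-deviation line far less than it pulls a least-squares line.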

The method used by Galileo was not quite the best way to crack the puzzle, but he had it almost right. It was again an astronomer who provided the missing piece and gave us essentially the same method used by statisticians (and astronomers) today.

Carl Friedrich Gauss was undoubtedly one of the greatest mathematicians of all time, so it might be objected that he wasn’t really an astronomer. Nevertheless he was director of the Observatory at Göttingen for most of his working life and was a keen observer and experimentalist. In 1809, he developed Galileo’s ideas into the method of least squares, which is still used today for curve fitting.

This approach follows basically the same procedure, but minimizes the sum of [yi − (mxi + c)]² rather than |yi − (mxi + c)|. This leads to a much more elegant mathematical treatment of the resulting deviations – the “residuals”. Gauss also did fundamental work on the mathematical theory of errors in general. The normal distribution is often called the Gaussian curve in his honour.
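Part of the elegance is that, unlike the absolute-deviation criterion, least squares has a closed-form solution: the best-fitting m and c follow directly from sums over the data, with no search required. A short sketch, again with invented data:

```python
# Least-squares fit of y = m*x + c using the standard closed-form
# solution obtained by minimising the sum of squared residuals.
# The data are invented for illustration.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
sx, sy = sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))

# Setting the derivatives of the squared loss to zero gives:
m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
c = (sy - m * sx) / n

residuals = [y - (m * x + c) for x, y in zip(xs, ys)]
print(f"m = {m:.3f}, c = {c:.3f}")
```

The residuals here are the deviations Gauss analysed; their squared sum is what the formulas above make as small as possible.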

After Galileo, the development of statistics as a means of data analysis in natural philosophy was dominated by astronomers. I can’t possibly go systematically through all the significant contributors, but I think it is worth devoting a paragraph or two to a few famous names.

I’ve already mentioned Jakob Bernoulli, whose famous book on probability was probably written during the 1690s. But Jakob was just one member of an extraordinary Swiss family that produced at least 11 important figures in the history of mathematics. Among them was Daniel Bernoulli, who was born in 1700. Along with the other members of his famous family, he had interests that ranged from astronomy to zoology. He is perhaps most famous for his work on fluid flows, which forms the basis of much of modern hydrodynamics, especially Bernoulli’s principle, which accounts for changes in pressure as a gas or liquid flows along a pipe of varying width.
But the elder Jakob’s work on gambling clearly also had some effect on Daniel, because in 1735 the younger Bernoulli published an exceptionally clever study involving the application of probability theory to astronomy. It had been known for centuries that the orbits of the planets are confined to the same band of the sky as seen from Earth, a narrow strip called the Zodiac. This is because the Earth and the planets orbit in approximately the same plane around the Sun. The Sun’s path in the sky as the Earth revolves also follows the Zodiac. We now know that the flattened shape of the Solar System holds clues to the processes by which it formed from a rotating cloud of cosmic debris that settled into a disk from which the planets eventually condensed, but this idea was not well established in the time of Daniel Bernoulli. He set himself the challenge of figuring out how likely it was that the planets would all orbit in so nearly the same plane purely by chance, rather than because some physical process confined them to the plane of a protoplanetary disk. His conclusion? The odds against the inclinations of the planetary orbits being aligned by chance were, well, astronomical.
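We don’t know Bernoulli’s exact calculation from this description, but a toy version of the argument is easy to set up: suppose each planet’s orbital pole pointed in a random direction, and ask how likely it is that all the planets then known would land within a few degrees of a common plane. The tolerance band and planet count below are illustrative assumptions:

```python
import math

# A toy version of Daniel Bernoulli's argument (not his actual
# calculation): if each orbital pole were oriented at random,
# what is the chance that all the orbits lie within a narrow
# band about a common plane?
theta = math.radians(7.0)   # assumed tolerance, roughly the Zodiac's half-width
n_planets = 5               # planets known besides Earth in 1735

# For a pole distributed uniformly over the sphere, the chance of
# an inclination of at most theta is 1 - cos(theta).
p_one = 1 - math.cos(theta)
p_all = p_one ** n_planets

print(f"P(single orbit aligned) = {p_one:.4f}")
print(f"P(all {n_planets} aligned by chance) = {p_all:.2e}")
```

With these assumptions the joint probability comes out around 10⁻¹¹, which is the sense in which the odds against a chance alignment really are astronomical.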

The next “famous” figure I want to mention is not at all as famous as he should be. John Michell was a Cambridge graduate in divinity who became a village rector near Leeds. His most important idea was the suggestion he made in 1783 that sufficiently massive stars could generate such a strong gravitational pull that light would be unable to escape from them.  These objects are now known as black holes (although the name was coined much later by John Archibald Wheeler). In the context of this story, however, he deserves recognition for his use of a statistical argument that the number of close pairs of stars seen in the sky could not arise by chance. He argued that they had to be physically associated, not fortuitous alignments. Michell is therefore credited with the discovery of double stars (or binaries), although compelling observational confirmation had to wait until William Herschel’s work of 1803.

It is impossible to overestimate the importance of the role played by Pierre Simon, Marquis de Laplace in the development of statistical theory. His book A Philosophical Essay on Probabilities, which began as an introduction to a much longer and more mathematical work, is probably the first time that a complete framework for the calculation and interpretation of probabilities ever appeared in print. First published in 1814, it is astonishingly modern in outlook.

Laplace began his scientific career as an assistant to Antoine Laurent Lavoisier, one of the founding fathers of chemistry. Laplace’s most important work was in astronomy, specifically in celestial mechanics, which involves explaining the motions of the heavenly bodies using the mathematical theory of dynamics. In 1796 he proposed the theory that the planets were formed from a rotating disk of gas and dust, which is in accord with the earlier conclusion of Daniel Bernoulli that the planetary orbits could not be randomly oriented. In 1776 Laplace had also figured out a way of determining the average inclination of the planetary orbits.

A clutch of astronomers, including Laplace, also played important roles in the establishment of the Gaussian or normal distribution. I have already mentioned Gauss’s own part in this story, but other famous astronomers played their part too. The importance of the Gaussian distribution owes a great deal to a mathematical property called the Central Limit Theorem: the distribution of the sum of a large number of independent variables tends to have the Gaussian form. Laplace proved a special case of this theorem in 1810, and Gauss himself also discussed it at length.
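The theorem is easy to see numerically. In this sketch, sums of 50 uniform random numbers (each individually far from Gaussian) pile up in the bell shape the Central Limit Theorem predicts:

```python
import random

# Numerical illustration of the Central Limit Theorem: sums of many
# independent uniform variables cluster around the expected mean with
# the Gaussian spread the theorem predicts.
random.seed(42)
n_terms, n_sums = 50, 10000

# Each uniform(0,1) has mean 1/2 and variance 1/12, so the sum of 50
# has mean 25 and standard deviation sqrt(50/12) ~ 2.04.
sums = [sum(random.random() for _ in range(n_terms)) for _ in range(n_sums)]

mean = sum(sums) / n_sums
within_1sigma = sum(1 for s in sums if abs(s - 25.0) <= 2.041) / n_sums

print(f"mean of sums ~ {mean:.2f} (expected 25)")
print(f"fraction within 1 sigma ~ {within_1sigma:.3f} (Gaussian: 0.683)")
```

The fraction within one standard deviation of the mean lands close to the Gaussian value of 68.3%, even though none of the underlying variables is remotely Gaussian.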

A general proof of the Central Limit Theorem was finally furnished in 1838 by another astronomer, Friedrich Wilhelm Bessel – best known to physicists for the functions named after him – who in the same year was also the first man to measure a star’s distance using the method of parallax. Finally, the name “normal” distribution was coined in 1850 by yet another astronomer, John Herschel, son of William Herschel.

I hope this gets the message across that the histories of statistics and astronomy are very closely linked. Aspiring young astronomers are often dismayed, when they enter research, by the amount of statistical work they need to do. I’ve often complained that physics and astronomy education at universities usually includes almost nothing about statistics, even though it is the one thing you can practically guarantee to use as a researcher in any branch of the subject.

Over the years, statistics has become regarded as slightly disreputable by many physicists, perhaps echoing Rutherford’s comment along the lines of “If your experiment needs statistics, you ought to have done a better experiment”. That’s a silly statement anyway because all experiments have some form of error that must be treated statistically, but it is particularly inapplicable to astronomy which is not experimental but observational. Astronomers need to do statistics, and we owe it to the memory of all the great scientists I mentioned above to do our statistics properly.

La Traviata

Posted in Opera on September 19, 2009 by telescoper

Summer must be over: the students are returning to University next week;  the cricket season is just about to end; the football season is well under way; the Last Night of the Proms is all done and dusted. But at least it all means the Opera season has started again!

Last night I went to the Wales Millennium Centre to see Welsh National Opera’s production of La Traviata by Giuseppe Verdi. Actually, to be precise, this was a co-production with Scottish Opera, who supplied the sets, scenery and costumes. It was directed by David McVicar and was first staged in Scotland before transferring to Wales for this run.

La Traviata is one of the most enduringly popular of all operas – and is one of the most frequently performed. It’s quite curious that its first performance in Venice was a complete disaster and it took several revisions before it became established as part of the operatic repertoire. A production like the one we saw last night, however, makes it abundantly clear why it is such an evergreen classic. Act I in particular is just one memorable tune after another.

The opera is based on the novel La Dame Aux Camélias which later became a play with the same name. It tells the story of Violetta, a glamorous courtesan and flamboyant darling of the Paris party scene. She meets a young chap called Alfredo at a spectacular do in her house in Act I and he tells her he’s completely in love with her. She laughs him off and he departs crestfallen. When the party’s over and  he’s gone, though, she finds herself thinking about him. The trouble with Violetta is that she is already seriously ill with consumption (tuberculosis) at the start. She knows that she is doomed to die and is torn between her desire to be free and her growing love for Alfredo.

Cut to Act II, Scene 1, a few months later. Violetta and Alfredo are shacked up in a love nest away from Paris. While Alfredo is away paying off some of Violetta’s bills, Alfredo’s father Giorgio turns up and tries to convince Violetta to abandon her relationship with his son because its scandalous nature threatens the family’s prospects, particularly his daughter’s (Alfredo’s sister’s) plans to get married. Violetta eventually agrees to do a runner. Alfredo returns and meets his father, who tries to convince him to return to his family in Provence. Alfredo is distraught to hear of Violetta’s departure, refuses to go with his father, and vows to find Violetta again.

Scene 2 is back in Paris, at the house of a lady called Flora. There’s a lot of singing and dancing and general riotousness. Alfredo turns up, slightly the worse for drink, and proceeds to gamble (winning a huge amount of money). Violetta turns up and Alfredo insults her by throwing his winnings at her. He’s then overcome by remorse, but Baron Douphol, a wealthy friend of Violetta, is outraged and challenges Alfredo to a duel.

Act III is set a few months later in Violetta’s bedroom where she’s clearly dying. Alfredo has run off after wounding the Baron in a duel. The doctor gives Violetta just a few hours to live. Alfredo returns. The lovers forgive each other and embrace. Violetta dies.

In this performance Violetta was the Greek soprano Myrtò Papatanasiu, a name that’s quite new to me. She’s tall, elegant and has a lovely voice. Violetta is quite a demanding role – there are several tricky coloratura passages to cope with – but her character is quite complicated too. Although we know she’s ill right from the start, she’s not by any means a passive victim. She’s a courtesan who has clearly put it about a bit, but she’s also got a strong moral sense. She’s vulnerable, but also at times very strong. I thought Myrtò Papatanasiu was a wonderful Violetta who not only sang beautifully but had a mesmerising stage presence.

The other star of the show (for me) was Dario Solari as Alfredo’s father. His richly textured baritone voice was a revelation to me. He was quite limited as an actor but musically excellent.

Tenor Alfie Boe’s Alfredo was less convincing. His voice was not as powerful as the other principals and at times he sounded very strained. He’s quite small in stature as well as voice and I found it hard to imagine that this particular Violetta would fall so dramatically for him. However Alfredo is torn between the powerful personalities of Violetta and his father so in a strange way his relative weakness worked out pretty well in that mixture.

The  look of the opera – staging and costumes – was also stunning. The Paris parties were riots of colour and movement with just as much debauchery as desired.

All in all an excellent production, which I thoroughly enjoyed from start to finish. It was so good, in fact, that even after seeing this opera many times, and knowing very well what was going to happen, I still found the final scene of Violetta’s death deeply moving. My love of Italian opera makes me regret even more that the UK will be leaving the European Union in 2020.

Finally, I should also mention that La Traviata has a wonderful overture. I’ll probably stop going to opera when I no longer get butterflies in my stomach during the overture. It’s childish but I still get excited like that sitting  in the theatre waiting for the performance to start. This overture certainly does that for me, and it also underlines the  underlying tragedy of the story. Opening with ghostly strings that presage Violetta’s inevitable death, it then bursts into one of the beautiful melodies that Verdi seemed to be able to produce at the drop of a hat. Genius.

A Well Placed Lecture

Posted in The Universe and Stuff on September 18, 2009 by telescoper

I noticed that the UK government has recently dropped its ban on product placement in television programmes. I wanted to take this opportunity to state Virgin Airlines that I will not be taking this as a Carling cue to introduce subliminal Coca Cola advertising of any Corby Trouser Press form into this blog.

This week I’ve been giving Marks and Spencer lectures every AIG afternoon to groups of 200 sixth form Samsung students on the subject of the Burger King Big Bang. The talks seemed to go down BMW quite well although I had Betfair trouble sometimes cramming all the Sainsbury things I wanted to talk about in the Northern Rock 30 minutes I was allotted. Anyway, I went through the usual stuff about the Carlsberg cosmic microwave background (CMB), even showing the noise on a Sony television screen to explain that a bit of the Classic FM signal came from the edge of the Next Universe.  The CMB played an Emirates important role in the talk as it is the Marlboro smoking gun of the Big Bang and established our Standard Life model of L’Oreal cosmology.

The timing of these lectures was Goodfella’s Pizza excellent because I was able to include Crown Paints references to the Hubble Ultra Deep Kentucky Fried Chicken Field and the Planck First Direct initial results that I’ve blogged about in the past week or so.

Now that’s all over, Thank God It’s Friday and I’m getting ready to go to the Comet Sale Now On Opera…

First Light from Planck!

Posted in The Universe and Stuff on September 17, 2009 by telescoper

Credit to Andrew Jaffe for alerting me to the fact that ESA’s first press release concerning Planck has now been, well, released…

I last blogged about Planck when it had reached its orbit around L2 and cooled down to its working temperature of 100 milliKelvin. Over the ensuing weeks it has been tested and calibrated, prodded and poked (electronically of course) and generally tuned up. More recently it has completed a “mini-survey” to check that it’s all working as planned.

The way Planck scans means that it takes about six months to cover the whole sky, which is much longer than the two-week period allowed for the mini-survey. This explains the fact that a relatively narrow slice of the celestial sphere has been mapped. However, you can see the foreground emission from the Galactic plane quite clearly. Here is the region shown in the box split into the nine separate frequency channels that Planck observes:

The High Frequency Instrument (HFI) is more sensitive to dust, while the Low Frequency Instrument (LFI) detects more radio emission. It all seems to be working as expected!

And finally here’s a blow up of the smaller square above the Galactic plane shown as seen by  LFI and HFI:

This region is much less prone to foreground emission. The fact that similar structures are seen in the two completely independent receivers shows that the structure is not just instrument noise. In other words, Planck is seeing the cosmic microwave background!

Now Planck will carry out its full survey, scanning the sky for another year or so. There will then be an intense period of data analysis for about another year after which the key science results will be published. Exciting times.

Birthday Blog

Posted in Columbo on September 16, 2009 by telescoper

It’s crept up on me, and I wouldn’t have noticed if it hadn’t been pointed out to me last week. Today, 16th September, is the anniversary of my first ever blog post, so today is this blog’s first birthday.

On behalf of Columbo and myself I thought I’d take this opportunity to thank everyone who has contributed with comments, questions, corrections for my numerous errors, and incoming links which have helped increase my readership to the stratospheric level of 300 hits a day.

Columbo himself was particularly excited by the news of this blog’s birthday, as you can tell from the picture below.

[Photo: Columbo]

In case you’re interested, the most popular post of the year was this one, with more hits than any other by an enormous factor. I realise that I could raise this blog’s profile further by adding similar off-colour content involving sexual innuendo, but I refuse to descend to such a level and will instead celebrate by showing a picture of a hot pussy.

[Photo: Columbo the cat]

Lessening Anomalies

Posted in Cosmic Anomalies, The Universe and Stuff on September 15, 2009 by telescoper

An interesting paper caught my eye on today’s ArXiv and I thought I’d post something here because it relates to an ongoing theme on this blog about the possibility that there might be anomalies in the observed pattern of temperature fluctuations in the cosmic microwave background (CMB). See my other posts here, here, here, here and here for related discussions.

One of the authors of the new paper, John Peacock, is an occasional commenter on this blog. He was also the Chief Inquisitor at my PhD (or rather DPhil) examination, which took place 21 years ago. The four-and-a-half hours of grilling I went through that afternoon reduced me to a gibbering wreck but the examiners obviously felt sorry for me and let me pass anyway. I’m not one to hold a grudge so I’ll resist the temptation to be churlish towards my erstwhile tormentor.

The new paper concerns the possible contribution of the integrated Sachs-Wolfe (ISW) effect to these anomalies. The ISW mechanism generates temperature variations in the CMB because photons travel along a line of sight through a time-varying gravitational potential between the last-scattering surface and the observer. The integrated effect is zero if the potential does not evolve, because the energy a photon gains falling into a potential well exactly balances what it loses climbing out again. If the well gets a bit deeper while the photon is in transit, however, there is a net contribution.

The specific thing about the ISW effect that makes it measurable is that the temperature variations it induces should correlate with the pattern of structure in the galaxy distribution, as it is these that generate the potential fluctuations through which CMB photons travel. Francis & Peacock try to assess the ISW contribution using data from the 2MASS all-sky survey of galaxies. This in itself contains important cosmological clues but in the context of this particular question it is a nuisance, like any other foreground contamination, so they subtract it off the maps obtained from the Wilkinson Microwave Anisotropy Probe (WMAP) in an attempt to get a cleaner map of the primordial CMB sky.

The results are shown in the picture below, which presents the lowest-order spherical harmonic modes, the quadrupole (left) and octopole (right), for the ISW component (top), the WMAP data (middle) and, at the bottom, the cleaned CMB sky (i.e. the middle minus the top). The ISW subtraction doesn’t make a huge difference to the visual appearance of the CMB maps, but it is enough to substantially reduce the statistical significance of at least some of the reported anomalies I mentioned above. This reinforces how careful we have to be in analysing the data before jumping to cosmological conclusions.

[Figure: quadrupole (left) and octopole (right) modes for the ISW component (top), the WMAP data (middle) and the cleaned CMB sky (bottom)]

There should also be a further contribution from fluctuations beyond the depth of the 2MASS survey (about 0.3 in redshift).  The actual ISW effect could therefore  be significantly larger than this estimate.

Making the Changes

Posted in Jazz on September 15, 2009 by telescoper

I often find myself trying to explain to people why I love listening to Jazz. Most people either don’t know much about it or don’t like it at all, especially if it’s “modern”. The trouble is, explaining why it’s so hard to play jazz doesn’t usually make people want to go and listen to it.  “There’s no proper tune”  and  “It’s just noise” are just a couple of the comments I heard in a pub a few weeks ago when somebody put a Miles Davis track on the internet jukebox.

It’s partly a matter of language, of course. The most exquisite Japanese poetry probably sounds like noise to a Westerner who can’t understand the language. When it comes to jazz,  even if you do know a bit about the music you’re by no means guaranteed an easy listening experience. But, played at the highest level, with a driving rhythm section and a star soloist improvising through a constantly shifting pattern of harmonies, there’s no music to match it for sheer white-knuckle intensity.

Far from being “just noise”,   jazz is a tightly disciplined musical form. The freedom given to the soloist to create their own melody comes in fact at a very high price because the melodic line of a jazz solo must constantly recalibrate itself in relationship to the harmonic changes going on beneath it. The chord progression within which the original melody was embedded provides the soloist with the challenge of playing something that fits as well as being new and interesting to listen to.  Usually the actual tune is played only briefly at the start and thereafter becomes pretty much irrelevant until recapitulated at the end of the performance. What really matters to a jazz soloist is not the original melody but the chords.

Each chord establishes a tonal centre and a related scale that  furnishes a reference frame in the space of possible musical notes. When the rest of the band makes the chord changes the soloist must transform to a different coordinate system. The progression of chords as the tune unfolds thus has the effect of pushing and pulling the soloist in different tonal directions. A great jazz solo requires strict adherence to this framework and it imposes tremendous discipline on all the musicians involved.

In a slow 12-bar blues the gravitational effect of the relatively simple chord pattern is especially strong, which is no doubt why it has such a powerfully expressive effect when the soloist plays a “blue note” such as a flattened fifth on top of major scale chords.

In more complicated tunes keeping your place within the constantly shifting harmonic framework is a real challenge, especially if the chord progression is complicated and especially at fast tempi in which the chord changes go flying past at a rate of knots. Such numbers turn into a rollercoaster ride for both musicians and audience.

It’s not just the speed of fingers that makes great soloists so electrifying, but their astonishing mental agility. I remember seeing the great saxophonist Sonny Stitt at Ronnie Scott’s club in London playing the jazz standard How High the Moon. Nothing unusual about that, because it’s part of the standard jazz repertoire. The thing was, though, that he played 12 choruses, each one in a different key. How he managed to keep track of everything is completely beyond me. I wasn’t the only one in the audience shaking his head in disbelief.

A while ago I posted Giant Steps by John Coltrane, a supreme piece of high-speed improvisation, and I thought I’d follow it up with this wonderful performance in which the legendary Charlie Parker plays an extended solo, very fast.

The tune is in fact a variation of a 1930s hit  called Cherokee. Most popular tunes have a 32 bar basic format of the type AABA, with B representing the bridge or middle eight. Cherokee has a similar structure, but is 64 bars long. Its chord progression is both complicated and unusual, with lots of changes to remember especially in the (16-bar) bridge which is fiendishly difficult to play. This makes it fertile ground for improvising on and it quickly became a standard test vehicle for jazz soloists and a yardstick by which saxophonists in particular tended to measure each other’s skill.

During the bebop era it became fairly common practice for musicians to borrow chord sequences from other tunes. Many Charlie Parker pieces, such as Anthropology, are based on the chords of I Got Rhythm, for example. There’s a famous story about a recording session involving Charlie Parker during which the band decided to do a version of Cherokee (i.e. using the chord sequence but with a different melody). During the take, however, they absent-mindedly played the actual melody rather than playing something else over the chords. There was a cry of anguish from the producer in the control room, who had hoped that if they stayed off the actual tune of Cherokee he wouldn’t have to pay composer’s royalties, and the performance ground to a halt. Shortly after, they did another take, called it Ko-ko, and it quickly became a bop classic. This is a later version of Ko-ko, played live, during which Bird runs through the changes like a man possessed. What it must be like to be able to play like this!

Cardiff City 0 Newcastle United 1

Posted in Football on September 13, 2009 by telescoper

I spent most of this afternoon at Cardiff City’s new stadium (which is just over the road from the old Ninian Park ground, in fact). The date of the fixture between Cardiff and Newcastle had been in my diary for weeks, but by the time I got round to buying tickets it was sold out except for the Premier seats at £65 a go. I decided to go for it anyway, and my colleague Derek (another astronomer) and I went in the posh lounge for drinks before, during and after the game. I even had the proverbial prawn sandwich. It makes a big difference having food and drink available before and during the match, and although I’d never been in the upmarket part of a football stadium for a match before, it’s something I could definitely get used to. In fact the comfort level was a bit more like you would find at the Opera (which I’m off to on Friday, as it happens) than at a football match. Although the chorus was not very tuneful I enjoyed their renditions of Chi è il bastardo in nero and l’arbitro è un coglione.

With seats at the top level of a packed stadium, we had an excellent view of the game. The atmosphere was brilliant – a contrast to the mid-week international I watched in an empty ground a few days ago.

Cardiff City were either very nervous in front of their first full house or perhaps just stunned by the horrible sight of Newcastle United's hideous away strip of two-tone yellow stripes, shown on the left modelled by defender Steven Taylor. It took the home side ages to settle, especially their back four, who looked jittery throughout the game.

Newcastle were all over Cardiff in the first half and it was no surprise when the away side scored, from a poorly defended corner which was eventually put away by Coloccini. Thereafter Cardiff attacked only sporadically. Chopra – an ex-Newcastle player himself – carved out one good chance, but Rae skied his shot. The Toon were comfortably 1-0 up at half time.

There weren't many clear-cut chances in the second half, with Newcastle content to sit back and protect their lead by keeping the ball as long as possible. This might have been a mistake had Cardiff managed to put anything together going forward, but their attacks were generally disjointed and lacked penetration. Chopra was the home side's only real threat, but he didn't show much in the second half. Newcastle's policy of playing a single striker – the lone Ranger – paid off in this phase. Although he rarely threatened goal himself, he provided an extremely useful channel through which his defence could clear the ball. Alan Smith (captain for the day) played just in front of the back four in a 4-5-1 formation and showed good skill as well as determination.

Cardiff threatened a few times – including a shout for a penalty for handball that was rightly turned down by the official – but didn't really look like getting an equaliser until, in stoppage time, a slip by Coloccini led to a foul by Smith. His second yellow card got him sent off and also left Cardiff with a free kick in a dangerous position just outside the Newcastle penalty area. Nothing came of it, however, and shortly afterwards the referee blew the final whistle. Cardiff's use of free kicks and other set pieces was very poor throughout the game, in fact.

I’m biased of course but I think Newcastle thoroughly deserved to win. Nolan, Barton, and Smith were much more composed in midfield than their opposite numbers and Harper, who didn’t have that much to do, looked very solid in goal. Missing Ameobi up front through injury they picked a less adventurous side than perhaps they would have done for a home game.

There weren’t many shots on goal at either end and the only goal came from a set piece, but the game was played at a good tempo and was very enjoyable to watch.

I’d like to mention that the Newcastle fans in the far corner to our right at one point started singing “there’s only one Bobby Robson” in honour of the recently deceased legendary Newcastle and England manager. Cardiff fans all round the ground responded spontaneously with respectful applause. Good stuff.

A beautiful sunny day, a big crowd (25,000+), an excellent game, played in a good sporting atmosphere, and of course the right result. What more could you want? Actually, a few more beers down in Cardiff Bay which we had too.

Newcastle United now have 16 points from 6 games and remain unbeaten at the top of the Championship. Cardiff City slip back from 4th place to 8th.

Back Early…

Posted in The Universe and Stuff with tags , , , , , on September 11, 2009 by telescoper

As a very quick postscript to my previous post about the amazing performance of Hubble's spanking new camera, let me just draw attention to a fresh paper on the arXiv by Rychard Bouwens and collaborators, which discusses the detection of galaxies with redshifts around 8 in the Hubble Ultra Deep Field (shown below in an earlier image), using WFC3/IR observations that reveal galaxies fainter than the previous detection limits.

Amazing. I remember the days when a redshift z=0.5 was a big deal!

To put this in context and to give some idea of its importance, remember that the redshift z is defined in such a way that 1+z is the factor by which the wavelength of light is stretched out by the expansion of the Universe. Thus, a photon from a galaxy at redshift 8 started out on its journey towards us (or, rather, the Hubble Space Telescope) when the Universe was compressed in all directions relative to its present size by a factor of 9. The average density of stuff then was a factor 9³ = 729 larger, so the Universe was a much more crowded place then than it is now.
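As a quick sanity check of that arithmetic (a sketch of my own, not from the paper), the stretch and density factors follow directly from the definition of redshift:

```python
def stretch_factor(z):
    """Factor by which wavelengths are stretched between emission at
    redshift z and observation today: 1 + z."""
    return 1 + z

def density_factor(z):
    """Lengths were smaller by (1 + z), so volumes were smaller by
    (1 + z)**3 and the mean density larger by the same factor."""
    return (1 + z) ** 3

z = 8
print(stretch_factor(z))   # 9
print(density_factor(z))   # 729
```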

Translating the redshift into a time is trickier because it requires us to know how the expansion rate of the Universe varies with cosmic epoch. This requires solving the equations of a cosmological model or, more realistically for a Friday afternoon, plugging the numbers into Ned Wright's famous cosmology calculator.

Using the best-estimate parameters for the current concordance cosmology reveals that at redshift 8, the Universe was only about 0.65 billion years old (i.e. light from the distant galaxies seen by HST set out only 650 million years after the Big Bang). Since the current age of the Universe is about 13.7 billion years (according to the same model), this means that the light Hubble detected set out on its journey towards us an astonishing 13 billion years ago.
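What the calculator is essentially doing is evaluating the integral t(z) = (1/H₀) ∫ dz′/[(1+z′)E(z′)] from z to infinity for the chosen model. As an illustration (my own sketch, not from the post; the values H₀ = 71 km/s/Mpc, Ωm = 0.27, ΩΛ = 0.73 are assumed concordance-era parameters, not necessarily exactly those used above), a few lines of Python reproduce the numbers:

```python
from scipy.integrate import quad

# Assumed flat concordance-model parameters (illustrative values only).
H0 = 71.0            # Hubble constant in km/s/Mpc
Om, OL = 0.27, 0.73  # matter and dark-energy density parameters

# Convert H0 to inverse gigayears (1 Mpc = 3.0857e19 km, 1 Gyr = 3.156e16 s).
H0_per_Gyr = H0 * 3.156e16 / 3.0857e19

def age_at_redshift(z):
    """Cosmic time in Gyr from the Big Bang to redshift z:
    t(z) = (1/H0) * integral from z to infinity of dz' / [(1+z') E(z')],
    with E(z) = sqrt(Om*(1+z)^3 + OL) for a flat model, radiation neglected."""
    E = lambda zp: (Om * (1 + zp) ** 3 + OL) ** 0.5
    integral, _ = quad(lambda zp: 1.0 / ((1 + zp) * E(zp)), z, 1e4)
    return integral / H0_per_Gyr

print(f"Age at z = 8: {age_at_redshift(8):.2f} Gyr")  # about 0.65
print(f"Age at z = 0: {age_at_redshift(0):.2f} Gyr")  # about 13.7
```

The difference between the two, about 13 billion years, is the light travel time quoted above.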

More importantly for theories of galaxy formation and evolution, this means that at least some galaxies must have formed very early on, relatively speaking: within the first 5% of the Universe's history to date.

These observations are by no means certain as the redshifts have been determined only approximately using photometric techniques rather than the more accurate spectroscopic methods, but if they’re correct they could be extremely important.

At the very least they provide even stronger motivation for getting on with the next-generation space telescope, JWST.

Atlantes

Posted in Science Politics, The Universe and Stuff with tags , , , , , , on September 10, 2009 by telescoper

I've just noticed a post on another blog about the meeting of the Herschel ATLAS consortium that's going on in Cardiff at the moment, so I thought I'd do a quickie here too. Actually, I've only just been accepted into the Consortium, so a lot of the goings-on are quite new to me.

The Herschel ATLAS (or H-ATLAS for short) is the largest open-time key project involving Herschel. It has been awarded 600 hours of observing time to survey 550 square degrees of sky in 5 wavelength bands: 110, 170, 250, 350 & 500 microns. It is hoped to detect approximately 250,000 galaxies, most of them in the nearby Universe, but some will undoubtedly turn out to be very distant, with redshifts of 3 to 4; these are likely to be very interesting for studies of galaxy evolution.

Herschel is currently in its performance verification (PV) phase, following which there will be a period of science validation (SV). During the latter, the ATLAS team will have access to some observational data and can take a quick look to check that everything is behaving as anticipated. It is planned to publish a special issue of the journal Astronomy & Astrophysics next year containing key results from the SV phase, although in the case of ATLAS many of these will probably be quite preliminary, because only a small part of the survey area will be sampled during the SV time.

Herschel seems to be doing fine, with the possible exception of the HIFI instrument which is currently switched off owing to a fault in its power supply. There is a backup, but the ESA boffins don’t want to switch it back on and risk further complications until they know why it failed in the first place. The problem with HIFI has led to some rejigging of the schedule for calibrating and testing the other two instruments (SPIRE and PACS) but both of these are otherwise doing well.

The data for H-ATLAS proper haven't started arriving yet, so the meeting here in Cardiff was intended to organise the preparations, plan who's going to do what, and settle some organisational issues. With well over a hundred members, this project has to think seriously about quite a lot of administrative and logistical matters.

One of the things that struck me as particularly difficult is the issue of authorship of science papers. In observational astronomy and cosmology we're now getting used to the situation that has prevailed in experimental particle physics for some time, namely that even short papers have author lists running into the hundreds. Theorists like me usually work in teams too, but our author lists are, generally speaking, much shorter. In fact I don't yet have any publications with more than six or seven authors; mine are often just by me and a PhD student or postdoc.

In a big consortium, the big issue is not so much who to include, but how to give appropriate credit to the different levels of contribution. Those senior scientists who organized and managed the survey are clearly key to its success, but so also are those who work at the coalface and are probably much more junior. In between there are individuals who supply bits and pieces of specialist software or extra comparison data. Nobody can pretend that everyone in a list of 100 authors has made an identical contribution, but how can you measure the differences and how can you indicate them on a publication? Or  shouldn’t you try?

Some suggest that author lists should always be alphabetical, which is fine if you're "Aarseth" but not if you're "Zel'dovich". This policy would, however, benefit "al", a prolific collaborator who never seems to make it as first author.

When astronomers write grant applications for STFC, one of the pieces of information they have to include is a table summarising their publication statistics. The total number of papers written has to be given, as well as the number in which the applicant is the first author on the list, the implicit assumption being that first authors did more work than the others, or that first authors were "leading" the work in some sense.

Since I have a permanent job and students and postdocs don't, I always make junior collaborators first author by default, and only vary that policy if there is a specific reason to do so. In most cases they have done the lion's share of the actual work anyway, but even if this is not the case, it is important for them to have first-author papers, given the widespread presumption that this is a good thing to have on a CV.

With more than 100 authors, and a large number of  collaborators vying for position, the chances are that junior people will just get buried somewhere down the author list unless there is an active policy to protect their interests.

Of course everyone making a significant contribution to a discovery has to be credited, and the metric that has been used for many years to measure scientific productivity is the number of authored publications, but it does seem to me that this system must have reached breaking point when author lists run to several pages!

It was all a lot easier in the good old days when there was no data…

PS. Atlas was a Titan who was forced to hold up the sky on his shoulders for all eternity. I hope this isn't expected of members of the ATLAS consortium, none of whom are Titans anyway (as far as I can tell). The plural of Atlas is Atlantes, by the way.