Archive for Sherlock Holmes

The Return of the Inductive Detective

Posted in Bad Statistics, Literature, The Universe and Stuff on August 23, 2012 by telescoper

A few days ago an article appeared on the BBC website that discussed the enduring appeal of Sherlock Holmes and related this to the processes involved in solving puzzles. That piece makes a number of points I’ve made before, so I thought I’d update and recycle my previous post on that theme. The main reason for doing so is that it gives me yet another chance to pay homage to the brilliant Jeremy Brett who, in my opinion, is unsurpassed in the role of Sherlock Holmes. It also allows me to return to a philosophical theme I visited earlier this week.

One of the things that fascinates me about detective stories (of which I am an avid reader) is how often they use the word “deduction” to describe the logical methods involved in solving a crime. As a matter of fact, what Holmes generally uses is not really deduction at all, but inference (a process which is predominantly inductive).

In deductive reasoning, one tries to tease out the logical consequences of a premise; the resulting conclusions are, generally speaking, more specific than the premise. “If these are the general rules, what are the consequences for this particular situation?” is the kind of question one can answer using deduction.

The kind of reasoning Holmes employs, however, is essentially the opposite of this. The question being answered is of the form: “From a particular set of observations, what can we infer about the more general circumstances relating to them?”.

For a dramatic illustration of the process of inference, watch the great Jeremy Brett act it out in the first four minutes or so of this clip from the classic Granada TV adaptation of The Hound of the Baskervilles:

I think it’s pretty clear in this case that what’s going on here is a process of inference (i.e. inductive rather than deductive reasoning). It’s also pretty clear, at least to me, that Jeremy Brett’s acting in that scene is utterly superb.

I’m probably labouring the distinction between induction and deduction, but the main purpose of doing so is that a great deal of science is fundamentally inferential and, as a consequence, it entails dealing with inferences (or guesses or conjectures) that are inherently uncertain in their application to real facts. Dealing with these uncertain aspects requires a more general kind of logic than the simple Boolean form employed in deductive reasoning. This side of the scientific method is sadly neglected in most approaches to science education.

In physics, the attitude is usually to establish the rules (“the laws of physics”) as axioms (though perhaps giving some experimental justification). Students are then taught to solve problems which generally involve working out particular consequences of these laws. This is all deductive. I’ve got nothing against this: it is what a great deal of theoretical research in physics is actually like, and it forms an essential part of the training of a physicist.

However, one of the aims of physics – especially fundamental physics – is to try to establish what the laws of nature actually are from observations of particular outcomes. It would be simplistic to say that this is entirely inductive in character. Sometimes deduction plays an important role in scientific discoveries. For example, Albert Einstein deduced his Special Theory of Relativity from the postulate that the speed of light is constant for all observers in uniform relative motion. However, the motivation for this entire chain of reasoning arose from previous studies of electromagnetism, which involved a complicated interplay between experiment and theory that eventually led to Maxwell’s equations. Deduction and induction are both involved at some level, in a kind of dialectical relationship.

The synthesis of the two approaches requires an evaluation of the evidence the data provides concerning the different theories. This evidence is rarely conclusive, so a wider range of logical possibilities than “true” or “false” needs to be accommodated. Fortunately, there is a quantitative and logically rigorous way of doing this. It is called Bayesian probability. In this way of reasoning, the probability (a number between 0 and 1 attached to a hypothesis, model, or anything else that can be described as a logical proposition) represents the extent to which a given set of data supports the given hypothesis. The calculus of probabilities reduces to Boolean algebra only when the probabilities of all the hypotheses involved are either unity (certainly true) or zero (certainly false). In between “true” and “false” there are varying degrees of “uncertain”, represented by a number between 0 and 1, i.e. the probability.
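To make that concrete, here is a minimal sketch of Bayesian updating in Python. Nothing in it comes from a real analysis – the hypotheses and numbers are invented purely for illustration – but it shows the machinery: two rival hypotheses start with equal priors, and each new piece of evidence re-weights them according to how well they predicted it.

```python
# A minimal sketch of Bayesian updating. All numbers are invented
# purely for illustration.

def update(priors, likelihoods):
    """Bayes' theorem: posterior is proportional to prior times likelihood."""
    unnormalised = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnormalised.values())
    return {h: p / total for h, p in unnormalised.items()}

# Start undecided between two rival hypotheses.
beliefs = {"H1": 0.5, "H2": 0.5}

# Each observation is summarised by the probability that each
# hypothesis assigned to it in advance (the likelihoods).
observations = [
    {"H1": 0.8, "H2": 0.3},  # evidence better predicted by H1
    {"H1": 0.7, "H2": 0.4},  # more of the same
]

for likelihoods in observations:
    beliefs = update(beliefs, likelihoods)
    print(beliefs)
```

The probability of the better-supported hypothesis climbs with each observation (from 0.5 to about 0.73 and then 0.82) but never reaches unity, which is precisely the point: data can make a theory more probable without ever proving it outright.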

Overlooking the importance of inductive reasoning has led to numerous pathological developments that have hindered the growth of science. One example is the widespread and remarkably naive devotion that many scientists have towards the philosophy of the anti-inductivist Karl Popper; his doctrine of falsifiability has led to an unhealthy neglect of  an essential fact of probabilistic reasoning, namely that data can make theories more probable. More generally, the rise of the empiricist philosophical tradition that stems from David Hume (another anti-inductivist) spawned the frequentist conception of probability, with its regrettable legacy of confusion and irrationality.

In fact Sherlock Holmes himself explicitly recognizes the importance of inference and rejects the one-sided doctrine of falsification. Here he is in The Adventure of the Cardboard Box:

Let me run over the principal steps. We approached the case, you remember, with an absolutely blank mind, which is always an advantage. We had formed no theories. We were simply there to observe and to draw inferences from our observations. What did we see first? A very placid and respectable lady, who seemed quite innocent of any secret, and a portrait which showed me that she had two younger sisters. It instantly flashed across my mind that the box might have been meant for one of these. I set the idea aside as one which could be disproved or confirmed at our leisure.

My own field of cosmology provides the largest-scale illustration of this process in action. Theorists make postulates about the contents of the Universe and the laws that describe it, and try to calculate what measurable consequences their ideas might have. Observers make measurements as best they can, but these are inevitably restricted in number and accuracy by technical considerations. Over the years, theoretical cosmologists deductively explored the possible ways Einstein’s General Theory of Relativity could be applied to the cosmos at large. Eventually a family of theoretical models was constructed, each of which could, in principle, describe a universe with the same basic properties as ours. But determining which, if any, of these models applied to the real thing required more detailed data. For example, observations of the properties of individual galaxies led to the inferred presence of cosmologically important quantities of dark matter. Inference also played a key role in establishing the existence of dark energy as a major part of the overall energy budget of the Universe. The result is that we have now arrived at a standard model of cosmology which accounts pretty well for most relevant data.

Nothing is certain, of course, and this model may well turn out to be flawed in important ways. All the best detective stories have twists in which the favoured theory turns out to be wrong. But although the puzzle isn’t exactly solved, we’ve got good reasons for thinking we’re nearer to at least some of the answers than we were 20 years ago.

I think Sherlock Holmes would have approved.

Holmes for the Bewildered

Posted in Literature, Television on January 9, 2012 by telescoper

Being back to work full-time, now that the new teaching term has started, I find myself in a position to do a quick lunchtime blog post while I eat my sandwich. I was going to blog about this topic last week, but thought I’d wait a week in case anything happened to change my negative opinion on this issue. I’m aware that I’m in a small minority and didn’t want to expose myself to public disapproval without due care and attention. Well, last night my opinion certainly changed, only it got even more negative. So now I’m going to take a deep breath, gird my loins, and state for the record my honestly-held opinion that the new BBC TV series Sherlock is complete and utter tripe.

It’s not that I object to the idea of placing Sir Arthur Conan Doyle’s great stories in a contemporary setting. Not at all. Sherlock Holmes is one of the most memorable creations in all of fiction and the plots – at least most of them – are so well constructed that the stories should be translatable into a contemporary setting quite easily. There have been so many “traditional” versions of Sherlock Holmes that I welcome the attempt to do something different with the character.

Neither is it that I object to Sherlock Holmes being played for laughs. The character does indeed possess a great deal of comic potential, which a number of other interpretations have exploited with a greater or lesser degree of success.

What has happened in this series, however, is that the original plots have been butchered to the point where they make no sense at all. Instead we just have a series of thinly related comedy sketches, with only feeble attempts to link them to a viable mystery story, like a duff combination of the worst bits of Jonathan Creek and The Fast Show.

Last night’s puerile Hound of the Baskervilles was especially dire in this respect. The original story – a full-length novel rather than a short story – is a genuinely intriguing mystery-thriller, laced with undertones of the supernatural, and full of memorable characters, including of course the fearsome Hound itself.

Forced to squeeze it into one hour, the producers of last night’s version of this classic tale abandoned most of the original plot and introduced a load of silly nonsense about werewolves and hallucinogenic fog and the CIA. The Holmes-Watson double-act was quite amusing – and some of the dialogue very witty – but the plot was so thin it just reminded me of Abbott and Costello meet the Wolfman and other such films I watched when I was a kid. I thought the first episode – A Scandal in Belgravia – was bad enough, but last night’s episode was truly excruciating. I won’t be watching any more.

It’s a mystery to me why so many people seem to think this tosh is so good, but then I’m used to being in a minority of one. Perhaps if you watch a lot of TV your expectations are lowered so much by the constant stream of drivel that anything that even tries to be original – which Sherlock admittedly does – sends you into raptures?

No, dear critics, I don’t think Sherlock is “great TV” at all. In fact I think it’s dreadful.

There. I’ve said it.

The Inductive Detective

Posted in Bad Statistics, Literature, The Universe and Stuff on September 4, 2009 by telescoper

I was watching an old episode of Sherlock Holmes last night – from the classic Granada TV series featuring Jeremy Brett’s brilliant (and splendidly camp) portrayal of the eponymous detective. One of the things that fascinates me about these and other detective stories is how often they use the word “deduction” to describe the logical methods involved in solving a crime.

As a matter of fact, what Holmes generally uses is not really deduction at all, but inference (a process which is predominantly inductive).

In deductive reasoning, one tries to tease out the logical consequences of a premise; the resulting conclusions are, generally speaking, more specific than the premise. “If these are the general rules, what are the consequences for this particular situation?” is the kind of question one can answer using deduction.

The kind of reasoning Holmes employs, however, is essentially the opposite of this. The question being answered is of the form: “From a particular set of observations, what can we infer about the more general circumstances relating to them?”. The following example from A Study in Scarlet is exactly of this type:

From a drop of water a logician could infer the possibility of an Atlantic or a Niagara without having seen or heard of one or the other.

The word “possibility” makes it clear that no certainty is attached to the actual existence of either the Atlantic or Niagara, but the implication is that observations of (and perhaps experiments on) a single drop of water could allow one to infer enough of the general properties of water to deduce the possible existence of other phenomena. The fundamental process is inductive rather than deductive, although deductions do play a role once general rules have been established.

In the example quoted there is an inductive step between the water drop and the general physical and chemical properties of water, and then a deductive step that shows that these laws could describe the Atlantic Ocean. Deduction involves going from theoretical axioms to observations, whereas induction is the reverse process.
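A toy example may help fix the two directions. In the following sketch (an invented illustration, with Hooke’s law standing in for the “general rule”), deduction uses a known law to predict a particular observation, while induction estimates the law from a handful of particular observations:

```python
# Deduction versus induction, with Hooke's law F = k * x as the
# general rule. All numbers are invented for illustration.

# Deduction: the rule and its constant are given; predict a
# particular observation from them.
k = 2.5                            # spring constant, assumed known
predicted_force = k * 0.4          # specific consequence of the general law

# Induction: only particular observations are given; estimate the
# constant of the general rule from them (least squares through the origin).
observations = [(0.1, 0.26), (0.2, 0.49), (0.3, 0.77)]   # (x, F) pairs
inferred_k = (sum(x * f for x, f in observations)
              / sum(x * x for x, _ in observations))

print(f"deduced force: {predicted_force:.2f}")
print(f"inferred k:    {inferred_k:.2f}")   # an estimate, never a certainty
```

The deduced force follows with certainty once the law is granted; the inferred constant is only ever an estimate, which is why the inductive step needs probabilistic treatment.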

I’m probably labouring this distinction, but the main point of doing so is that a great deal of science is fundamentally inferential and, as a consequence, it entails dealing with inferences (or guesses or conjectures) that are inherently uncertain in their application to real facts. Dealing with these uncertain aspects requires a more general kind of logic than the simple Boolean form employed in deductive reasoning. This side of the scientific method is sadly neglected in most approaches to science education.

In physics, the attitude is usually to establish the rules (“the laws of physics”) as axioms (though perhaps giving some experimental justification). Students are then taught to solve problems which generally involve working out particular consequences of these laws. This is all deductive. I’ve got nothing against this: it is what a great deal of theoretical research in physics is actually like, and it forms an essential part of the training of a physicist.

However, one of the aims of physics – especially fundamental physics – is to try to establish what the laws of nature actually are from observations of particular outcomes. It would be simplistic to say that this is entirely inductive in character. Sometimes deduction plays an important role in scientific discoveries. For example, Albert Einstein deduced his Special Theory of Relativity from the postulate that the speed of light is constant for all observers in uniform relative motion. However, the motivation for this entire chain of reasoning arose from previous studies of electromagnetism, which involved a complicated interplay between experiment and theory that eventually led to Maxwell’s equations. Deduction and induction are both involved at some level, in a kind of dialectical relationship.

The synthesis of the two approaches requires an evaluation of the evidence the data provides concerning the different theories. This evidence is rarely conclusive, so a wider range of logical possibilities than “true” or “false” needs to be accommodated. Fortunately, there is a quantitative and logically rigorous way of doing this. It is called Bayesian probability. In this way of reasoning, the probability (a number between 0 and 1 attached to a hypothesis, model, or anything else that can be described as a logical proposition) represents the extent to which a given set of data supports the given hypothesis. The calculus of probabilities reduces to Boolean algebra only when the probabilities of all the hypotheses involved are either unity (certainly true) or zero (certainly false). In between “true” and “false” there are varying degrees of “uncertain”, represented by a number between 0 and 1, i.e. the probability.
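As a small check on that last claim – again a sketch in Python, with nothing drawn from any particular source – the following snippet verifies that when every probability is pinned to 0 or 1, the product and sum rules of probability collapse to the ordinary Boolean AND and OR:

```python
# When all probabilities are 0 (certainly false) or 1 (certainly true),
# the calculus of probabilities reduces to Boolean algebra.

from itertools import product

for p_a, p_b in product([0, 1], repeat=2):
    # Product rule for independent propositions: P(A and B) = P(A) * P(B).
    p_and = p_a * p_b
    # Sum rule: P(A or B) = P(A) + P(B) - P(A and B).
    p_or = p_a + p_b - p_and
    assert p_and == int(bool(p_a) and bool(p_b))
    assert p_or == int(bool(p_a) or bool(p_b))
    print(f"P(A)={p_a}, P(B)={p_b}: P(A and B)={p_and}, P(A or B)={p_or}")
```

Everything interesting in scientific inference happens between those two extremes, where Boolean logic alone has nothing to say.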

Overlooking the importance of inductive reasoning has led to numerous pathological developments that have hindered the growth of science. One example is the widespread and remarkably naive devotion that many scientists have towards the philosophy of the anti-inductivist Karl Popper; his doctrine of falsifiability has led to an unhealthy neglect of  an essential fact of probabilistic reasoning, namely that data can make theories more probable. More generally, the rise of the empiricist philosophical tradition that stems from David Hume (another anti-inductivist) spawned the frequentist conception of probability, with its regrettable legacy of confusion and irrationality.

My own field of cosmology provides the largest-scale illustration of this process in action. Theorists make postulates about the contents of the Universe and the laws that describe it, and try to calculate what measurable consequences their ideas might have. Observers make measurements as best they can, but these are inevitably restricted in number and accuracy by technical considerations. Over the years, theoretical cosmologists deductively explored the possible ways Einstein’s General Theory of Relativity could be applied to the cosmos at large. Eventually a family of theoretical models was constructed, each of which could, in principle, describe a universe with the same basic properties as ours. But determining which, if any, of these models applied to the real thing required more detailed data. For example, observations of the properties of individual galaxies led to the inferred presence of cosmologically important quantities of dark matter. Inference also played a key role in establishing the existence of dark energy as a major part of the overall energy budget of the Universe. The result is that we have now arrived at a standard model of cosmology which accounts pretty well for most relevant data.

Nothing is certain, of course, and this model may well turn out to be flawed in important ways. All the best detective stories have twists in which the favoured theory turns out to be wrong. But although the puzzle isn’t exactly solved, we’ve got good reasons for thinking we’re nearer to at least some of the answers than we were 20 years ago.

I think Sherlock Holmes would have approved.
