Can chance make you a killer?
In his regular column, Michael Blastland invites you to try the deadly Go Figure Chance Calculator.
Imagine you are a hospital doctor. Some patients die. But how many is too many before you or your hospital are labelled killers?
We've devised a chance calculator to simulate this scenario. It is set up so that you are innocent of any failing. But bad luck might convict you all the same.
In the real world all kinds of factors make a difference, like surgical skill. But in the calculator, every patient in every hospital has exactly the same chance of dying and every surgeon is equally good. This is to show what chance alone can do, even when the odds are the same all round.
- The calculator (below) shows 100 hospitals each performing 100 operations
- The probability that a patient dies is initially fixed at five in 100
- The government, meanwhile, says death rates 60% worse than the norm are unacceptable (in red)
- So any hospital which has eight deaths or more out of 100 ops - when the expected average is only five - is in trouble.
- We've assigned one hospital to you, with a box around it - it could come out green or red.
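The setup above can be sketched in a few lines of code. This is my own minimal simulation of the same idea, not the calculator's actual implementation; the constant names and the fixed seed are assumptions for illustration.

```python
import random

# Sketch of the calculator's setup (names are my own, not the BBC's):
# 100 hospitals, 100 operations each, an identical 5% death risk for
# every patient, and a "red" threshold 60% above the expected average.
HOSPITALS = 100
OPERATIONS = 100
DEATH_RISK = 0.05
THRESHOLD = 1.6 * (OPERATIONS * DEATH_RISK)  # 60% above 5 expected = 8 deaths

random.seed(1)  # fixed seed so the sketch is reproducible

# Each hospital's death count: 100 independent draws at the same risk.
deaths = [sum(random.random() < DEATH_RISK for _ in range(OPERATIONS))
          for _ in range(HOSPITALS)]
unacceptable = [d for d in deaths if d >= THRESHOLD]

print(f"{len(unacceptable)} of {HOSPITALS} hospitals flagged as unacceptable")
```

Run it a few times with different seeds and, like clicking "recalculate", the number of flagged hospitals jumps around even though every hospital faces identical odds.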
Start the calculator by clicking on the slider itself and see whether your hospital - the boxed one - is safe or deadly. Click "recalculate", in the lower right-hand corner of the module, a few times to see how you and others do.
Here's what happened when I ran the calculator a few times.
First, some hospitals look dodgy, others brilliant. In this example (see image, right), one surgeon or hospital had 14 deaths (that's the red H out on its own beneath the big cluster) - 1,300 per cent higher mortality than the hospitals that had only one. A huge disparity. Mine - boxed - was one of the unacceptables. So sack me.
But remember, these results are pure chance, computer-generated, based on exactly the same risk for every patient. So hospitals are not really good or bad, it's just chance, lucky or unlucky.
That sounds odd. The calculator seems to show fatal incompetence or maybe even - let's speculate what goes through the public mind - murder at one, medical genius at another.
Keep recalculating and sometimes only a few are unacceptable.
The example above left has five "bad" surgeons. Roll the dice again and it comes up with a shocking 20 that failed to meet the standard.
Next, move the slider up to, say, a 12% death rate. This imagines a more dangerous operation. But now there are fewer unacceptables as there tends to be relatively less variation around bigger numbers.
Finally, move the slider right down to a 1% death rate. Now, still using the 60% threshold, a huge number of hospitals are often unacceptable. That's because there tends to be relatively more variation around smaller numbers.
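There is a simple reason behind both slider experiments. For a binomial count with n operations and death probability p, the standard deviation relative to the expected number of deaths is sqrt((1-p)/(n*p)), so relative spread shrinks as either the rate or the volume grows. A hedged sketch (the function name is my own):

```python
import math

# Relative spread of a binomial death count: standard deviation
# divided by the mean, i.e. sqrt(n*p*(1-p)) / (n*p).
def relative_spread(n, p):
    return math.sqrt((1 - p) / (n * p))

# Moving the slider: higher death rates mean relatively less variation.
for p in (0.01, 0.05, 0.12):
    print(f"rate {p:.0%}, 100 ops: spread = {relative_spread(100, p):.0%} of the mean")

# The same applies to volume: more operations, relatively less variation.
for n in (25, 100, 400):
    print(f"rate 5%, {n} ops: spread = {relative_spread(n, 0.05):.0%} of the mean")
```

This is why a 1% death rate with a 60% threshold flags so many hospitals, and why judging an individual surgeon's small caseload is so treacherous.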
The same applies to the number of ops performed. Do more, and the variation tends to be relatively smaller. Do only a few and the results are more likely to be - relatively - all over the shop. The government says it would like to publish results right down to the level of the individual surgeon.

Chance v skill
Note that chance does not mean without cause. Every death has a cause, but sometimes these come together more often in certain places at certain times in ways that have nothing to do with anything we know or that can currently be known about the patient or the surgeon.
Think of it like this...
Think of a bag of 100 balls, five of them red. Pull a red ball from the bag and it means a death. What if you pull 100 balls from the bag, each time putting the ball back?
Your chance - and it is only chance - of pulling a deadly ball is 5 in 100, or one in 20, or 5%. But it's easy now to imagine that you might draw 14 red balls or more in 100 attempts, or none, purely by chance.
This is the same as the simulation in our calculator; every time you run it, it is like imagining that 100 hospitals dip into the bag 100 times each.
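The bag-of-balls draws follow the binomial distribution, so we can put numbers on how often pure chance produces the extreme tallies described above. This arithmetic is my own, not from the column:

```python
from math import comb

# Probability of exactly k deaths in n draws, each with probability p
# of pulling a red (deadly) ball - the binomial formula.
def binom_pmf(k, n=100, p=0.05):
    return comb(n, k) * p**k * (1 - p)**(n - k)

p_zero = binom_pmf(0)                                  # no deaths at all
p_14_plus = sum(binom_pmf(k) for k in range(14, 101))  # 14 deaths or more

print(f"P(0 deaths)    = {p_zero:.4f}")
print(f"P(>=14 deaths) = {p_14_plus:.5f}")
```

Both outcomes are individually rare, but with 100 hospitals each dipping into the bag, rare tallies still turn up from time to time - which is exactly what the calculator shows.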
Does all this mean that every hospital mortality rate is pure chance? Of course not. There can be what's called special-cause variation - in contrast to the common-cause variation in our calculator. Special-cause variation is when something like surgical skill is the real reason for an unusually high death toll. The big problem - bigger than many people think - is how to tell the difference.
So what about a hospital like Mid Staffs, where hundreds are thought to have died unnecessarily? That's an example - probably - of special-cause variation, not chance. There is to be more of this kind of measurement: the government says the effort must shift from setting targets for how much work is done to judging how well it is done. That probably means more emphasis on counting bodies.
In practice, regulators who make calls about standards perform many hard judgments and calculations to work out what's chance and what's not.
But because chance has such a variable effect, they will often be unsure, even about the kind of huge differences we see in the calculator. Anyone thinking raw data is all we need, beware.
Before naming the good or bad, we need to understand the extent to which chance can make the innocent appear dangerous and make heroes of the ordinary.
Note: For a great but techie illustration of how hard it can be to determine whether a pattern of results tells us anything, there is a fascinating exchange on Ben Goldacre's other blog.