What can doctors learn from pilots and cyclists?
Copying the aviation industry and even professional cycling may help the NHS save lives, writes Matthew Syed.
An estimated 7,500 people die in English hospitals every year because of avoidable mistakes.
One of these was Joshua Titcombe, who passed away at Furness General Hospital a few days after he was born in 2008 because clinicians didn't spot clear warning signs of major distress, including fast breathing, grunting and a mother who had collapsed from infection moments after the birth.
Little Joshua was cocooned in wires and tubes during his short life.
James and Hoa, his parents, would place a finger into his little hand, and he would grip it tightly. But it was already too late.
As life drained from his body, James and Hoa took it in turns to kiss him, before saying goodbye for the last time.
On his headstone, they wrote: "Our little fighter, always remembered".
Each statistic is a tragedy.
A life lost, a family suffering, loved ones bereaved. But each avoidable death is something else, too: a precious learning opportunity.
Many hospital deaths follow patterns - what accident investigators call "signatures".
With open reporting and honest evaluation, new procedures could be put in place to ensure that the same mistake never happens again. But all too often, that's not the case.
Five years before Joshua died, newborn Elleanor Bennett passed away at the same hospital, after staff failed to monitor her heart rate properly.
The consultant told the parents "it was just one of those things" and no independent investigation took place.
When James Titcombe lost his son, he was met with similar evasions. Only because Mr Titcombe campaigned for more information did he find answers.
The Kirkup report of 2015 found that 11 babies and a mother died in a "lethal combination of failures" at every level of the system between 2004 and 2013.
Had the lessons been learned earlier, these lives could have been saved.
Today, a new approach to learning will come into legal effect in the NHS, with the establishment of an independent accident investigation branch.
It is modelled on aviation, an industry which has a very different response to failure.
Lessons from 30,000 feet
When two planes almost collide in mid-air - a near-miss event - both pilots can submit a report.
These reports are statistically analysed to learn the lessons to prevent an accident before it has even happened.
And when a crash does happen, there are two almost indestructible black boxes (data recorders) that can be probed to learn the lessons.
In 1978, for example, a plane crashed in suburban Portland, Oregon, after the pilot became fixated on a malfunctioning light bulb and lost track of dwindling fuel reserves, which ultimately caused the accident.
On the surface, it looked like human error, but the black box investigation found uncanny similarities with other accidents.
It turned out that this was not a problem with the specific pilot, but with the limits of human perception.
New protocols were immediately introduced that addressed this problem.
These were designed to create a more structured division of responsibilities amongst the crew so that they could not become fixated on the wrong problem, and to improve communication.
What happened? Accidents of this kind disappeared overnight.
Ten people died on the Portland flight, but the learning opportunity saved many thousands more.
Today, aviation is arguably the safest form of transportation. Last year the accident rate dropped to a low of just four fatal crashes across a total of 37.6 million flights.
Independent investigation is at the heart of this process. Professionals are given every reason to cooperate, because they know that investigations are not about finding scapegoats, but learning lessons.
Indeed, professionals are given a legal safe space so they speak up honestly, and can be penalised only where negligence or malevolence is proven.
Learn the lesson
Independent investigation and safe space protection are equally vital in healthcare.
Staff must be assured that when mistakes are caused by defective processes, they can speak up without being scapegoated.
Only by combating the "blame culture" in the NHS can transparency and meaningful change take place.
Hospitals that have developed a culture of open reporting have produced outstanding results.
The number of malpractice claims against the University of Illinois Medical Center, for example, fell by half in two years.
Virginia Mason, a hospital in Seattle, has seen a fall in insurance liability premiums of a staggering 74%.
As Gary Kaplan, its chief executive, put it: "We have a system that learns the lessons so that we can turn weaknesses into strengths".
This is about more than healthcare and aviation, however.
Consider, for example, how the success of British sport has been driven by a similar commitment to continual improvement.
In cycling, for example, there is a constant emphasis on discovering weaknesses so they can be turned into strengths.
When Sir Dave Brailsford became head of British Cycling, he was driven by an insatiable curiosity to improve every single process.
So, he tested the bike design in a wind tunnel and tweaked it for a gain in aerodynamic efficiency.
He transported mattresses from stage to stage during the Tour de France for a marginal improvement in sleep quality.
He had the team use antibacterial hand gel to cut down on the risk of infection. The accumulation of these "marginal gains" has turned British Cycling into the envy of the world.
If these sound like trivial issues, consider that, until recently, central line infections caused up to 60,000 deaths a year in the United States.
Only through proper investigation was it discovered that a key factor was clinicians failing to put sterile dressings over the catheter site - the medical equivalent of not using antibacterial hand gel.
The introduction of a five-point checklist - a marginal change - saved 1,500 lives over 18 months in the state of Michigan alone.
As a writer, I have seen the importance of a progressive culture again and again.
Indeed, I wrote a book on the subject and, as a consequence, was invited by the Secretary of State for Health to help draft the proposals that come into force today.
The vast majority of NHS staff are diligent and heroic, and have long been let down by a culture of blame.
Independent investigation is a crucial first step to drive down the avoidable errors that kill 150 people every week.
Matthew Syed is the author of Black Box Thinking: The Surprising Truth About Success