Reality Check: Should we give up on election polling?
The presidential election result came as a shock to people who had been looking for guidance from opinion polls. Just like at the 2015 UK general election and the EU referendum, the outcome was different from what most polls had suggested. So should we give up on them altogether?
First, we should look at how badly the polls got it wrong. That's quite complicated because there were hundreds of polls even in the last few weeks of the campaign. We also need to look at both the national polls and those which were conducted in individual states.
On average, they overestimated Hillary Clinton's vote and underestimated Donald Trump's. However, in national polls the error was not particularly large. The final polls from the main news networks suggested a modest Clinton lead.
We don't have the final official vote tally yet but, based on what's been counted so far, Clinton did score a narrow win in the national popular vote. And with many of the votes still to be counted coming from strongly Democrat California, it's plausible she will end up with a lead of more than 1%.
So, the national polls were not all that far off in terms of total votes. What they got spectacularly wrong was the overall outcome. A popular vote lead of 4% - roughly what the final polls pointed to - would almost certainly have delivered the electoral college votes Clinton needed to win the presidency as well.
Also, among the bigger firms there was quite a strong consensus, which made the surprise of the result all the greater.
There were some outliers, such as the LA Times/USC tracking poll, whose final reading put Trump ahead by 3.2 percentage points. But given that Trump lost the popular vote, that overestimated his support - so it wasn't a resounding success for them either.
At the state level, some of the errors were much bigger. None of the polls had suggested that Trump would take Wisconsin - most gave Clinton a comfortable lead. And, while many had suggested Trump would win in Ohio, the scale of his victory there had not been predicted.
But even at state level the extent of the error should not be exaggerated. Final polls in Florida suggested a virtual dead heat - Trump won the state by 1%. In Virginia, Nevada and Colorado the polls were pretty accurate too.
Overall, though, it was not a good showing.
Three times wrong
If getting one election wrong could be considered a misfortune, and getting two wrong looks like carelessness, what does it say about the polling industry that this was the third failure at a major election within a period of 18 months?
The polling inquiry that followed the 2015 UK general election found that the main problem there had been unrepresentative samples. In other words, the people who agreed to take part in polls did not reflect the population as a whole, even if they looked the same in terms of age, gender, and class background.
The case of the EU referendum was a bit different. Online polls throughout the campaign pointed to a very close result, and many of them in the final weeks suggested that Leave was ahead. But most telephone polls pointed to a Remain victory, so the overall picture was unclear.
We don't know yet what went wrong with the US polls, but it's possible that they're suffering from a similar systematic problem to that uncovered after the UK general election. Certainly, phone pollsters in both countries face an increasing challenge in getting people to participate.
Perhaps the reason for the failure doesn't matter though. If the polls keep getting it wrong, why should we bother looking at them at all?
It's the outcome, stupid
It's interesting to compare the recent failures with what happened at the 2014 referendum on Scottish independence. In that case almost all of the polls suggested that No would win and that Scotland would stay in the United Kingdom.
They got the outcome right, but that doesn't mean they were accurate. The final polls conducted by each of the major companies gave No a lead of between 4 and 7 percentage points, suggesting a fairly close contest. In the event, No won by a clear 10.6 points.
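To put the size of that miss in plain numbers, this small sketch simply restates the figures above - final poll leads of 4 to 7 points against an actual margin of 10.6 points - to show how much the polls understated the No lead:

```python
# How far the final Scottish referendum polls were from the result.
final_poll_leads = [4, 7]   # range of "No" leads in the last polls (points)
actual_lead = 10.6          # actual "No" winning margin (points)

# Error = actual margin minus the margin each poll suggested.
errors = [actual_lead - lead for lead in final_poll_leads]
print(f"Polls understated the No lead by {errors[1]:.1f} to {errors[0]:.1f} points")
```

In other words, the polls were off by roughly 4 to 7 points - a miss comparable to, or larger than, the ones that caused such an outcry elsewhere.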
There were some complaints at the time that misleadingly close polls had led the UK government to make more significant promises about devolution than it would otherwise have done. But those complaints were aimed at the handful of "rogue" polls that put Yes in the lead. Very few people suggested there had been a massive polling failure.
What this shows is that if the polls suggest the correct outcome, small or moderate inaccuracies are barely noticed. But when they get the outcome wrong it's considered a disaster even if, numerically speaking, they weren't a million miles away.
That's perfectly natural of course. We're interested in polls because we hope they can help us predict what will happen. When they get the headline result wrong, they make us feel like fools.
There's probably another factor at work too. In both the EU referendum and the US election, the failure of the polls was exacerbated by the fact that the outcome also confounded the common assumption that voters would end up going for the "safe" option - the one that implied continuity, rather than significant change.
Should we give up on polls?
Despite the sequence of failures, it seems unlikely that people will stop commissioning polls. For all their problems, it's hard to think of how else we could get any idea of how people are planning to vote.
Perhaps one lesson we should take away is that the most we should expect of polls is that they're in the right ballpark - even when lots of them agree. So in close elections we should assume that the only thing polls can really tell us is that the election is indeed close - not that one candidate, or one side, is going to win.
It's worth pausing here to reflect on just how close the US result was. If a little under 1% of voters in Pennsylvania and Florida had voted for Clinton instead of Trump - about 100,000 of the 120 million or so who took part nationwide - she, not he, would have won.
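A rough sketch of that arithmetic (the state margins here are approximate certified 2016 results, supplied for illustration rather than taken from this article): a voter who switches from Trump to Clinton moves the margin by two votes, so just over half the winning margin in each state is enough to flip it.

```python
# Approximate 2016 winning margins for Trump (assumed figures, for illustration).
margins = {
    "Pennsylvania": 44_292,
    "Florida": 112_911,
}

# Each switcher changes the margin by 2, so margin // 2 + 1 flips a state.
switchers = {state: margin // 2 + 1 for state, margin in margins.items()}
total = sum(switchers.values())

for state, n in switchers.items():
    print(f"{state}: about {n:,} switchers needed")
print(f"Total: about {total:,} of roughly 120 million votes cast")
```

On these assumed margins the total comes out below 100,000 - a fraction of a percent of the national vote, which is the sense in which the result was extraordinarily close.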
Had that happened there would probably be a lot less talk about a polling disaster.