Lifting the Margin of Error Safety Blanket
The trend is still your friend
“All changes are within the margin of error”. It’s a frequently heard line, usually trotted out by those sympathetic to a party whose share has just declined, or by those keen to fence-sit. While it may be true (and it usually is: only six of the 600 or so poll-to-poll changes across the four parties in this year’s YouGov series have been more than 3%, for example, and then only by a single point), its attraction is also dangerously deceptive.
In fact, the margin of error offers even more comfort than that: even those 4% movements could easily be nothing more than sampling error (for a poll with a 3% MoE), if they went from, say, 2% above the true position to 2% below it. Add in that the various party scores are related variables – one party’s gain must be another’s loss – and it’s possible for one party’s lead over another to move by four times the quoted MoE yet not be rogue.
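To see how far pure sampling noise can go, here is a minimal simulation sketch in Python (the vote shares and the sample size of 1,000 are illustrative assumptions, not figures from the polls discussed above): it draws pairs of polls from an unchanged electorate and counts how often apparent party moves exceed 3 points and apparent lead moves exceed twice that.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative (assumed) true shares that do NOT change between the two polls
true_shares = {"Con": 0.33, "Lab": 0.38, "LD": 0.09, "UKIP": 0.13}
n = 1_000          # assumed sample size, giving roughly a +/-3 point margin of error
pairs = 100_000    # number of simulated poll pairs

p = np.array(list(true_shares.values()))
p_full = np.append(p, 1 - p.sum())  # treat the remainder as "others"

# Two independent polls drawn from the *same* true position
poll1 = rng.multinomial(n, p_full, size=pairs)[:, :4] / n * 100
poll2 = rng.multinomial(n, p_full, size=pairs)[:, :4] / n * 100

# Apparent poll-to-poll change for each party (percentage points)
change = poll2 - poll1
# Apparent change in the Lab-over-Con lead
lead_change = (poll2[:, 1] - poll2[:, 0]) - (poll1[:, 1] - poll1[:, 0])

print("Party changes exceeding 3 points:", (np.abs(change) > 3).mean().round(3))
print("Lead changes exceeding 6 points: ", (np.abs(lead_change) > 6).mean().round(3))
```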
All of which makes it very easy to explain away disagreeable results as potentially nothing more than sampling error, because they usually might be. However, the truth and the whole truth are very different beasts, which is what makes the explanation both attractive and misleading.
While the statement may of itself be true, it implies more than it actually says: that the changes are unreliable and should be largely ignored. In fact, assuming consistent polling standards, even a 1% move one way or the other is more likely than not to represent a genuine shift (admittedly, not by very much), and probabilities matter. More importantly, because in these days of frequent polling it’s rare for any one poll to register a large change – and when they do, they are often outliers that revert at the next poll – the cumulative effect becomes under-emphasised.
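That “more likely than not” claim can be sanity-checked with a quick simulation under stated assumptions: genuine poll-to-poll shifts drawn with a standard deviation of one point, and sampling noise on the change between two polls of roughly two points (both figures are assumptions for the sketch, not taken from the post).

```python
import numpy as np

rng = np.random.default_rng(0)
sims = 1_000_000

# Assumed prior: genuine poll-to-poll shifts are small, SD of 1 point
true_shift = rng.normal(0.0, 1.0, sims)
# Assumed sampling noise on the change between two ~1,000-sample polls: SD ~2.1 points
observed = true_shift + rng.normal(0.0, 2.1, sims)

# Among simulated poll pairs showing roughly a +1 point move,
# what fraction involved a genuine shift in the same direction?
near_plus_one = (observed > 0.5) & (observed < 1.5)
print((true_shift[near_plus_one] > 0).mean())  # above 0.5 under these assumptions
```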
Yet it’s the cumulative trend which is important: the slow drift upwards or downwards over several weeks that can produce enough of a swing to make a real shift. It’s something I noticed when looking at the polls from this time last parliament. In August 2009, the Conservatives had a typical lead of around 16% over Labour; by March 2010, it was down to mid-single figures, yet there was no step-change moment, just a gradual but steady drift all the way from landslide to hung-parliament territory.
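A toy sketch of that kind of drift (the figures below are invented for illustration, not the actual 2009–10 polling): a lead sliding from 16 points to 6 over thirty noisy weekly readings, where individual poll-to-poll moves are dominated by noise but a simple rolling average recovers the slide.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented true lead drifting from 16 points to 6 over 30 weekly polls
weeks = 30
true_lead = np.linspace(16, 6, weeks)
# Each published lead carries assumed sampling noise of ~2 points SD
observed = true_lead + rng.normal(0.0, 2.0, weeks)

# Individual poll-to-poll moves are dominated by noise...
changes = np.diff(observed)
print("Mean absolute poll-to-poll move:", np.abs(changes).mean().round(1))

# ...but a four-poll rolling average tracks the underlying slide
rolling = np.convolve(observed, np.ones(4) / 4, mode="valid")
print("Rolling average, start vs end:", rolling[0].round(1), rolling[-1].round(1))
```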
So how do we sift the statistical wheat from the chaff? Real changes in voting intention don’t happen in isolation; they’re the reaction to events that should be known to an interested observer, so with knowledge of the one, the other can be anticipated with good judgement. Essentially, the headline figures are more corroboration than prediction; the diamonds are in the detail. On the other hand, headline movements that don’t accord with underlying opinions and perceptions or with real-world events are usually at best soft or at worst just noise.
David Herdson
Update – on a similar theme, the changes in the Scottish referendum polling are significant. Not only is the movement easily explained by a real event, but the figures further down the survey suggest Yes will have great difficulty pulling things round. While Alistair Darling’s performance had a rating of +32 in terms of knowledgeable/uninformed, Alex Salmond’s was -4. Similarly, Darling’s rating on honest/dishonest was +4 against -7 for Salmond. Once voters decide you don’t know what you’re talking about and are effectively just blustering, you’re in trouble. With over three-quarters of voters who expressed an opinion wanting to keep a link to Sterling in some form, expect No to keep pounding Salmond on the currency question, which will see it through.