What went wrong with the polling? More starts to emerge but few answers

In this New Statesman article James Morris, from Labour’s US-based pollster GQR, explains how its approach was different and why it is likely to produce far fewer don’t knows, which, he argues, adds to accuracy. He also notes that I was one of those interviewed.

“The main difference between our polls and the newspaper polls is that we don’t ask the voting intention first. As Politicalbetting.com’s Mike Smithson found out when he accidentally participated in our only telephone poll of the last 4 years, we first ask respondents to think about the country, the economy, their top issues, the parties and the leaders. We think it gets them closer to their ballot box mindset.

This technique delivers a much lower don’t know number – generally half the level found in the public polls. We treat this ‘don’t know’ group differently to most of the public polls, asking them questions about who they are likely to vote for rather than assuming they are likely to vote for whoever they voted for last time. Of course, that requires many more questions and so is more expensive to implement especially for a phone pollster where every minute costs money. If we had run a final poll close to election day, would we have got the Tory margin right? It’s hard to know. But if this explanation is broadly true, it means the drift to online polling remains valid.”

Morris was one of the pollsters featured in last night’s Newsnight examination of why the figures were so adrift. It is well worth watching.

Survation, which chickened out of publishing its final phone poll showing a 6% CON lead, is arguing that the polls weren’t wrong – there was just a very late swing. I think there might be something in that, but none of the other pollsters found that movement in the closing hours last week.

It is hard to reach any conclusions on this. Getting representative samples is clearly an increasing challenge. Are those whom the phone pollsters manage to reach, and who are then ready to participate, really representative of the electorate as a whole? As for the online firms, does the fact that their numbers come from people who have volunteered to be part of a panel make them unrepresentative?

It was odd that after the first debate the online polls found more people in their samples saying they had watched it than the official viewing figures suggested.

As we saw with the polling performances in Israel a few weeks back and at some recent US elections, the challenge is not just a UK one.

Mike Smithson

