Let’s talk about polling – how does it work?

Using an overseas polling image to ensure we don’t confuse any NZ graphic with current results.

After the presidential election in the United States last year, trust in the polling profession was rattled, to say the least.

People were asking whether political polling was fundamentally broken. While there’s mounting evidence to show the pre-election errors in 2016 were understandable and even correctable, New Zealand pollsters remain confident such an off-the-mark result is unlikely to happen here.

Director of UMR Research, Stephen Mills, has worked as a pollster in more than 50 election campaigns in Australia and New Zealand. He says despite accuracy issues overseas, there has “never (yet) been a systemic polling failure in New Zealand or Australian elections”.

But ahead of September’s general election, plenty of Kiwis say they don’t believe in polls. Could that be because they don’t understand how they work? To be fair, it’s a mysterious process.

The process is fairly well understood. It’s the interpretation of the results that the public struggles with.


Currently, there are only three public political polls: Roy Morgan, Colmar Brunton, and Reid Research. The Stuff Poll of Polls, developed with help from Massey University’s Professor Malcolm Wright, provides an average of the most recent polls from each of these, with some time-weighting if polls start to get out of date. Parties also carry out internal polls, but the results of those aren’t public — although they are occasionally leaked.
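The exact weighting scheme isn’t spelled out, but a recency-weighted average along these lines captures the idea. The half-life and the poll figures below are illustrative assumptions, not the Poll of Polls’ actual methodology.

```python
def poll_of_polls(polls, half_life_days=14.0):
    """Recency-weighted average of poll results for one party.

    polls: list of (age_in_days, support_percent) tuples, one per
    polling company's most recent poll. Older polls get exponentially
    less weight; the 14-day half-life is an illustrative assumption.
    """
    weights = [0.5 ** (age / half_life_days) for age, _ in polls]
    weighted_sum = sum(w * pct for w, (_, pct) in zip(weights, polls))
    return weighted_sum / sum(weights)

# Hypothetical recent results for one party from three pollsters:
recent = [(2, 43.0), (9, 41.5), (20, 44.0)]  # (days old, per cent)
print(round(poll_of_polls(recent), 1))
```

The freshest poll dominates, but a stale poll still contributes a little, which smooths out one-off noisy results.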

UMR polls for Labour; Curia polls for National and for INCITE:Politics. Then there are some other fringe pollsters. Horizon has in the past come up with some amazingly good news for parties like the Conservatives, but those results never panned out in reality. And as we saw the other day, some fake polling is done by insiders hiding behind a company name who then release the ‘results’ to try to influence a news cycle.


There are three elements that are critical to getting the polls right, Wright says. The first is just the ability to get a good estimate of what a group of people think, by using a sample. Those samples aren’t perfect, and that’s where a margin of error comes in (more on that later). The second element is what statisticians call “non-sampling error”.

“That’s all the things other than the size of the sample you’ve drawn,” Wright says. “There are biases such as landlines versus mobile phones, the way you have worded the question, the fact you’re doing it online, everything that goes into designing a poll that might affect who you’re targeting, who you’re reaching from that population, their willingness to respond, and so on. The great thing about the Poll of Polls is that it evens out some of those non-sampling errors.”

The third element is turnout, which is partly to blame for recent polling failures in the United Kingdom and the US. This isn’t such a problem in New Zealand, however, where turnout is relatively high — around the mid- to high-70s.

“The real factor to watch this election will be how many young people turn up to vote.”

This is why Facebook likes and Twitter followers don’t translate to actual votes on the day. The same couch-warriors who will commit both kidneys to the cause won’t actually be motivated to go stand in a line in a dank school hall.

This is why the Green Party generally polls higher than it performs on election day. Lots of good intentions, but when it comes down to it, people either don’t vote or they come to their senses and go with a party that will at least not burn the economy down to get a drop of fresh water.


Will that upset the polls? Possibly. Younger voters traditionally have a lower turnout on the day, so while a higher turnout could make the result harder to poll, it could also make the electorate more representative of the general population, and therefore bring the result closer to the polls. As well as a relatively high overall turnout, New Zealand pollsters have the advantage of a voting system, the nationwide party vote, which closely mirrors what a poll measures.

“Whereas in the US, for example, knowing the popular vote doesn’t help a big deal. You’ve got to know those key states,” Shoebridge says.

Now the biggie, and the bit most often abused and misunderstood: the margin of error.


The margin of error is the pollsters’ way of conveying the level of uncertainty in a poll result. When they give you a result of Labour on 33 per cent, for example, what they are really saying is: we are very confident that the actual support for Labour is somewhere around that mark. Shoebridge says there are a lot of misconceptions about the margin of error.

“If there’s a margin of error of 3.1 per cent, in 19 out of 20 cases … if we went out and surveyed everyone in the population that proportion would be within 3.1 per cent — plus or minus — of what we’re getting within our sample,” he said.
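That “19 out of 20 cases” claim can be sanity-checked with a quick simulation. The sample size and the true level of support below are illustrative assumptions.

```python
import random

def coverage(true_p=0.5, n=1000, trials=5000, moe=0.031):
    """Share of simulated polls whose result lands within
    plus-or-minus moe of the true population support."""
    hits = 0
    for _ in range(trials):
        # Poll n random voters; each supports the party with
        # probability true_p.
        sample_p = sum(random.random() < true_p for _ in range(n)) / n
        if abs(sample_p - true_p) <= moe:
            hits += 1
    return hits / trials

random.seed(1)
print(coverage())  # close to 0.95, i.e. roughly 19 polls in 20
```

Each simulated poll of 1,000 voters wobbles around the true figure, and about 95 per cent of them land within 3.1 points of it, which is exactly what the quoted margin of error promises.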

But that doesn’t mean that every party’s result could be out by up to 3.1 per cent.

“I think it’s often misunderstood. That margin of error (3.1 per cent) applies for parties that are polling around 50 per cent. If a party is polling at around 10 per cent then the margin of error is plus or minus 1.9 per cent. What you’ll hear reported is one of the minor parties is polling within the margin of error. You can’t poll within the margin of error … as the percentage gets smaller the margin of error gets smaller as well.”
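The numbers in that quote follow directly from the standard formula for a 95 per cent confidence interval on a proportion. A quick sketch, assuming a simple random sample of about 1,000 respondents, which is the typical size behind a 3.1 per cent maximum margin:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error (in percentage points) for a party
    polling at proportion p in a simple random sample of size n."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

n = 1000  # assumed sample size, typical for an NZ political poll
print(round(margin_of_error(0.50, n), 1))  # 3.1 - the headline figure
print(round(margin_of_error(0.10, n), 1))  # 1.9 - a minor party on 10%
print(round(margin_of_error(0.05, n), 1))  # 1.4 - near the 5% threshold
```

The margin peaks at 50 per cent support and shrinks as support falls, which is exactly why a minor party on 2 or 3 per cent can’t hide “within the margin of error” of the headline 3.1 figure.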


A look at the past three elections reveals that the pollsters have, collectively at least, done a pretty good job at predicting the outcome. We looked at support for each of the four main parties in the Stuff Poll of Polls right before election day in 2008, 2011, and 2014, and compared that to the actual result. With a couple of exceptions, the Poll of Polls had the support for each party right to within 1 to 1.5 percentage points.

The biggest error in this group of elections came in 2011, when it over-estimated support for National by about 4 points (51.5 per cent versus 47.3 per cent) and under-estimated support for NZ First by 2.5 points (4 per cent versus 6.6 per cent).

That may be something you’d like to dismiss as minor, but 2.5 points on an actual NZ First result of 6.6 per cent means the polls missed more than a third of the party’s real support. For smaller parties, these 2-3 point variances end up being quite significant. Once again, see the Green Party.
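The small-party point is easy to check: the same absolute miss is a much bigger relative miss at 6.6 per cent than at 47 per cent. The figures are from the 2011 comparison above.

```python
def relative_miss(predicted, actual):
    """Polling miss as a percentage of the actual election result."""
    return 100 * abs(actual - predicted) / actual

# 2011 Poll of Polls prediction vs election-day result:
print(round(relative_miss(51.5, 47.3), 1))  # National: about 8.9%
print(round(relative_miss(4.0, 6.6), 1))    # NZ First: about 39.4%
```

A roughly four-point miss on National was under a tenth of its vote; a 2.6-point miss on NZ First was nearly two-fifths of its vote.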

In the end, polls are useful for trends. Take the resurgence of Labour with the arrival of Jacinda Ardern: 33 per cent. And then the water policy was released. The real problem for Labour is another six weeks of void to fill with something that doesn’t take the gloss off Jacinda’s arrival.

33 per cent is not what Labour will get on election day. Which is why, in the end, there is only one poll that counts: the one held on 23 September.


– Katie Kenny / Andy Fyers, Stuff


As much at home writing editorials as being the subject of them, Cam has won awards, including the Canon Media Award for his work on the Len Brown/Bevan Chuang story. When he’s not creating the news, he tends to be in it, with protagonists using the courts, media and social media to deliver financial as well as death threats.

They say that news is something that someone, somewhere, wants kept quiet. Cam Slater doesn’t do quiet and, as a result, he is a polarising, controversial but highly effective journalist who takes no prisoners.

He is fearless in his pursuit of a story.

Love him or loathe him, you can’t ignore him.