Nate Silver became an overnight sensation as a polling guru early in the 2008 presidential race, when he contradicted the conventional media interpretation of the contest for the Democratic nomination. The pundits were promoting the inevitability of a Hillary Clinton victory. An early supporter of Barack Obama, Silver shrewdly distinguished between good polls and bad, and anticipated the Obama presidency well before most. He went on to write a widely read book, The Signal and the Noise, about statistical probability and prediction.
What we see in Canada is a desperate effort to figure out where the election race is headed, with endless parsing of the smallest nuance in the latest poll. There is an abundance of statistical and media noise, but little in the way of clear signals about where the overall race is going.
As the Oct. 19 election date approaches, the outcome will likely become clear enough. Meanwhile the media should be patient and allow the public to weigh their choices. They could assist by emphasizing and clarifying policy choices facing the electorate, as well as asking some tough questions about how the polls are conducted.
Instead, we see a headline across the top of an inside page in the Globe and Mail, "In Quebec, Trudeau Aims to Connect", a piece of poll-driven mush (the NDP are slipping/is Trudeau catching up?), while an Employment Insurance announcement from Thomas Mulcair is consigned to a capsule at the bottom of the page under the heading Election Digest. Unemployment and the impact of EI premiums on the economy: is that not an issue?
Meaningless analysis of political psychology compared to a key policy announcement. The editorial choice is clear.
This week we have completely contradictory polls from two of Canada's most respected polling companies. EKOS Research just released a survey giving the Conservatives a clear lead, while the daily Nanos poll releases of the past two days suggest a completely different picture: a continuing tight race with the NDP second and the Conservatives third. A new Forum poll says no to both of those: the Liberals and Conservatives are tied, with the NDP in third. You get the picture.
Polling is having a hard time these days. The media, with news budgets cut by internet competition, won't pay for much of the polling that gets reported (EKOS/iPolitics and Nanos/CTV are exceptions). However, most pollsters pay for and release their own polls simply as a means of promoting their market research businesses, so there is no shortage.
The media help out by reporting every number available, mostly without context, but the polls are suffering from a lack of respondents. Response rates to polls, as Nate Silver puts it, "are dismal these days." The growing public resistance to answering poll questions may partly explain the polling fiascos we have seen in recent years.
A good example was the 2013 B.C. election, where the polls and the media anticipated a significant NDP victory. Instead, a majority went to Premier Christy Clark and her B.C. Liberals. Another part of the explanation may lie in new polling technologies and methods.
The polls were also wrong in this year's U.K. election. As Nate Silver noted, "polls of the U.K. election — most of them conducted online — projected a photo-finish for Parliament instead of a Conservative majority."
Polling technologies and methods ought to be placed under the media microscope; instead, their results are treated as just another poll. One new technique is a variation on the traditional telephone poll: interactive voice response (IVR) surveys, essentially computerized robots that dial your number and ask questions in an electronic voice. This technique is used by Forum and EKOS, among others.
There are several online pollsters (including Leger Marketing, Ipsos and Angus Reid) that conduct surveys via the internet with samples drawn from previously recruited panels, which can number in the hundreds of thousands. The panel is supposed to represent Canada as a whole and act as a substitute for randomly selecting respondents from the whole population. The media should be asking how large the national panel is, and how large the panels are in the provinces and regions reported on in the polls.
Are these smaller panels an adequate substitute for the populations of smaller regions and provinces? What matters in determining seat counts is regional support, so the issue is important. We should also know exactly how these polls are conducted.
What do their reported margins of error mean? Surely all the margin of error tells us is how confident we can be that the sample is representative of the panel, but is that really comparable to a telephone poll that could call any landline or cellphone anywhere in the population? We should also get information and analysis from the media on whether polling companies contact cellphones as well as landlines, and what that means.
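For reference, the textbook margin of error for a genuinely random sample of n respondents, at 95 per cent confidence, is roughly 1.96 × √(p(1−p)/n); with p = 0.5 and n = 1,000 that works out to about plus or minus 3.1 percentage points, 19 times out of 20. Whether that arithmetic means the same thing when respondents are drawn from an opt-in panel rather than at random from the whole population is exactly the question pollsters should be asked to answer.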
Some polling firms are more transparent about their methods than others, but there is nothing to stop pundits and the media from asking pointed questions about this vital source of campaign information. One friend who formerly worked in market research said that in her company there were problems with response rates for the online polls, another issue the media could investigate.
But polls can also be quite accurate. There is randomness to the order of things: it is entirely possible for a poll to be done poorly and still turn out to be accurate. Even hamburger polls have been right. I am not aware of any burger polls this year, and I lament their passing; I like hamburgers. A burger poll at the Pacific National Exhibition in 1972 correctly anticipated the demise of B.C.'s Social Credit dynasty. In the 2003 Ontario provincial election, the Lick's Hamburger Poll outperformed one of the commercial pollsters. Now there is a media polling story I would love to read in 2015.