The rise and fall of the opinion poll

Election polling has had some high-profile disasters in recent years. Sometimes the problem is the ideology of the pollsters. In the 2012 US presidential election, Republican strategist Karl Rove, appearing on the Fox News election night special, claimed that Mitt Romney would become the 45th president of the USA in a landslide. He got it horribly wrong – the election wasn't even close. President Barack Obama was elected to a second term, winning the national popular vote 51.1% to 47.2% and the Electoral College 332 to 206. Rove and fellow Fox pundit Dick Morris had been persuaded by the numbers their pollsters had been giving them for weeks, showing a clear Romney lead. Those numbers were wrong, and they remained wrong right up until election night. Polls in Israel seriously underestimated Prime Minister Benjamin Netanyahu's strength; he is quoted as saying: "I always lose the election in the polls, and I always win it on election day." But it is the U.K. example that has been the most flagrant in recent years: pollsters predicted a close election, only to see the Conservatives win easily. Polls are now as untrustworthy as the politicians whose popularity they measure. How did it come to this?

The first opinion poll was carried out in Delaware on July 24th 1824. It was conducted to predict the result of the US presidential election contest between Andrew Jackson and John Quincy Adams, and it predicted a Jackson win. It was wrong. The man most associated with its history is George Gallup, whose company, founded in 1935, is still going today. Gallup actually wanted to work in journalism; his goal was to become a newspaper editor. But when he was at university there was no journalism course, so he studied psychology instead, eventually obtaining a PhD. He then set about measuring American public opinion. Initially he was not interested in elections, but he faced a significant credibility gap: how could he show that his surveys were accurate? He needed an activity where his results would be tested against reality. So he started asking people their voting intentions in order to predict elections. This proved extremely popular – people just loved the predictions. What began as a stunt to promote his real business simply took off.

Gallup's great triumph came in the 1936 U.S. presidential election. With a sample of 50,000 voters, Gallup's organisation correctly predicted Franklin Roosevelt's victory over Alf Landon. This was in contrast with the Literary Digest, whose poll of over two million returned questionnaires predicted that Landon would be the winner. In polling it is quality, not quantity, that counts. Gallup went further: using a sample far smaller than the Digest's, but drawn using the Digest's own methodology, he was also able to correctly predict the (wrong) result the Literary Digest poll itself would produce. The Literary Digest people had made heavy use of telephone directories. The trouble was that the names listed there were not a random sample. They were disproportionately well-off voters, the type of people who hated the New Deal and saw FDR as a class traitor.
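To see why sample quality beats sample size, here is a toy simulation in Python. The electorate size, support levels and degree of skew are invented purely for illustration – they are not the 1936 figures:

```python
import random

random.seed(0)

# Toy electorate of 100,000 voters: overall 60% support candidate A.
# "Wealthy" voters (30% of the electorate) lean heavily the other way,
# mimicking the telephone-directory skew of the Literary Digest poll.
N = 100_000
electorate = []
for _ in range(N):
    wealthy = random.random() < 0.30
    p_a = 0.25 if wealthy else 0.75   # 0.3*0.25 + 0.7*0.75 = 0.60 overall
    electorate.append(("A" if random.random() < p_a else "B", wealthy))

def support_for_a(sample):
    return sum(1 for vote, _ in sample if vote == "A") / len(sample)

# Small but genuinely random sample (Gallup-style).
random_sample = random.sample(electorate, 2_000)

# Ten times larger, but drawn only from the wealthy pool (Digest-style).
wealthy_pool = [v for v in electorate if v[1]]
biased_sample = random.sample(wealthy_pool, 20_000)

print(f"True support for A:     {support_for_a(electorate):.1%}")
print(f"Random sample (2,000):  {support_for_a(random_sample):.1%}")
print(f"Biased sample (20,000): {support_for_a(biased_sample):.1%}")
```

The small random sample lands within a point or two of the truth, while the far larger biased sample is off by roughly 35 points – no amount of extra questionnaires can fix a skewed sampling frame.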

When public opinion survey research started in the 1930s, the response rate was well above 90%. People were less jaded and cynical about the political process; responding felt like a civic duty, and there was a novelty factor – someone would come to your door to talk about politics for 45 minutes. Now, in an age of greater cynicism, information overload, the decline of landlines, robocalls and the rise of mass marketing, the figure is around 10%.

This has important implications for polling accuracy. The characteristics of those who agree to be interviewed may be markedly different from those who refuse. Some households use mobile phones only and have no landline; these disproportionately include minorities and younger voters, and are more common in metropolitan areas. These are just a couple of examples of the biases that can skew results. Another factor is known as response bias: responses do not always reflect what voters actually think. This may be down to unscrupulous pollsters trying to produce a certain result, but it might also be respondents deliberately lying – either to game the system, or because they feel embarrassed to admit who they are voting for. In U.K. elections we talk about the Shy Tory Factor, in which Conservative supporters are reluctant to reveal their preference for the Nasty Party. The wording and order of the questions is also vital, as was satirised in the Yes Prime Minister video.
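A quick sketch of how response bias distorts a poll: suppose a fixed fraction of one party's supporters simply won't admit their preference to an interviewer. The shares and refusal rate below are invented to illustrate the mechanism, not measured values:

```python
import random

random.seed(1)

def simulate_poll(n=2_000, true_share=0.45, shy_rate=0.20):
    """Poll n voters; a 'shy' supporter is counted as not supporting."""
    reported = 0
    for _ in range(n):
        is_supporter = random.random() < true_share
        # A shy supporter hides their preference from the interviewer.
        if is_supporter and random.random() >= shy_rate:
            reported += 1
    return reported / n

# With a 45% true share and 20% shyness, the poll reports roughly
# 0.45 * 0.80 = 36% – a nine-point undercount from hidden preferences alone.
print(f"Poll says: {simulate_poll():.1%}")
```

The error here is systematic, not random, so taking a bigger sample does nothing to shrink it.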

Pollsters now seem to be facing a crisis. As I pointed out in the introduction, they have had a number of cock-ups in recent years, and if they keep making wrong calls, the public will stop taking notice. They are also competing with data analysts and the growth of Big Data. The star prognosticator of the 2012 U.S. presidential election was Nate Silver. Curiously, he never conducted a single poll himself. Instead, he aggregated the data from existing polls and other sources and put it through his own statistical analysis to come up with his predictions. Silver and his team must have done something right; they correctly predicted the winner of all 50 states and the District of Columbia. Is this the end for traditional pollsters? We shall see. But I think opinion polls, love them or hate them, will continue to play a massive role in our elections.
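Silver's actual models are far more sophisticated, but the core idea of aggregation can be sketched as a weighted average across polls, with each poll weighted by its sample size. The polls below are entirely hypothetical:

```python
# Hypothetical polls for one candidate: (reported share, sample size).
polls = [(0.52, 800), (0.49, 1200), (0.51, 600), (0.50, 1000)]

def aggregate(polls):
    """Weight each poll's share by its sample size, then average."""
    total_n = sum(n for _, n in polls)
    return sum(share * n for share, n in polls) / total_n

print(f"Aggregated estimate: {aggregate(polls):.1%}")
```

Because polling errors are partly independent, pooling many polls this way tends to give a steadier estimate than any single poll – though it cannot cancel a bias shared by all of them.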


