April 2014 was billed as the ‘Billion-ballot’ month, with elections in India, Indonesia, Afghanistan, Algeria, Hungary, Macedonia and Guinea-Bissau. I’m sure that in many, if not all, of these countries there will have been attempts to forecast the outcomes via opinion polling or social media analysis, with varying degrees of accuracy.

In addition, the Economic and Social Research Council (ESRC) is using the September referendum in Scotland to fund research in this field to help gain a better picture of why we do, or don’t, vote and why we vote the way we do. The project could be extended to cover the possible UK EU membership referendum in 2017.

So 2014 is a key year for research methods in this field, and therefore a fitting time to have launched our recent IJMR Call for Papers: Researching voting intentions (PDF). However, UK researchers who have been in this field for a while will recall that in 1992 the UK polls got it seriously wrong.

The situation was bad enough for the efficacy of polling to be called into question, leading to a major investigation by the Market Research Society, commissioned by Adam Phillips, my successor as MRS Chairman. Such was the urgency to discover what had happened that the MRS report was published three months after election day.

This was a critical time for pollsters, and the following year JMRS published a paper by Ivor Crewe, one of the leading psephologists, in which he drew on the MRS report to discuss what went wrong and the lessons for the future.

I’ve selected this paper not simply to remind us that things can go seriously and publicly wrong in the world of market research – and that, unlike commercial research, when the outcome doesn’t match the forecast, the result is there for all to see; the bodies can’t be buried. Whether we like it or not, an election also remains a virtually unique opportunity to see how good our methods are at predicting an outcome.

And again, unlike commercial research, we can see how all the key players have performed – the winners and the losers. In 1992 all were losers. The outcome can therefore affect the brands operating in this sector and their image throughout the research market. A lot is at stake.

What Crewe provides is rather more than that. He gives a detailed description of the context at the time – the campaign strategies of the main parties – as these can have a major bearing on the pollsters’ task. He then discusses in detail all of the key explanations that could have influenced the outcome, not all of which were under the pollsters’ control. In doing so, Crewe provides in-depth insights into the methods used at the time.

Crewe concludes with a discussion of the implications, including the role of polls in influencing voter behaviour. If the polls had accurately predicted what transpired, would that have led to a Labour victory, or more likely a hung parliament and a coalition government (!)? By understating Conservative support, the polls may have increased it by election day.

Crewe claims that ‘Remember 1992’ will not just lead to improved methods; it will also increase voter scepticism and, he concludes, ‘the hypnotic power of opinion polls on the British electoral process has been broken’.

There have been other blunders, before and after 1992, but that campaign remains the major exception in UK polling history, where all key players got it wrong. Up to that point, luck might have played a part, but the lessons from 1992 were taken very seriously, leading to changes in methodology that continue to evolve (see the papers published in IJMR in 2009-11 following our last call for papers on this theme).

However, there are now more players in the market, some with little experience of this field, and the advent of social media is changing how opinions are tracked and the methods used to make predictions. The nature of politics in the UK has also changed. We currently have a coalition government, the might-have-been outcome of 1992, and the rise of UKIP could make the next election a four-party contest.

The 2015 UK campaign will no doubt be a challenging arena for pollsters. Finally, the media have been transformed. Crewe comments that in 1992 ‘Polls also play a role in the media’s politics of survival’ and are a key weapon in the newspaper circulation wars.

The saturation polling seen in the UK was, Crewe believes, due to the exceptional number of national mass-circulation newspapers at that time. Whilst the media remain major commissioners of polls, they now face fierce competition from other sources, especially social media. However, I don’t think the UK population has quite fallen out of love with polling, as Crewe forecast, or the media wouldn’t still be key commissioners.

So, it seems a good time to remind readers of the pitfalls facing the sector, in a paper written by an internationally renowned authority in this field.

Read Ivor Crewe's paper: A nation of liars? Opinion polls and the 1992 Election (published in JMRS Vol 35 No 4, October 1993).

Reflecting again on the ‘Billion-ballot’ month, we would love to feature the situations and experiences in those countries experiencing elections that don’t usually feature in IJMR submissions.

So if you, or someone you know, have been involved in such research, then please read, or pass on, details of our call for papers:

>> IJMR Call for Papers: Researching voting intentions.

How to access the International Journal of Market Research (IJMR)

The journal is published by SAGE, and MRS Certified Members can access it on the SAGE website via this link.

Go to the IJMR website
