
Peter Mouncey Blog

Margins of error - statistics in the media

This was the title of a packed evening seminar held at King's College London on 14 May, jointly run by the RSS, King's College and Ipsos MORI.

Subtitled ‘Public understanding of statistics in an era of big data’, it was one of a series of events celebrating the International Year of Statistics.

The aim of the evening was to explore why statistical literacy and trust in statistics remain relatively low, identify the implications and recommend solutions to improve understanding and trust. It was a great evening, with excellent and eminent speakers, but far too little time was allowed for discussion of the important issues they raised.

I wanted to ask whether one problem was the devaluing of statistics in the media – the ‘a survey says’ culture – or, as Joan Bakewell eloquently put it in a Guardian article some years ago:

'...surveys themselves need checking out. They are not value-neutral, laboratory-style findings. 

'They are often done in the street – in bad weather – by people with clipboards, who would prefer to be doing something else, with passers-by who do their best to avoid eye contact but relent before the supplicant’s ‘It will only take a minute’. So why is this far from exact method used as reliable evidence for claims that can frighten us, threaten us and cajole us – but rarely reassure us?'

This leads to the challenge for the public in detecting ‘the signals in the noise’ – distinguishing the serious statistical messages from the trivia, or worse…

I didn’t get the chance to ask the question, but if I’d read that day’s Guardian more thoroughly before I attended, I certainly would have made sure I did. After getting home I found an article headed ‘Gove’s attack on pupils’ grasp of history relied on PR surveys’. 

A Freedom of Information Act request had identified that our current Education Secretary’s attack on history teaching was based on press releases of polls conducted by UKTV Gold and Premier Inns, Lord Ashcroft and the Sea Cadets!

Gove is quoted in the Mail on Sunday as saying: "Survey after survey has revealed disturbing historical ignorance, with one teenager in five believing Winston Churchill was a fictional character while 58% think Sherlock Holmes was real".

As the Guardian article claims: 'None of the pieces included links to the original research, and none of the articles cited stated whether the research was commissioned by professional polling companies, or met the standards of the British Polling Council' – or, I would think, whether it met best market research practice and the CIPR/MRS/RSS guidelines. Who knows!

At the seminar there was much talk of the need to build trust, apply codes of practice and so on. But if government ministers can make controversial statements with potentially major policy implications based on such shaky statistical foundations, what hope is there for us to build trust in statistics with the wider public, or with key professional groups in society such as teachers?

As Bobby Duffy demonstrated from Ipsos MORI data, 54% of UK adults have no trust at all in politicians, and trust has steadily fallen over the years – as it has for journalists and government ministers.

It is also not surprising, as pointed out by Denise Lievesley, that trust in statistics within the UK is very low compared with other EU countries. We have a mountain to climb!

Comments (6)

  • Dick Stroud | July 13, 2013 5:25 am

    Peter, thanks for this blog posting. Interesting and a little depressing. We seem to be living in a ‘research says’ culture where politicians, the media (especially the BBC), companies and NGOs find it necessary to sprinkle a few ‘facts’, often from unpublished research, into their messages to gain greater impact. Worse still, I suspect that the audience filters these research-based messages to reinforce their own prejudices. I am not sure how we improve the situation other than being sensitive to the problem. This then raises another issue – do we only challenge those ‘research says’ facts that we disagree with?
  • Annie Pettit | July 3, 2013 12:37 pm

    It disappoints me when people point out the inherent problems of access panels. When it comes to obtaining sample for market research purposes, there are NO probability samples anywhere. Focus groups, in-person interviews, communities, access panels, social media research - ALL of these suffer from non-random sampling. Heck, even though it is legally required to answer the census, the response rate is never 100%. People will never comprise a random sample. So the issue is this. Get over it. Understand your sample. Know the strengths and weaknesses. Learn how to generalize appropriately. If you're unsure, say you're unsure. Market researchers aren't gods. Rant over. :)
  • Mike Thompson | July 2, 2013 12:59 pm

    Before we educate everyone else about stats I think we should start at home. The level of knowledge amongst MR professionals is lamentable. However, if we do not tell the truth about how unrepresentative access panels can be, we are not going to get very far with making our profession more aware of even the basics of stats. It appears that most access panel research is carried out with people who belong to around 5 access panels and complete around 6 surveys a week (this comes from an MRS meeting concerning access panels a year or so ago). I would be interested to hear from access panel companies about the implications of this, and also from clients who know this but still use them. I should admit we have to use such panels ourselves because the client asks for them, although I make sure they understand the possible implications.
  • Hilary Burrage | June 20, 2013 10:53 pm

    Cheers Peter. Of course the Census is not the 'only' (!) matter of concern. It's also terrifying that Ministers apparently are knowingly misusing and otherwise abusing statistics, overriding what proper data would lead them to: different conclusions and outcomes. The Economist doesn't usually overstate the case in matters of this sort! Who should those worried about all this turn to? Hilary
  • Peter | June 20, 2013 1:20 pm

    Thanks, Hilary, I completely agree with your view about the Census, a topic I covered in an earlier blog. I also return to the abuse of stats in my next Editorial in IJMR (55/4 in July). 
    Peter Mouncey
  • Hilary Burrage | June 20, 2013 9:08 am

    You will be interested to know that The Economist has also addressed this issue recently. In their words, "... cabinet ministers operate like feudal barons, rarely sharing research or working with colleagues.... Academics who rely on census data, such as Danny Dorling, a demographer at the University of Sheffield, fear that the government is determined to scrap the census no matter what." Why isn't this front page news? We have been warned....
