Discussing the latest landmark paper:

‘Qualitative market research: a conceptual analysis and review of practitioner criteria’, John Colwell (Middlesex Polytechnic), JMRS Vol. 32, No. 1, January 1990

At this year’s MRS annual conference, Impact 2017, IJMR is hosting four sessions, three of which are debates based on papers recently published in the journal. 

One of these sessions is based on Chris Barnham’s paper, published in IJMR Vol. 57, Issue 6, 2015, ‘Quantitative and qualitative research: perceptual foundations’. Barnham argues the case for a new perspective on the role of qualitative research, exploring how consumers view their world. 

The author discusses the distinctions between quantitative and qualitative research: quantitative research is grounded in recognised statistical theory, whereas there is no similarly accepted theoretical foundation underpinning qualitative research. Barnham addresses this absence by proposing a model of perception that provides qualitative research with a firmer theoretical footing. In the conference debate, Nick Gadsby (founder of The Answer) and Kirsty Fuller (CEO of Flamingo) will provide alternative perspectives on this important topic.

However, Barnham’s concerns are far from new in the pages of the MRS journal. Back in 1990, in this Landmark Paper, John Colwell argued that clients faced a dilemma: there were no obvious external criteria for evaluating the quality of qualitative research – what Colwell calls the ‘yardstick problem’. This, he contends, means that incompetent practitioners cannot easily be identified. Colwell explores the origins of qualitative market research, identifying evidence of concerns about the quality of practitioners (the paper contains an extremely detailed and useful list of references covering the evolution and development of qualitative market research). 

In his discussion of what constitutes qualitative research, Colwell compares the stereotypes applied to quantitative and qualitative research – quantitative being ‘scientific, experimentally based, reliable, valid, trustworthy, expensive, but not actually providing much in the way of understanding and guidance’, and qualitative being the opposite. 

However, Colwell argued, ‘the apparent absence of formal structure, and numbers, does not make qualitative research unscientific, and nor for that matter does the presence of formal structure and numbers make quantitative research scientific’, citing a 1977 paper by Calder published in the Journal of Marketing Research. Colwell also discusses the birth of ‘new’ qualitative research, championed by, for example, Bill Schlackman and his use of sensitivity panels in an attempt to provide a deeper understanding of consumer behaviour – in an era that predates the impact of neuroscience and behavioural economics. 

As Colwell cites, an MRS R&D subcommittee had concluded back in 1979 that ‘one is reminded of what they used to say in the lonely hearts column about being in love, when you are you’ll know you are…. It’s the same with qualitative research; when it’s good you’ll know it’s good’. But, as Colwell argues, in the absence of any obvious evaluative criteria the focus falls either on the judgement of the client or on the reputation of the researcher. The author lists the criteria for a ‘good’ qualitative researcher, gleaned from a number of sources, but points out that there was no research into personality traits or their correlation with qualitative practitioners’ performance – it’s about recommendation and subsequent experience.

So what about skills? Colwell discusses whether an academic background or clinical experience is essential. The evidence is somewhat contradictory, but the value of market research experience, as distinct from marketing experience, was not addressed in any of the sources used by the author. The author also found little evidence of any widely accepted training processes for qualitative researchers. 

Nate Silver, in his book ‘The Signal and the Noise’, references the work of Philip Tetlock comparing the forecasting accuracy of ‘Foxes’ with that of ‘Hedgehogs’ (see my blog post on this). Hedgehogs believe in big ideas. 

Foxes believe in taking a multitude of approaches to a problem, and can deal with nuance, uncertainty, complexity and dissenting opinions. They are also more likely to be multidisciplinary and self-critical, and to thrive on complexity, whereas Hedgehogs are more specialised and seek order. Foxes rely more on observation than theory, compared with the ideological stance of the Hedgehog. Finally, Foxes are ‘quicker to recognise how noisy the data can be, and are less inclined to chase false signals. They know more about what they don’t know’. 

However, it is the stubborn Hedgehog who displays confidence in their predictions, whereas the Fox is more cautious and provides a probabilistic, and usually more accurate, forecast – which may better reflect today’s noisy world of data and the need for qualitative research. 

Are qualitative researchers more like Foxes or Hedgehogs? Regardless of appropriate skill/personality traits, clients often still seek the comfort provided by numbers, which they may, sometimes mistakenly, view as certainty.

Colwell concludes that the ‘yardstick problem’ remained, with ‘good (and objective)’ being replaced by ‘acceptable (to the client)’ as the key evaluative criterion for clients. If quality and reliability criteria are difficult to identify, then attempts to raise standards will remain elusive. Publicising the reasoning behind interpretation might, according to Colwell, also help clients become more discerning.

As Barnham demonstrates in his paper, the issues raised by Colwell are still far from being resolved. However, today’s world seems to need qualitative skills more than ever in developing a better understanding of underlying consumer behaviour, for example in decoding the meanings within social media conversations and understanding why quantitative measures of public opinion are delivering inaccurate predictions. Whether qualitative or quantitative, reliability remains fundamental in ensuring that research is a trusted source of evidence.

So, if you are attending Impact 2017, why not come along and hear what the panel has to say about this very important topic; maybe even share your views in the debate: ‘Redefining qual and quant’, Room 3, 10.45 am, Wednesday 15 March.

See all IJMR sessions at Impact 2017.

How to access the International Journal of Market Research (IJMR)

The journal is published by SAGE; MRS Certified Members can access it on the SAGE website via this link.

Go to the IJMR website
