Editorial Peter Mouncey pp. 143–152. In this editorial, Peter Mouncey previews the articles in volume 59(2) of IJMR, a special issue covering papers from the Association of Survey Computing (ASC) seventh international conference on the theme 'Are we there yet? Where technological innovation is leading research'. The topics include a review of how technology has been embraced by market researchers over the past decade; crowd-sourcing at £0.08 per interview; modularising internet surveys in the mobile age; developing effective dashboards; and developing a programme of SMS-based research. He discusses his conference panel on whether technology has lived up to its promise to make survey research 'better' over time. Peter then congratulates the winners of the MRS Silver Medal, the MRS Award for Innovation in Research Methodology, the IJMR Collaborative Award and the IJMR Reviewer of the Year. He also highlights an article by William Davies, which analyses the current crisis facing statistics. Published 20 March 2017
Viewpoint: ‘Is technological change threatening the very existence of “traditional” survey research and, if so, what should we do about it?’ Mike Cooke pp. 153–156. In this viewpoint, the author explores the impact of technology on 'traditional' survey research and whether technological change threatens its existence. He argues that the 'survey method' is not a traditional methodology but rather a paradigm that must change when the situation requires it, just as it adapted to the invention of the telephone and the internet. The emergence of the digital society and of new cultural ecosystems has created a fresh challenge for researchers to respond to. The real threat to 'traditional research' is the increasing inability, in the commercial sector, to draw representative samples at a cost-competitive price. Researchers will increasingly link datasets that were not collected for a specific research purpose and use them, alongside qualitative and ethnographic data, to add insights to more traditional survey data. Published 20 March 2017
Forum: The practicalities of SMS research Petra van der Heijden pp. 157–172. This paper describes a survey based on SMS messaging, presenting details of the tests and pilots undertaken, the practical difficulties found and overcome, as well as an examination of the differences found between the CATI and SMS elements of the survey. It also describes pilot surveys undertaken to test mixed-mode methods: SMS-to-online and SMS-to-IVR (interactive voice response). As the paper describes, Network Research has a long-standing relationship with a top service provider, which interacts with its customers through high-street outlets, by phone and online. For more than 15 years Network Research has conducted customer satisfaction surveys to cover all those interactions, mainly using computer-assisted telephone interviewing (CATI). Throughout the relationship with this client, Network Research has periodically investigated different means of data collection as an alternative or adjunct to CATI. Its primary motivation has been to seek more cost-efficient ways of collecting timely data (proximate to a specific event) from the customer base. Cost saving is not the only criterion, however. Consideration has also to be given to impact on the customer; attribution, confidentiality and data protection issues; potential biases leading to skewed and unrepresentative scores (non-response, age, gender, geographic); representing outlet and organisation structure; and scalability. Considerable time and effort has gone into refining the CATI approach to render it as cost- and methodologically efficient as possible. Nonetheless CATI is still, relative to other options – like online or other self-completion methods – a higher-cost approach. However, it is also the ‘gold standard’ on each of the non-cost criteria above. Pure online surveys have limited application for this client: among all but digital customers, very few customer email addresses are available.
Where online research is conducted, the surveys generally suffer from low response rates. In 2015, Network Research started to supplement CATI data for the largest of the customer satisfaction surveys, among high-street customers, with a survey based on SMS messaging (or 'text messaging' via mobile phone). This paper will be of interest to anyone who is contemplating using SMS methodology. Published 20 March 2017
Observations from 12 years of an annual market research technology survey Tim Macer and Sheila Wilson pp. 173–198. Against the theme of this year’s conference, 'Are we there yet? Where technological innovation is leading research', this paper provides evidence-based observations on where technology has led research in the recent past, and where it appears to be leading now. meaning ltd has conducted a survey of market research companies around the world each year since 2004, looking at their use of technology. The survey includes a combination of tracking questions on the adoption of technology-based methods to detect long-term trends, as well as topical questions that vary from year to year, some of which are also repeated in subsequent years to measure change. Now, with 12 years of data, it is possible to draw a number of conclusions about the way in which the research industry interacts with the technology that supports it, and to understand some of the transformations that have taken place. It is also possible to look at some of the areas where the rhetoric and actual experience on the ground have not been aligned, and identify some of the challenges that may be unique to the research industry. Developers need to be mindful of such challenges if they are to succeed in providing useful, appropriate technology that will allow the market research industry to continue to develop, adapt and compete for survival.
The aims of this paper are to: quantify significant changes that have occurred in the last decade, as well as identify those where anticipated change has been slow to materialise; examine some perennial difficulties that the market research industry appears to have with technology development, adoption and diffusion; highlight some of the current and ongoing changes that emerge from the data; identify areas that those developing or providing software need to pay particular attention to when supporting these changes; and provide specific recommendations to researchers and technology providers. Published 17 March 2017
Are interviews costing £0.08 a waste of money? Reviewing Google Surveys for ‘wisdom of the crowd’ projects G.W. Roughton and Iain MacKay pp. 199–220. This paper investigates whether a 'wisdom of the crowd' approach might offer an alternative to recent political polls that have raised questions about survey data quality. Data collection costs have become so low that, as well as the question of data quality, concerns have also been raised about low response rates, professional respondents and respondent interaction. There are also uncertainties about self-selecting 'samples'. This paper looks at more than 100 such surveys and reports that, in five out of the six cases discussed, £0.08 interviews delivered results in line with known outcomes. The results discussed in the paper show that such interviews are not a waste of money. Published 17 March 2017
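The aggregation at the heart of a 'wisdom of the crowd' design can be illustrated with a minimal Python sketch (an illustration of the general technique, not the authors' implementation; the question and figures are invented): many cheap, independent estimates are pooled, with the median giving robustness against outliers and careless respondents.

```python
import statistics

def crowd_estimate(responses):
    """Pool independent numeric guesses; the median resists extreme outliers."""
    if not responses:
        raise ValueError("no responses to aggregate")
    return statistics.median(responses)

# Hypothetical low-cost responses to a question with a verifiable answer,
# e.g. 'How many sweets are in the jar?'
guesses = [120, 95, 150, 110, 3000, 105, 130, 98, 115, 125]
estimate = crowd_estimate(guesses)  # the one wild guess (3000) barely moves the result
```

With the mean, the single careless answer would dominate; the median keeps the pooled estimate near the bulk of the responses, which is one reason very cheap samples can still track known outcomes.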
Shorter interviews, longer surveys: Optimising the survey participant experience while accommodating ever expanding client demands Harvir S. Bansal, James Eldridge, Avik Halder and Roddy Knowles pp. 221–238. This paper explores strategies on how best to balance expanding survey length with the need for concise, relevant and engaging surveys, deployed in a device-agnostic format. When designing a survey we, as an industry, are often seeking a balance between competing design challenges: clients have diverse and extensive objectives, while survey participants have short attention spans and an ever increasing suite of connected devices to choose from. Survey participants are voting with their feet when surveys are not compatible with the device they want to use, whether that is the smart device in their pocket or the laptop they are working on, and this is very real for online panels. We are seeing increased abandonment rates, with knock-on effects: extended fieldwork times, smaller pools of sample to draw from and the possibility of introducing bias into our data. Having spent much of 2015 working with clients to design more smart-device-friendly surveys, Research Now has explored innovative ways to shorten survey length without compromising on the amount of material covered. Following on from work by Johnson et al. (2014), Research Now conducted a piece of primary research exploring survey modularisation, as discussed in the current paper. The approach splits questionnaires into modules, with each participant receiving only a specific module, a subset of the overall survey. The expectation is that a long questionnaire can be split and that – when the approach is applied appropriately, designed properly and implemented effectively – the data can yield results comparable with those of a full, non-modular survey.
Building on previous industry work on this topic, and primary research conducted by Research Now, we discuss our methodology, the results and conclusions from this work, and explore opportunities to automate the approach. The overall goal of this study and resulting paper is to explore how adapting survey research in this way improves rather than complicates the lives of both researchers and research participants. If we are not able to shorten our surveys, then survey modularisation may prove to be our best hope for a complete, representative dataset and we need to ensure that this is achieved accurately, confidently and efficiently at scale. Published 17 March 2017
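The modular design described above – each participant answering only a subset of a long questionnaire – can be sketched in a few lines of Python (a hypothetical illustration; the module names and assignment rule are assumptions, not Research Now's actual design). Here every respondent receives a short core module plus one randomly rotated module, so that across the sample all questions are still covered.

```python
import random

# Hypothetical questionnaire structure: everyone answers the core;
# the long remainder is split into rotated modules.
CORE = ["demographics", "overall_satisfaction"]
ROTATED = ["brand_awareness", "media_usage", "purchase_intent"]

def assign_modules(participant_id, seed=42):
    """Assign the core module plus one rotated module, deterministically per participant."""
    rng = random.Random(participant_id * 1000 + seed)
    return CORE + [rng.choice(ROTATED)]

assignments = {pid: assign_modules(pid) for pid in range(5)}
```

Deterministic per-participant seeding means an interrupted respondent can resume the same module, while random rotation spreads the remaining modules evenly enough across a large sample for the full questionnaire to be reassembled analytically.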
Successful dashboard implementation in practice: How to overcome implementation barriers and ensure long-term sustainability Alexander Skorka pp. 239–262. This paper examines how dashboard applications might be transformed in order to maintain interest and user attention. Although dashboards are an integral part of today’s marketing and market research environment, many dashboard applications share the unfortunate downside that, over time, the dashboard becomes less interesting and might be neglected by the user. The upside, however, is that you can do something about it. Consider the following questions:
Are you backed by your senior management?
Does your dashboard concept fit your corporate culture?
Does your dashboard add value and does it support the user in their daily management tasks?
Added Value 1: Have you decided on the right KPIs?
Added Value 2: Are the data easy to understand?
Added Value 3: Are you following a call-to-action approach?
Added Value 4: Is your dashboard designed to gain insight?
Added Value 5: Does your dashboard encourage users to take action?
Do you have a dashboard vision?
If you are able to answer all of these questions with ‘Yes’ right away, congratulations! You are a dashboard pro. If you can’t, here is some food for thought to help transform short-term dashboard hype into a sustainable success story. Published 17 March 2017
Book review: The twelve powers of a marketing leader: how to succeed by building customer and company value, by Thomas Barta and Patrick Barwise Malcolm McDonald pp. 263–265. This book review looks at 'The twelve powers of a marketing leader: how to succeed by building customer and company value', by Thomas Barta and Patrick Barwise. The book is aimed at practitioners but should also have value for academics. It positions itself as a 'leadership book for marketers'. The twelve powers are divided into four categories: mobilising your boss; mobilising your colleagues; mobilising your team; mobilising yourself. The book argues that successful leadership is not just about personality. The findings are based on high-quality data and analysis. The reviewer highly recommends the book, criticising only the quality of the print and the layout. Published 17 March 2017
Book review: Mindframes: 6 enduring principles from 50 years of market research, by Wendy Gordon David Smith pp. 265–267. This book review looks at 'Mindframes: 6 enduring principles from 50 years of market research', by Wendy Gordon. The book is extremely insightful and useful for qualitative researchers. It features six key mindframes: the unconscious; making sense of difference; liking; why we behave like we do; language – beneath the surface of words; context – ripples of meaning. The reviewer observes that the fundamentals of market research may be changing over time, that they would like to see business acumen as a seventh mindframe and that the mindframe perspective should be utilised outside of qualitative research. Published 17 March 2017