Authored by David Sutterby, operations expert and Director of Client Services at Ronin International.
Setting the Stage: The Scale of the Challenge
The afternoon opened with Debrah Harding, Managing Director of MRS, providing a comprehensive overview of the Global Data Quality (GDQ) Initiative. Her presentation immediately underscored the magnitude of the challenge facing the industry, with the estimated financial impact of data quality issues in the UK alone reaching a staggering £209 million annually. This figure encompasses everything from resource time and re-fielding costs to client compensation and investment in quality solutions. Not included, but essential to consider, is the cost of making bad business decisions due to fraudulent data.
Debrah's insights into the spectrum of fraud were particularly illuminating, highlighting how the industry faces everything from participants who view research as a side hustle to organised crime operations using click farms. With speed, price, and quality the three competing factors in the equation for research buyers, increased focus on the first two inevitably compromises the third.
The introduction of the GDQ Benchmark Study represents a significant step forward in establishing measurable standards. It’s an ambitious and impressive piece of work aimed at giving the industry some concrete metrics to work with. The transparency this provides will be invaluable for providers and research buyers in understanding the situation. The upcoming ISO standard amendments and the launch of the ‘MRS Campaign for better data’ highlight key turning points which should stand as building blocks for the progress we need to make as an industry.
Navigating the Complex Sample Ecosystem
Chris Stevens was up next, offering delegates a clear view of the intricate web of relationships that characterises the modern sample source ecosystem. He was certainly not alone in finding the results eye-opening, and a real concern given that buyers are largely unaware of them.
Perhaps most striking was Chris's real-world audit example, which traced fraudulent participants through multiple sample exchanges and managed networks before reaching the end buyer. This illustration perfectly captured how quality issues can compound as they move through the ecosystem, with each intermediary applying different quality checks and processes. The lack of transparency about sub-sources and the absence of industry standards for quality checks emerged as critical areas requiring attention.
The research Chris presented on the effectiveness of various quality checks provided sobering insights for the industry and challenged some widely held assumptions about data quality indicators. His evidence-based approach to understanding what truly constitutes poor quality data versus legitimate participant behaviour is exactly the kind of rigorous analysis the industry needs.
The Promise of Standardised Feedback
Rebecca Cole and Jo Price's presentation on the Global Data Quality Feedback Loop represented one of the most practical and immediately applicable initiatives discussed during the event. Their proposed 18-point code frame for categorising data quality issues offers the industry a standardised language for discussing quality problems between sample buyers and suppliers.
The pilot study they conducted demonstrated the real-world applicability of this approach. By implementing the code frame on an actual project, they were able to break down their 11% removal rate into specific categories - revealing that 50% of removals were due to poor quality open-ended responses, while participant deduplication accounted for 19%. This level of granular insight enables much more targeted approaches to quality improvement.
What makes this initiative particularly valuable is its recognition that data quality is a shared responsibility. Rather than just a compliance tool for comparing suppliers, the feedback loop facilitates better understanding between all parties. Standardising this code frame across the industry would be a big leap forward. It would also distinguish the suppliers taking this seriously from those that would prefer not to!
Confronting Qualitative Research Challenges
The final presentation, delivered by Liz Diez, tackled the specific challenges facing qualitative research, where the impacts of fraud and AI-generated responses can be particularly damaging to research outcomes. The working group's findings revealed that 70% of qualitative researchers are aware of participant fraud, with 67% having experienced it in their projects over the past year.
The distinction between fraudulent participants, professional participants, and fraudulent participation provided much-needed clarity in an area where terminology has often been imprecise. The real-world examples shared - from obviously AI-generated football responses to suspicious prescription documentation - brought these abstract concepts into sharp focus.
Perhaps most importantly, the presentation emphasised what fraud is not. The reminder that enthusiastic participants, poor quality responses, or people who prefer to keep cameras off aren't necessarily fraudulent helps prevent the industry from becoming overly paranoid and potentially excluding legitimate participants.
Shared Experiences and Approaches
The event concluded with a candid panel discussion. Addressing the ‘price, speed, and quality challenge,’ Jo noted that we’ve all grown used to expecting cheap panel, but questioned whether buyers are willing to pay for the people they actually want, adding that the real cost savings appear later on. These are crucial considerations. How do we engage clients? The view from the panel was education: show them the cost of fraud, of low quality, and of impaired insight. We can also show the progress we are making, and the initiatives discussed today can enable us all to have this tricky conversation. As Rebecca points out, ‘prevention is the key, rather than just detection’.
Looking Ahead
As the afternoon concluded with networking and continued discussions over drinks, it was clear that this event had achieved something significant. Beyond the valuable content shared by the speakers, it had created momentum for ongoing collaboration and established frameworks for addressing data quality challenges systematically rather than reactively.
Rather than individual companies trying to solve these challenges in isolation, the MRS Operations Network provides a forum for sharing experiences, pooling expertise, and developing collective responses. With data quality remaining such a critical concern, the commitment demonstrated at this event to ongoing collaboration, transparency, and innovation provides genuine optimism for the future integrity of market research.
And as Debrah points out, ‘If we don’t get this right, we face an existential crisis!’
No pressure then…
[MRS Company Partners can access recordings of past Operations Network events here]