
Why Respondents Don’t Like Participating In Research (And What We Can Do About It)

The newly released GRIT CPR (Consumer Participation in Research) study showed that the majority of the people who have willingly given up their time, often for little or no reward, are dissatisfied with their experience participating in research.


In the Q3-Q4 2016 edition of the GRIT Report, we asked participants to rank various factors in importance when designing a study. Respondent Experience was at the absolute bottom of the list, which we found quite alarming. Participants are the lifeblood of market research, and disregarding the respondent experience in the research process is counter-productive to say the least.



Customer-centricity, user experience, engagement, and design are at the heart of product development and marketing, yet they are hardly even a consideration in research. In years past they didn't have to be, but that is a legacy perspective that MUST be jettisoned for our industry to be effective in the 21st century. More than that, it's necessary for survival: many, many options outside the traditional MR space now exist that insights buyers can use to get the information they need to support their decisions; we stopped being the only game in town long ago. Those competing approaches have user experience built in from the ground up and often overtly reinforce brand relationships with consumers.

As a researcher, I get it; every time we field the full GRIT study I get an earful from other researchers about their experience with the survey design, usually negative. It's easy to get defensive and rationalize those concerns away (perhaps even rightfully so!), but the bottom line is that people have a choice in how they spend their time. If we ask for some of that time and don't make it a good experience, we run the risk of becoming like that friend or family member who is always asking for a favor we don't want to do, so we just start ignoring them as much as possible. Or even worse, research starts to be equated with other unpleasant things like going to the DMV, preparing taxes, dental appointments, or cleaning cat litter boxes!


We weren’t the only ones who found this situation to be cause for concern, so we reached out to various key stakeholders in the industry and developed a concept for asking consumers directly about their experience participating in research. AYTM – Ask Your Target Market, Dalia, Focus Pointe Global, G3 Translate, GRBN, the Global Research Business Network, reportbook by IfaD, Lightspeed, Mobile Digital Insights (MDI), Multivariate Solutions, RECOLLECTIVE (Ramius Corporation), Reconnect Research, Research Now, SSI, Toluna, and Virtual Incentives all joined us in fielding this new GRIT CPR (Consumer Participation in Research) study in March of 2017.

The groundbreaking study was conducted in 15 countries and 8 languages among 6,208 consumers via online, telephone, and mobile-only surveys.

We asked questions surrounding types of research they participate in (qual and quant), frequency of participation, preferred method/device for participation, how they want to receive invitations, what rewards they want, the impact of survey design, and more. In doing so, we discovered a tremendous amount about how consumers view research, and much of it is less than optimal for our industry.

It’s time to bring the participant experience to the forefront, and this report is a key tool to help us do so.

You can download the full report and access the data here, but I’ve included some of the highlights in this post as well.

A key finding is that, in aggregate, only a quarter of all respondents globally are satisfied with their experience participating in research, indicating that researchers' failure to prioritize the respondent experience shows through to respondents.



Additional eye-opening findings are:

  • Over half of all respondents admitted that the design of a survey impacts their willingness to complete it.

  • 45% of respondents said surveys should be less than 10 minutes in length.

  • One third of all respondents cite a desire to earn rewards or prizes as their primary reason for participating.

  • Cash may be King, but Virtual Cards are Queen: across all sample types, countries, and demographics, respondents want incentive flexibility.


Overall, the results of the study just reinforced our belief, set forth initially in the GRIT Report, that our industry does a poor job of putting the respondent first, despite having the means and knowledge to do so. We should capitalize on that and bring the participant experience to the forefront.

So what to do? Well, based on these data a “Top 5” priority action list could be:

1.) Go “mobile first” in designing studies.

2.) Stay under 10 minutes.

3.) Think like game designers, marketers, or UI experts when designing research.

4.) Offer a fair value exchange: reward respondents the way they want to be rewarded, and give them choices.

5.) Use research as a brand engagement and relationship building opportunity.

Other ideas can be found in the recent GRBN Special Report: Improving the online survey user experience.

The GRIT CPR study is a global call to action for the entire industry: clients, suppliers, and everyone in between. We MUST change, or risk losing access to respondents.

What is the “so what” in all of this? We as an industry must change our ways, and respondents have just given us a pretty clear set of directions on how to do that. The way we have always conducted research may have met our needs in the past, but the world has changed and people simply expect more from their relationships, including research.

We’ve distilled the message from the GRIT CPR study into a blueprint for success: a three-part action plan that we believe will go far in helping the industry capitalize on these learnings and overcome the challenges we have identified.


Finally, what isn’t measured isn’t managed, so we encourage everyone to participate in the GRBN TRUST & PARTICIPANT ENGAGEMENT Initiative for UX benchmarking. You can find out more here:

If you want to explore the results of this study on your own, you can do so here:

The full report can be found here:


12 responses to “Why Respondents Don’t Like Participating In Research (And What We Can Do About It)”

  1. A very interesting report and some very important points made in the conclusions. The ability to pull off over 6000 interviews in 15 countries speaks highly for the partners in this study.

    The findings are indeed very clear. As an industry, we do need to be more respectful of respondents in terms of the tasks we ask them to perform and how they are remunerated.

    However, one conclusion I found hard to agree with from the study was “make it mobile first”. This is actually listed as the first of the “Top 5 Priorities” from the study. I have a different view and it is based on the data I had access to in this GRIT/CPR study.

    In fact, a number of key indicators in the study might actually suggest we should place more focus on older methodologies, like online interviewing via PC or laptop, and bring them back into play.

    Let’s consider some of the findings:

    Firstly, mobile is not the preferred method of survey participation in the study. In fact, “a survey done on laptop or PC” is clearly the most preferred, and by a long shot (52% for PC-based surveys versus only 34% for mobile).

    Secondly, the preference for surveys on PCs matches almost exactly the current incidence of the usual ways respondents participate in surveys, so it looks like this preference is based on real experience.

    Third, satisfaction with particular data collection methodologies also suggests lower satisfaction with mobile data collection versus online PC surveys. Top 3 Box scores show a 54% satisfaction rating for PC-based surveys versus 46% for mobile. Even old-fashioned face-to-face gets better satisfaction ratings than mobile!

    Fourth and of some concern – there is a particular age bias for mobile (which you can find by accessing the raw data link). The percentage of participants in surveys who are “50 years and older” comes in at 39% for online PC surveys but only reaches 16% for mobile.

    This has always been a problem with mobile – its poor reach to older targets. And the reverse age bias is also noticeable. Four in ten of these mobile survey participants are aged 30 years or less. The “total” sample in this study suggests a figure closer to three in ten may be more representative.

    Fifth, the design of the survey apparently impacts participation rates, according to this study: over half of the study respondents indicated this factor was a turn-off. It is no great leap of imagination to posit that mobile surveys would suffer most from design issues, given the small screens, the difficulty of showing image batteries, and all the problems around the quality of mobile networks in many countries.

    And that may explain why over half of mobile survey participants in this study cite bad design as a problem, versus only 41% for PC surveys and 28% for personal surveys.

    It’s hard for me to see how “putting mobile first” is supported by these data. The (somewhat tarnished) gold standard here seems to be the old-fashioned, online, PC-based survey!

  2. Chris makes some great points, so I’ll focus on a different aspect. Because of the congressional race here in the Atlanta suburbs, we’ve been inundated with phone surveys. I’ll always answer one, because how could I not? But the quality of most of these surveys is atrocious, and it goes right to the “respect the respondent” point. We had 18 candidates running, and many of the phone surveys would start reading the full list. Worse, you couldn’t stop until the list was completely done, even though I knew right off who I was voting for. And if the company that did this is reading: you could have had the robo-voice say thank you at the end.

    My advice – use the mom test. After you write a survey, call your mom and have her take it. She’ll tell you what doesn’t make sense.

  3. It feels like it’s time we as researchers work with our programmers, be it our suppliers and/or in-house teams, to incorporate UX best practices into and throughout the initial content of the survey, as opposed to applying UI best practices once the survey content has been finalized. I feel certain that mobile-first UX, engagement best practices, and meaningful rewards systems will let consumers know that we value them and their time. We at DMG will strive to do our part as a research team to take a different approach, build these practices into our survey development process, and work with our suppliers earlier.

