
Who Are the Most Frequently Mentioned Research Panels?

Editor’s Note: In the upcoming GRIT Report we dive deep into understanding industry perceptions of sample quality and potential solutions to the issues there. Concurrently with the GRIT study, a group of friends led by Kerry Hecht Labsuirs of Ramius, with the assistance of Tom Anderson of OdinText, decided to look at panel participation from the panelist’s perspective. This is a loose follow-up to the 2014 and 2015 GRIT CPR Reports, which analyzed the “freshness” of consumer participation in research. Today we feature a sneak peek of the findings, which will be presented in full at IIEX North America next month.

It also follows on the heels of a recent report by the Pew Research Center, Evaluating Online Nonprobability Surveys, which benchmarked multiple panel providers on key metrics to assess quality in terms of bias arising from sample characteristics.

The overall message from all of these efforts is that challenges remain in the sample industry, although some suppliers have certainly gone far to address them already and others are working hard to do so. I also remain unconvinced of the impact of some of these issues on the bulk of commercial research, although I certainly share the concerns when it comes to social and political polling; look no further than the massive misses in the past few major election cycles for examples, which, to be fair, are perhaps owed even more to similar issues in telephone sample frames than to online methods.

We’re going to keep working with many industry stakeholders to support and showcase all of the different efforts being made to address this fundamental aspect of research, as well as to give a platform to different perspectives. However, I want to be clear on my personal view here: online research is the driving force of global commercial research, and some of the leaders in the panel community are doing great work around quality and should be commended. Business disruption and market fundamentals buffet this subset of companies constantly, and I am confident that these forces will be navigated successfully.


By Tom Anderson

Who exactly is taking your survey?

It’s an important question beyond the obvious reasons and odds are your screener isn’t providing all of the answers.

Today’s blog post will be the first in a series previewing some key findings from a new study exploring the characteristics of survey research panelists.

The study was designed and conducted by Kerry Hecht Labsuirs, Research Director at Ramius. OdinText was enlisted to analyze the text responses to the open-ended questions in the survey.

Today I’ll be sharing an OdinText analysis of results from one simple but important question: Which research companies are you signed up with?

Note: The full findings of this rather elaborate study will be released in June in a special workshop at IIEX North America (Insight Innovation Exchange) in Atlanta, GA. The workshop will be led by Kerry Hecht Labsuirs, Jessica Broome and yours truly. For more information, click here.

About the Data

The dataset we’ve used OdinText to analyze today is a survey of research panel members with just over 1,500 completes.

The sample was sourced in three equal parts from leading research panel providers Critical Mix and Schlesinger Associates and from the third-party loyalty rewards site Swagbucks.

The study’s author opted to use an open-ended question (“Which research companies are you signed up with?”) instead of a “select all that apply” variation for a couple of reasons, not the least of which being that the latter would’ve needed to list more than a thousand possible panel choices.

Only those panels that were mentioned by at least five respondents (0.3%) were included in the analysis. As it turned out, respondents identified more than 50 panels by name.
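The filtering step described above can be sketched in a few lines of Python. This is not the OdinText implementation, only an illustrative stand-in: the responses, the panel lookup list, and the lowered mention threshold are all invented for the example (the real study used a threshold of five mentions and a panel list of over a thousand names).

```python
from collections import Counter

# Hypothetical open-ended responses; the real survey had ~1,500 completes.
responses = [
    "Swagbucks and Critical Mix",
    "swagbucks",
    "Schlesinger Associates, Swagbucks",
    "I don't remember",
    "Critical Mix",
]

# A tiny stand-in for the real lookup of known panel names.
KNOWN_PANELS = ["Swagbucks", "Critical Mix", "Schlesinger Associates"]

MIN_MENTIONS = 2  # the study used 5 (~0.3% of completes)

def extract_mentions(text, panels=KNOWN_PANELS):
    """Return the known panels named in one free-text response."""
    lowered = text.lower()
    return [p for p in panels if p.lower() in lowered]

counts = Counter()
for r in responses:
    counts.update(extract_mentions(r))

# Keep only panels mentioned by at least MIN_MENTIONS respondents.
frequent = {panel: n for panel, n in counts.items() if n >= MIN_MENTIONS}
```

A real text-analytics pass would also need to handle misspellings and name variants ("e-Rewards" vs. "eRewards"), which simple substring matching does not.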

How Many Panels Does the Average Panelist Belong To?

The overwhelming majority of respondents (approximately 80%) indicated they belong to only one or two panels. (The average number of panels mentioned among those who could recall specific panel names was 2.3.)

Fewer than 2% told us they were members of 10 or more panels.

A small number reported belonging to as many as 20 or more. Others could not recall the name of a single panel when asked, and some declined to answer the question.
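The summary statistics above come straight from per-respondent mention counts. A minimal sketch, using made-up counts rather than the study's data, shows the two figures reported here (average among recallers, and the share belonging to one or two panels):

```python
# Number of panels each respondent named (0 = could not recall any).
# Illustrative counts only -- the real survey had ~1,500 completes.
mention_counts = [2, 1, 2, 0, 1, 3, 1, 2, 0, 1]

# Restrict to respondents who could name at least one panel.
recalled = [n for n in mention_counts if n > 0]

# Average number of panels among recallers (2.3 in the study).
avg_panels = sum(recalled) / len(recalled)

# Share of recallers belonging to only one or two panels (~80% in the study).
share_low = sum(1 for n in recalled if n <= 2) / len(recalled)
```

Note that excluding the zero-recall respondents from the denominator, as the study does, raises the average relative to counting them as zeros.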

Naming Names…Here’s Who

Figure 1. Click the screenshot to open the data in an Excel file.


In Figure 1 we have the 50 most frequently mentioned panel companies by respondents in this survey.

It is interesting to note that even though every respondent was signed up with at least one of the three companies from which we sourced the sample, a third of respondents failed to name that company.

Who Else? Average Number of Other Panels Mentioned

Figure 2. Click the screenshot to open the data in an Excel file.


As expected, and, again, given that the sample comes from just the three firms mentioned earlier, larger panels are more likely than smaller, niche panels to contain respondents who belong to other panels (Figure 2).
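The "average number of other panels" metric behind Figure 2 can be computed from per-respondent membership sets. The memberships below are invented for illustration, not the study's data:

```python
# Hypothetical per-respondent sets of panels mentioned.
memberships = [
    {"Swagbucks", "Critical Mix"},
    {"Swagbucks"},
    {"Schlesinger Associates", "Swagbucks"},
    {"Critical Mix"},
]

panels = {"Swagbucks", "Critical Mix", "Schlesinger Associates"}

# For each panel: among respondents who mentioned it,
# how many OTHER panels did they mention on average?
avg_other = {}
for panel in panels:
    members = [m for m in memberships if panel in m]
    if members:
        avg_other[panel] = sum(len(m) - 1 for m in members) / len(members)
```

Because mentions are self-reported recall rather than verified registrations, these averages understate true cross-panel membership.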

Panel Overlap/Correlation

Finally, we correlate the mentions of panels (Figure 3) and see that while there is some overlap everywhere, it looks to be relatively evenly distributed.

Figure 3. Click the screenshot to open the data in an Excel file.


In a few cases where correlation is higher, it may be that these panels tend to recruit in the same places online or that there is a relationship between the companies.
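Correlating panel mentions amounts to computing Pearson correlation over 0/1 indicator vectors (one per panel, one entry per respondent), which for binary data is the phi coefficient. A self-contained sketch with invented indicators, not the study's figures:

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# 0/1 indicators: did each respondent mention the panel? (hypothetical)
panel_a = [1, 1, 0, 0, 1, 0]
panel_b = [1, 0, 0, 0, 1, 0]

r = pearson(panel_a, panel_b)
```

Running the full matrix just means applying this to every pair of panel indicator columns; high off-diagonal values flag the overlapping pairs discussed above.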

What’s Next?

Again, all of the data provided above are the result of analyzing just a single, short open-ended question using OdinText.

In subsequent posts, we will look into what motivates these panelists to participate in research, as well as what they like and don’t like about the research process. We’ll also look more closely at demographics and psychographics.

You can also look forward to deeper insights from a qualitative leg provided by Kerry Hecht Labsuirs and her team in the workshop at IIEX in June.

Thank you for your readership. As always, I encourage your feedback and look forward to your comments!

Previously posted at


6 responses to “Who Are the Most Frequently Mentioned Research Panels?”

  1. Quick correction/addition: while OdinText picked up “e-Rewards,” it somehow got excluded from the table when the results were posted. It was mentioned by 3.96% of respondents, and the respondents who mentioned e-Rewards said they were members of 4 panels on average.

  2. What seems to be interesting is that, far from this being a chore, as the current convenient industry meme would have it (shorter surveys are better, we are killing the respondents, most surveys are boring), these people seem to be active participants and, whilst recognizing some task overload, seem fairly positive about the tasks they are confronted with. Is this another myth biting the dust?

  3. @Chris, I’m not sure that’s the case at all. That data point was not included in the above analysis, but it is something we will cover subsequently, of course. But looking at the data, I can tell you that even these ‘panelists’ who willingly participate in multiple panels do in fact complain about long surveys and irrelevant questions. More to come on that…

  4. The point I was making was based on the average respondent indicating they were members of more than 2 panels, and there seems to be a long tail on that distribution. So either they are gluttons for punishment or they may be the lucky ones who find 5-minute surveys to complete – not likely in my view, but let’s see the latest data.

  5. I have been taking online surveys for several years now. The main problems are: unnecessarily long surveys, boring surveys, repetitive questions and low rewards. The sad fact is that some companies take advantage of the respondents, knowing they really need the money and have to accept the unfair rewards. Further, when I know which company the survey is actually for (sometimes it’s very obvious that the survey is for Microsoft, Comcast etc even though there is an attempt to hide that) I often lose respect for that company. How can I have a high opinion of a company that thinks my time is worth so little? Especially knowing that many in that company earn a high hourly wage while paying me $1 for 30 minutes of my time.
