Editor’s Note: In the upcoming GRIT Report we dive deep into industry perceptions of sample quality and potential solutions to the issues there. Concurrently with the GRIT study, a group of friends led by Kerry Hecht Labsuirs of Ramius, with the assistance of Tom Anderson of OdinText, decided to look at panel participation from the panelist’s perspective. This is a loose follow-up to the 2014 and 2015 GRIT CPR Reports, which analyzed the “freshness” of consumer participation in research. Today we feature a sneak peek at the findings, which will be presented in full at IIeX North America next month.
It also follows on the heels of a recent report by the Pew Research Center, Evaluating Online Nonprobability Surveys, which benchmarked multiple panel providers on key metrics to assess quality in terms of bias arising from sample characteristics.
The overall message from all of these efforts is that challenges remain in the sample industry, although some suppliers have certainly gone far to address them already and others are working hard to do so. I also remain unconvinced of the impact of some of these issues on the bulk of commercial research, although I certainly share the concerns when it comes to social and political polling; look no further than the massive misses in the past few major election cycles, which, to be fair, are perhaps owed even more to similar issues in telephone sample frames than to online methods.
We’re going to keep working with many industry stakeholders to support and showcase all of the different efforts being made to address this fundamental aspect of research, as well as to give a platform to different perspectives. However, I want to be clear on my personal view here: online research is the driving force of global commercial research, and some of the leaders in the panel community are doing great work on quality and should be commended. Business disruption and market fundamentals buffet this subset of companies constantly, and I am confident these forces will be navigated successfully.
By Tom Anderson
Who exactly is taking your survey?
It’s an important question beyond the obvious reasons and odds are your screener isn’t providing all of the answers.
Today’s blog post will be the first in a series previewing some key findings from a new study exploring the characteristics of survey research panelists.
The study was designed and conducted by Kerry Hecht Labsuirs, Research Director at Ramius. OdinText was enlisted to analyze the text responses to the open-ended questions in the survey.
Today I’ll be sharing an OdinText analysis of results from one simple but important question: Which research companies are you signed up with?
Note: The full findings of this rather elaborate study will be released in June in a special workshop at IIeX North America (Insight Innovation Exchange) in Atlanta, GA. The workshop will be led by Kerry Hecht Labsuirs, Jessica Broome and yours truly. For more information, click here.
About the Data
The dataset we’ve used OdinText to analyze today is a survey of research panel members with just over 1,500 completes.
The sample was sourced in three equal parts from leading research panel providers Critical Mix and Schlesinger Associates and from the third-party loyalty rewards site Swagbucks.
The study’s author opted to use an open-ended question (“Which research companies are you signed up with?”) instead of a “select all that apply” variation for a couple of reasons, not the least of which being that the latter would’ve needed to list more than a thousand possible panel choices.
Only those panels that were mentioned by at least five respondents (0.3%) were included in the analysis. As it turned out, respondents identified more than 50 panels by name.
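To make the mention-counting step concrete, here is a minimal sketch of how open-ended panel mentions can be tallied and filtered by a minimum-respondent threshold. This is an illustration only, not OdinText's actual method; the responses, alias list and threshold below are invented for the example (the study itself used a threshold of five respondents out of roughly 1,500).

```python
from collections import Counter

# Hypothetical open-ended responses; the real study had just over 1,500 completes.
responses = [
    "Swagbucks and Critical Mix",
    "swagbucks",
    "Critical Mix, Schlesinger",
    "I don't remember",
]

# Known spellings mapped to canonical panel names (illustrative, not exhaustive).
aliases = {
    "swagbucks": "Swagbucks",
    "critical mix": "Critical Mix",
    "schlesinger": "Schlesinger Associates",
}

def count_mentions(responses, aliases):
    """Count how many respondents mention each panel (at most once per respondent)."""
    counts = Counter()
    for text in responses:
        lower = text.lower()
        for alias, canonical in aliases.items():
            if alias in lower:
                counts[canonical] += 1
    return counts

MIN_MENTIONS = 2  # the study used a cutoff of 5 respondents (0.3%)
counts = count_mentions(responses, aliases)
kept = {name: n for name, n in counts.items() if n >= MIN_MENTIONS}
print(kept)
```

A real implementation would also need fuzzy matching for misspellings and partial names, which is where a text-analytics tool earns its keep over simple substring matching.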
How Many Panels Does the Average Panelist Belong To?
The overwhelming majority of respondents—approximately 80%—indicated they belong to only one or two panels. (The average number of panels mentioned among those who could recall specific panel names was 2.3.)
Fewer than 2% told us they were members of 10 or more panels.
Finally, even fewer respondents told us they were members of as many as 20 or more panels; others could not recall the name of a single panel when asked, and some declined to answer the question.
Naming Names…Here’s Who
Figure 1 shows the 50 panel companies most frequently mentioned by respondents in this survey.
It is interesting to note that even though every respondent was signed up with at least one of the three companies from which we sourced the sample, a third of respondents failed to name that company.
Who Else? Average Number of Other Panels Mentioned
As expected—and, again, bearing in mind that the sample comes from just the three firms mentioned earlier—larger panels are more likely than smaller, niche panels to contain respondents who belong to other panels (Figure 2).
Finally, we correlate the panel mentions (Figure 3) and see that while there is some overlap everywhere, it looks to be relatively evenly distributed. In the few cases where correlation is higher, it may be that these panels tend to recruit in the same places online or that there is a relationship between the companies.
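The correlation step above can be sketched as a pairwise comparison of binary membership indicators (one column per panel, one row per respondent), using the phi coefficient, which is simply Pearson correlation applied to 0/1 data. This is an assumption about the mechanics, not a description of the actual Figure 3 computation, and the panel names and membership matrix below are invented.

```python
import math

# Hypothetical 0/1 membership matrix: one row per respondent,
# one column per panel (names invented for illustration).
panels = ["Panel A", "Panel B", "Panel C"]
members = [
    [1, 1, 0],
    [1, 1, 0],
    [0, 1, 1],
    [1, 0, 1],
    [0, 0, 1],
]

def phi(col_x, col_y, rows):
    """Pearson correlation of two binary columns (the phi coefficient)."""
    xs = [r[col_x] for r in rows]
    ys = [r[col_y] for r in rows]
    n = len(rows)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    return cov / (sx * sy)

# Print every pairwise correlation, as in a correlation matrix.
for i in range(len(panels)):
    for j in range(i + 1, len(panels)):
        print(panels[i], panels[j], round(phi(i, j, members), 2))
```

Unusually high pairwise values in such a matrix are what would flag panels that recruit from the same places online or share an ownership relationship.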
Again, all of the data provided above are the result of analyzing just a single, short open-ended question using OdinText.
In subsequent posts, we will look into what motivates these panelists to participate in research, as well as what they like and don’t like about the research process. We’ll also look more closely at demographics and psychographics.
You can also look forward to deeper insights from a qualitative leg provided by Kerry Hecht Labsuirs and her team in the workshop at IIeX in June.
Thank you for your readership. As always, I encourage your feedback and look forward to your comments!
Previously posted at http://odintext.com/blog/look-whos-talking-part-1full-research-panels/