Research Methodologies
September 13, 2012
What’s the difference between a $6 respondent and a $25 respondent? Is a $25 respondent better in terms of recruiting practices?
Editor’s Note: All I can say here is don’t shoot the messenger, folks! There is a reason that the two aspects of MR facing massive new competitive threats are sample access and survey-based research, and Jason explores part of that reason in this post.
By Jason Anderson
In my role as a client-side researcher, I am blessed with an unbelievably large global customer database in the tens of millions of people. Not email addresses, but real people that I know exist in the real world by virtue of their purchase histories, credit card information, and behavioral metrics on our various online services. Because of this blessing, over 90% of the survey work I field is CRM-driven.
But there’s still that other 10%. And while I have a reasonable degree of confidence about what the sources of error and bias are in my CRM-based sampling efforts, my trust in panel recruiting erodes more and more every month. Consider, for example, a recent vendor selection experience:
Names obscured to protect the innocent. But nobody was truly “innocent” in this exchange, because from a distance it becomes obvious that the “value” of a completed survey is arbitrary, driven not by data quality or service quality but by the desire to win the bid.
Worse yet, I question whether that completed survey is even worth $6 to begin with. As an experiment, I joined one of the name-brand panels as a “panelist” under a pseudonym one month ago, completing as little of the registration process as necessary to qualify for surveys. (Don’t worry, I haven’t polluted any of your actual work with fake responses. But I’ll come back to that in a moment.) Between August 3 and September 10, I received 25 survey invitations. That’s roughly 5 surveys per week. The panel’s frequency-of-contact guidelines explicitly say no more than one invitation every two days and no more than 12 per month. That pace violates both limits.
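The arithmetic is easy to verify. A minimal Python sketch, using only the dates and counts quoted above (the policy limits are as stated by the panel; everything else is simple division):

```python
from datetime import date

# Figures from the post: 25 invitations between Aug 3 and Sep 10, 2012.
invites = 25
start, end = date(2012, 8, 3), date(2012, 9, 10)
days = (end - start).days  # 38 days in the window

per_week = invites / days * 7    # average invitations per week
per_month = invites / days * 30  # average invitations per 30-day month

# The panel's stated frequency-of-contact limits.
max_by_gap = days // 2  # "no more than one invitation every two days"
monthly_cap = 12        # "no more than 12 per month"

print(f"{per_week:.1f} invites/week, {per_month:.1f} invites/month")
print(f"Exceeds every-two-days limit: {invites > max_by_gap}")
print(f"Exceeds 12-per-month cap: {per_month > monthly_cap}")
```

At one invitation every two days, at most 19 invitations fit in that 38-day window; 25 arrived, and the monthly pace works out to nearly 20 against a stated cap of 12.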
“Oh, but it can’t be that bad! Most panelists are legitimate.” Let’s assume for a moment that this hypothesis is correct, and that panelists are recruited through completely legitimate efforts. For example, perhaps they were on Google and searched for “surveys” (see right).
Hmm. (And by the way: I’ve never been offered $20 to complete a panel survey. Which panel do I need to join?)
Creating a fake panelist account is fast and painless; identity verification on the Internet is nearly impossible. But it’s not just respondents committing fraud in this process — or rather, the panels themselves are complicit. The lack of transparency in downstream processes invites opportunism, if not straight-up rule-breaking.
Consider: What’s the difference between a $6 respondent and a $25 respondent? Is the $25 respondent substantially better in terms of recruiting practices, data quality, and policy integrity? Or was the $25 respondent simply a $6 respondent that had been purchased from another source and marked up? And how can you tell the difference?
Answer: You can only tell the difference in quality if you are told which panels are being used, and how those panels manage their database. I haven’t found full-service research agencies to be terribly eager to share that sort of information, because:
So what’s a client to do? I have three rules for myself:
Disclaimer
The views, opinions, data, and methodologies expressed above are those of the contributor(s) and do not necessarily reflect or represent the official policies, positions, or beliefs of Greenbook.