Editor’s Note: As people’s lives change, how we conduct market research surveys must change too. This has been a perennial challenge for the industry. Door-to-door interviewing became impractical several decades ago when women entered the workforce in large numbers and too few people were at home during the day. The pace of change hasn’t slowed, and technology is a key driver. As Jennifer Reid writes, email invites get ignored as more people live their lives via text messaging. The experiment she describes, comparing chat surveys conducted via text with emailed surveys, is a clever response to the latest challenges, and one that will be interesting to see developed further over time.
While chatbots aren’t new, the insights world is just starting to wake up to their potential. Mike Stevens of Insight Platforms recently reported that more research vendors now offer chatbot-enabled solutions that promise to engage customers more seamlessly and deliver insights faster than ever before. Market research is on the cusp of entering the chatbot era.
When new research trends emerge, it’s critical that we take a step back and understand their impact on data quality and the respondent experience. Any new technology that collects qualitative and quantitative data well but fails to deliver a good user experience is a non-starter. A good experience means people are more likely to come back and continue providing honest feedback. A subpar experience, on the other hand, hurts the reputation of the research department and diminishes the company’s overall brand.
Given that more insight teams are now exploring the use of chatbots for research, the time is right to put this tech under the microscope. At Rival Technologies, a core product in our platform is what we call chats—chatbot-enabled surveys hosted through messaging apps and web browsers. My role as a Senior Methodologist in our company is to test our own solutions and make sure that they are up to snuff, so I’ve had the opportunity to get my hands dirty with this new technology.
Recently I ran a study to examine the impact of chat surveys on the respondent experience. The study also looked at whether the data and responses you get from chats differ significantly from traditional online surveys, and what impact, if any, chats have on the demographic composition of your sample.
Why messaging platforms?
Before I dive in and share results from our study, I’d like to address one potential question from market researchers: why send surveys via messaging platforms?
The answer: email overload. Respondents receive so many emails that they ignore survey invites as a result. A 2017 study found that 74% of consumers feel overwhelmed by email, and Adweek recently reported that more than 50% of emails are deleted without ever being read.
This may sound dramatic, but finding ways to reach respondents beyond email surveys is critical to the industry’s future. Messaging and SMS have become popular ways for people to communicate with friends and family, so these are channels that we as an industry need to figure out.
A parallel study comparing chats with traditional online surveys
To compare the chat experience with the traditional survey experience, we ran a multi-stage project in July 2018. We used the same set of questions for both chats and traditional surveys, and we chose a topic relevant to most people: attitudes toward sunscreen use.
The sample for both surveys came from InnovateMR’s panel and river sample. At the end of the surveys, we asked people about their experience. Here are some notable takeaways.
Takeaway 1: Chats deliver a more enjoyable respondent experience.
The study asked respondents about four different aspects of their experience: enjoyment, fun, ease, and length.
Regarding the first two, the data is very clear: people like the chat experience much more than the traditional online survey experience. An overwhelming 88% of participants told us they found the chat experience either “much more” or “somewhat more” enjoyable than other surveys they’ve taken in the past, while only 68% of those who took the traditional survey agreed with the same statement. (We applied survey design best practices to the traditional survey.)
Open-ended responses revealed that respondents liked how chats felt more like a conversation than a survey. In my opinion, the fact that chats allow a back-and-forth with the respondent and make it easy to use emojis, GIFs and videos also contributed to the positive feedback we received about the experience.
Takeaway 2: People find chats easier to complete
The chat interface is familiar to respondents because chats use the same channels people use to message friends and family. As a channel for sending out surveys, however, chats are new, so I was curious whether research participants would find the new experience difficult. The data shows chats are seen as at least as easy as, if not slightly easier than, traditional surveys to complete.
Our data also shows that people who took the traditional survey and those who did the chats had similar perceptions of the length of the activities. This isn’t surprising, since we used the same number of questions for both. It is worth noting, however, that the traditional survey had fewer questions than most surveys would.
Takeaway 3: Chats do not introduce demographic skews
I was pleased to see that people love chats, but as a veteran researcher, I wanted to know if this new way of engaging consumers attracted certain demographics more than others.
Chats lend themselves well to distribution via SMS and messaging platforms—channels that are very popular with younger consumers. So while it makes sense that a younger demographic would find chats appealing, some researchers might wonder whether this new way of engaging consumers would alienate other demographics.
Our finding: no skew was introduced into the data by sending panel sample to a chat rather than a traditional survey. There were no significant differences between the two groups in age distribution; the chat data showed a small uptick among younger (under 44) consumers, but nothing to suggest that older respondents were put off by the new survey method.
Gender, area (rural, suburban and urban), education and race distribution were also comparable in the two groups. You can download the full report if you’d like to see all the charts.
No big differences in data either
So, you might be wondering about the actual results of our chats and surveys. (As a reminder, our questions were about sunscreen usage and attitudes.) Did the “chat people” have a different view from those who took the traditional survey?
Not really. The results from the two groups were very similar, and the same conclusions can be drawn from both data sets.
This isn’t super surprising. After all, research participants for this study came from the same sample and, as discussed earlier, had a similar demographic composition. The chat itself does not introduce any skew when the questions and the sample source are the same.
More research on research on the way
We’re in the very early days of chats, and my intention is to continue testing this innovation as the technology evolves. In particular, keeping an eye on the respondent experience is a big priority for me and my team.