Research Technology (ResTech)

June 6, 2014

The Newest Research Technologies Need Solid, Traditional Thinking Toward Their Use

New techniques tempt us to use them, and in so doing invite the purpose-process disconnect. Let’s take the time to choose them correctly.


by Paul Conner


Tom H.C. Anderson said it very well recently in a LinkedIn discussion group:

“Too often data is chosen for analysis because a vendor or boss tells you it is important rather than [supporting] a decision originating with the original research question to be solved. Arguably, nowhere is this currently more true than the pre-occupation with twitter analytics; chosen because it’s free and easy to get to, yet representative of less than 8% of the population and severely lacking in signal quality.”

— Tom H.C. Anderson in The Future of Market Research.
http://www.tomhcanderson.com/2014/02/26/the-future-of-market-research/.

Our industry is currently seeing an influx of outstanding new data collection and analysis technologies that let us explore what’s on consumers’ and shoppers’ minds and hearts, both consciously and non-consciously, without intrusive measurement. These include, as Tom Anderson notes, social media analytics, along with techniques from neuroscience and social and cognitive psychology. This is wonderful technology, and it gives us the opportunity to provide our clients with insights that reflect how consumers and shoppers actually make their decisions.

However, to Tom’s point, all of this new glitter can be blinding. How many of our clients, and how many of us, have become so enamored with the technologies that we forget that they are nothing more than new tools that help us make critical, yet fundamental, decisions that we have always made with the information at our disposal? Should we launch this product, or not? Who should we target? How should we promote our product? What should we charge? What channels should we use? How do we display our products in stores or online? The list goes on and on.

Most of the applications we face are not new. And even where new applications have crept in recently (e.g., should we develop an app for our product, or not?), the fundamental way in which we use the information at our disposal, whether it comes from self-reported accounts of how people feel about a product or from facial coding that indicates how they feel about it, should not be new. Namely, the information we collect should be specifically relevant to the decisions we have to make.

For example, if an application is to decide whether or not to pursue production of a new product, and if the criterion for making that decision is “pursue production if at least 10% of the general population is aware of the product category,” then why would one need to conduct a facial coding study to see how people feel about the product category? One may argue that how people feel about the product category should be a criterion for the decision, which would make facial coding a relevant method, but that misses the point. For whatever reason, awareness may be the lone criterion and, if so, facial coding should not be considered as a method.

Despite the logic above, new techniques are often chosen because they are new rather than because they address a specific, well-thought-out, reasonable set of criteria. And when we choose a technique because it’s new, or leading-edge, or receiving substantial media coverage rather than because it’s relevant, how often does our research end up in the “what do I do with these data” pile?

In The AIM Process: A Systematic, Stepwise Procedure for Improving the Actionability of Marketing Research, I deal with this issue. I call it the “purpose-process disconnect.” Marketing research is an applied discipline, the purpose of which is to support marketing decisions and actions. But too often marketing researchers and their clients start the research design process with a technique or an information-based question in mind, rather than the actual decisions and actions (a.k.a. applications) they have to make when the research is completed and the criteria upon which they will base those decisions and actions. In other words, much of the time applications are disconnected from the process.

From the book, here are some examples of requests I’ve received to start the research design process:

• “Hey Paul, I need some research to see how people like all the components of our new service. Can you write me up something on that?”

• “Hey Paul, we need some focus groups next week to see what people think of some names for our product. Can you set those up for me?”

• “Hey Paul, this weekend we need some research to find out if people are aware of our advertising campaign or not and we need to find out who is and who isn’t. I know it’s Thursday, but can you please get that done so that we can report to our CEO on Tuesday?”

Missing from each request are the specific decisions and actions to be taken on the basis of the research (the applications) and the rules or guidelines upon which those decisions and actions will be based (the applications criteria). Because of this, research requested and carried out in this way stands a good chance of being unactionable.

Our new techniques tempt us to use them, and in so doing invite the purpose-process disconnect. They are wonderful techniques, but let’s take the time to choose them correctly. Together with our clients, let’s take the time to clearly define our applications and their criteria, admittedly a challenging endeavor to say the least. If the new techniques fit, then by all means, use them! And let the insights that follow justifiably reflect the glitter they bring to our industry.

For more information about The AIM Process: A Systematic, Stepwise Procedure for Improving the Actionability of Marketing Research visit http://www.theaimprocess.com. The book is available in print and Kindle on Amazon.com at http://amzn.to/1goYa74.


Tags: emerging technologies, innovation, neuroscience, social media

