It’s been a while since I actually posted a blog of my own. Not for lack of desire, or even of content to share, but simply because I’ve been incredibly busy with work related to Gen2 Advisors, Insight Innovation Exchange, and GRIT. Thank goodness there has been a flood of fantastic submissions from our network of GreenBook Blog authors to fill the gaps!
What’s been interesting through these last few busy weeks is that despite the varied nature of the projects I’ve been working on, there has been a common theme running throughout them all: shifting research models. To be clear, this work hasn’t been driven by preparing for some future change, but rather because the shift has been underway for some time and is now gaining momentum.
This has been fueled by many conversations with client-side organizations, technology companies, research suppliers, and thought leaders I respect, such as Gregg Archibald, Todd Powers, Alec Maki, and many others. This post incorporates elements of the various dialogues I’ve been having with these folks in an attempt to synthesize it all.
When looking at this quiet rEvolution in the research/insights industries, start by considering the example of social media. The recent rise of, and fascination with, all things social in the digital ecosystem reflects many of the changes that are sweeping the marketplace. This naturally touches many other areas of the insight industry, including topics such as big data, mobile, geo-location, online communities and yes, even the humble survey. The transformational impact of the social media age cannot be overstated; it is indeed pervasive.
To put it in perspective, consider this quote in Digiday from Marc Pritchard, Global Marketing and Brand Building Officer of P&G, regarding the company’s view on how technology (social and mobile especially) is changing the game for them:
“To address these [technology] forces, our vision is to build our brands through lifelong, one-to-one relationships in real time with every person in the world,” (emphasis added) Pritchard said. “The power of everyday people is driving monumental change and people power favors brands like ours. We have trusted brands that are part of everyday life. We genuinely care about serving people with superior benefits and doing good.”
“Technology will mean that people will increasingly expect brands to understand their unique needs and deliver,” Pritchard said. “We want P&G to be the first to create this trusted, indispensable relationship because it will create greater loyalty, more purchases across categories, and more sales at lower costs. Achieving this vision requires some fundamental shifts in how we operate.”
The essence here is that social and mobile technologies will be leveraged to drive engagement and understanding of consumers at a previously unimaginable level, increasing basic business value. That has profound implications for the insight organization on both the client and supplier side of the value chain.
The quote above illustrates the role that big data (with social media data being a huge component) will play in helping to inform a significant aspect of the “one-to-one” model by default. And this will force other changes that are already beginning to play out via new entrants, such as Facebook, Google and Twitter, in the existing MR space. Consolidation between community providers and consulting organizations, and partnerships between social platforms and research companies are also well underway.
There is a realization in the business community that insights do not arise solely as a product of traditional research models. Instead, these new platforms are providing tools for discovery and ideation that are faster, more unexpected, more customer-focused and quite often, more productive than existing approaches.
Enterprise organizations acknowledge that the accuracy with which a company can understand its customers determines that company’s success (or lack thereof). But the problem for them is that there are a lot of customers, they have a lot to say, and they don’t necessarily say it clearly. Indeed, the customers themselves may seldom know what they mean, what they want, or what they would do if they got what they wanted. To complicate matters further, customers also experience a wide variety of feelings and attitudes toward the products and services they buy, and often are (1) somewhat unaware of the origin or complexity of those feelings, and (2) unable to articulate them in a way that is informative to companies that care about addressing those feelings and improving the customer’s experience.
At the end of the day, companies want to provide for needs and wants in a way that makes economic sense.
The perfect way to do that is to understand each consumer’s individual needs and meet them. This means having a one-to-one understanding of each of these people, and executing accordingly. This approach covers the first part of the premise I laid out, but it doesn’t do much for the “economic sense” part.
In order to balance the two components, we have said, “let’s group consumers by similarities in how they connect with our product.” Segmentation improves “economic sense” by addressing a larger audience, gaining economies of scale. But at the same time, it detracts from providing for needs and wants, because the group is never the same as the individual: similar on some criteria, but not identical on those criteria, and not similar in other ways.
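To make the segmentation idea concrete, here is a minimal sketch of how “grouping consumers by similarity” might look in code, using a naive k-means clustering over two entirely hypothetical consumer axes (the feature names, scales, and data points are illustrative assumptions, not anyone’s actual methodology):

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Naive k-means: group consumers into k segments by feature similarity."""
    rng = np.random.default_rng(seed)
    # start with k distinct consumers as the initial segment centers
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # assign each consumer to the nearest segment center
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each center to the mean of its assigned consumers
        for j in range(k):
            members = points[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return labels, centers

# Hypothetical consumers described by two made-up axes:
# (price sensitivity, brand engagement), each on a 0-1 scale.
consumers = np.array([
    [0.90, 0.10], [0.80, 0.20], [0.85, 0.15],   # price-driven
    [0.20, 0.90], [0.10, 0.80], [0.15, 0.85],   # brand-loyal
])
labels, centers = kmeans(consumers, k=2)
print(labels)
```

The point of the toy example is exactly the trade-off above: each segment center is the *mean* of its members, so no individual in the segment is actually described perfectly by it.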
To gather all the information we need to understand the individual (in the context of a relationship with a product) would be a difficult task even if we were dealing with just a small group of people. But we are talking about almost 7 billion people. Those people differ from one another on a huge number of axes: economics, education, values, health, emotions, family, opportunity, culture, physical nature, life stage, housing, etc.
The nature and sheer volume of today’s raw data is the first step on a path that can move us closer to understanding these 7 billion people. But the data is still relatively raw and unstructured. The next major steps are to bring the data together, analyze it, and interpret it into the needs and wants of individual consumers. We are not there yet. Nor is the mass customization that is necessary to make “economic sense” of individual consumer needs. The interim suggests smaller, holistic, relevant segments: not a bad step and not a pipedream. But bear in mind, it is only an interim step. The ability to bring it all together at scale is rapidly coming.
To date, a variety of “traditional” market research and business intelligence approaches have been the primary means of trying to understand these variables. From surveys to focus groups, neuromonitoring to facial scanning, and data mining to predictive analytics, brands have had many tools available to try to understand and structure the process of insight generation.
The issue for brands is complex, but the goal strikingly clear: to understand their customers, brands need to listen to them, and not just to what they have to say, but also to why they feel they have to say it. And they need to do it in real time and at web scale. To do that, they need a model that allows for the aggregation and synthesis of data into (to borrow a term from my friend Alec Maki) a common information framework.
This has been a challenge for existing methods due to issues of cost, scalability, sampling, and hundreds of other business and procedural issues. We are seeing a future unfold right now where the establishment of common information frameworks will function as the central driver of business insights across the organization, and the traditional role of research professionals will evolve into facilitating the “why” of the business issue rather than collecting the “who, what, when, where, and how” of data.
With such realities in mind, Gartner estimates that by 2014, 30% of analytic applications will use in-memory functions to add scale and computational speed; 30% of analytic applications will use proactive, predictive and forecasting capabilities; and 15% of all business intelligence deployments will combine business intelligence with collaboration and social software to become decision-making environments. Thus, with the requirements for real-time assessment, the massive volume of data, and the exponential growth of client expectations, companies need to look beyond the status quo to advanced text analytics and high-performance structured data computing techniques. That is the combination that will drive the future.
Don’t take my word for it though: check out this news from yesterday: Facebook to Partner With Acxiom, Epsilon to Match Store Purchases With User Profiles. Or two stories last week from Nielsen: Nielsen Ratings Change: Broadband, Console Viewing And More Measurement Coming For TV and Billboard and Nielsen Add YouTube Video Streaming to Platforms. And finally, this gem from Raytheon: Raytheon’s Riot program mines social network data like a ‘Google for spies’. I could cite many more examples, but I think you see where this is going.
What is emerging is the rapid growth of new entrants that combine platform scale and efficiency with the analytical breadth to process vast amounts of aggregated and synthesized data in real time and deliver real insights to clients. These platforms will become centralized data engines, integrated with multiple data streams to become one hub where all content can be aggregated and processed. Such advancements allow for the real-time processing of sophisticated algorithms on huge volumes of both structured and unstructured data, and that will take marketing insight organizations well into the next decade.
The biggest gap in the industry right now is around this concept of the common information framework. It is the missing contextual piece for big data and the unifying model that gives MR (in all its forms) a chance to really deliver value in the future. From my work with client organizations, the demand is loud and clear: they want to holistically understand consumers across all brand touchpoints, and to do that they need a real structure to synthesize the data coming in from all of the various old, new, and future streams. That structure is the information framework. Research suppliers, social media analytics, CRM, etc. just become plugins at that point, delivering discrete value as well as aggregated value.
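To illustrate what “data streams as plugins into a common framework” could mean in practice, here is a toy sketch. Every class, field, source name, and scoring rule below is a hypothetical illustration I made up for this post, not a description of any real product: each source registers an adapter that translates its own records into one shared schema, so the hub can aggregate across sources.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Observation:
    """One record in the shared schema, regardless of which stream it came from."""
    consumer_id: str
    touchpoint: str   # e.g. "survey", "social", "crm"
    sentiment: float  # normalized to -1..1 by the plugin's adapter

class InsightHub:
    """A centralized engine where every data stream plugs in via an adapter."""
    def __init__(self):
        self.plugins: Dict[str, Callable[[dict], Observation]] = {}
        self.observations: List[Observation] = []

    def register(self, source: str, adapter: Callable[[dict], Observation]):
        self.plugins[source] = adapter

    def ingest(self, source: str, raw: dict):
        # the source-specific adapter maps a raw record into the shared schema
        self.observations.append(self.plugins[source](raw))

    def sentiment_by_consumer(self) -> Dict[str, float]:
        # aggregated value: average sentiment per consumer, across all streams
        totals: Dict[str, List[float]] = {}
        for obs in self.observations:
            totals.setdefault(obs.consumer_id, []).append(obs.sentiment)
        return {cid: sum(v) / len(v) for cid, v in totals.items()}

hub = InsightHub()
# Each adapter normalizes its own scale into the common -1..1 sentiment.
hub.register("survey", lambda r: Observation(r["respondent"], "survey", (r["score"] - 3) / 2))
hub.register("social", lambda r: Observation(r["handle"], "social", r["polarity"]))

hub.ingest("survey", {"respondent": "c1", "score": 5})    # 5-point scale -> +1.0
hub.ingest("social", {"handle": "c1", "polarity": -0.5})
print(hub.sentiment_by_consumer())  # {'c1': 0.25}
```

The design point is that each supplier delivers discrete value through its adapter, while the hub delivers the aggregated, cross-touchpoint view that no single stream can.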
That is what the MR industry needs to be working on, and I think it is a HUGE glaring hole in our collective discourse right now. The focus is on theory, methods, or tech, but not so much on the merger of the three to deliver business value. Many non-MR firms and at least a handful of traditional players are working on this problem: providing not just the data feeds from many sources, but also the framework to analyze them and deliver context focused on client issues. When these firms get it scaled and the kinks worked out, the MR world will change dramatically and many folks will be left scratching their heads wondering where their business went.
The curve has suddenly gotten steeper. People are changing faster. Products are being created, or are evolving, at an astonishing rate. Marketing and market research must simply keep up. Or be replaced with something else that can tolerate the rate of change.