With the explosion of social media, mobile computing, and content in general, businesses are finding themselves behind the curve in analyzing the vast array of content being thrust upon them. It should not be surprising, then, that what businesses are reaching for is a solution that synthesizes technology and insights from computer science, linguistics, and cognitive science. This synthesis has so far been broadly labeled “Social Media Analytics,” although its applications go far beyond the corpus of tweets, status updates, blog posts, comments, and reviews. The real goal of these linguistics-based analytical solutions is to produce the nirvana of both insights professionals and marketers: Intelligent Marketing.
The recent closure of NM Incite, an early entrant and one-time leader in this space with its Buzzmetrics tool, shows how rapidly the sector is changing as clients demand more sophisticated solutions that go beyond simple sentiment and keyword counts. As with any technology investment, early platforms designed with inherent limits in scalability, upgradeable infrastructure, and core programming rapidly lose value when new players enter that can deliver on the promise of “cheaper, faster, better.” Social media analytics may be highly disruptive to traditional market research, but the category itself is now being disrupted by a new breed of platforms that incorporate a more holistic “Big Data” framework.
Just one recent example of this new development in action is the announcement last week of the launch of Pulsar TRAC by FACE:
Born out of 10 years’ experience of research and planning with social data, Pulsar TRAC (TOPIC, REACH, AUDIENCE, CONTENT) is built on a robust intelligence framework that enables marketers to do more than just keyword tracking: measuring the reach of conversations, mapping brand audiences, and tracking content diffusion. Whereas traditional social media monitoring platforms look only at the content of conversations, Pulsar TRAC takes monitoring a step further by also indexing and analyzing the social connections, interests, and digital behaviors of the authors.
“Face’s Pulsar TRAC is invaluable for identifying real-time insight into the way that our audiences are engaging with content and stories,” said Justin Wyatt, Vice President of Primary Research at NBC Universal. “Having a real-time, real-world view of how content spreads and creates connections is vital from an insight perspective. The key difference with Pulsar TRAC is that the platform offers a high-quality social media insight system, supported by analysis that creates meaningful stories from the data. Face also ‘connects the dots’ between these analyses and other qualitative and quantitative learning to offer a holistic view with clear, actionable steps for our business.”
Pulsar TRAC solves many of the issues found in the more than 200 social media tools currently on the market, such as the obsession with basic volume-led metrics, the lack of demographic and behavioral context, no understanding of the audience, poor interfaces, and the inability to weight the impact of conversations.
It’s great to see an established MR firm show leadership by embracing the next generation of holistic social media and big data analytics. They join the ranks of Gongos O2, Motive Quest, KL Communications, Anderson Analytics, Research Now, Decooda and a handful of other pioneers blazing the trail of what the research firm of the future will look like.
Now, full disclosure: I have business relationships with several of these firms, so I mention them because I am intimately familiar with what they are doing and believe strongly in their value propositions. A quick web search will turn up many other new entrants and rapid changes in the marketplace. These shifts suggest probable paths for future development as other convergent technologies mature and create further disruption in the traditional business insights and marketing wheelhouses.
What’s driving this shift? As always, it’s supply and demand.
Brands acknowledge that the accuracy with which a company understands its customers determines that company’s success (or lack thereof). The problem is that there are a lot of customers, they have a lot to say, and they don’t necessarily say it clearly. Indeed, customers themselves may seldom know what they mean, what they want, or what they would do if they got what they wanted. To complicate matters further, customers experience a wide variety of feelings and attitudes toward the products and services they procure, and often are (1) somewhat unaware of the origin or complexity of those feelings, and (2) unable to articulate them in a way that is informative to companies that care about addressing those feelings and improving the customer experience. To date, a variety of “traditional” market research and business intelligence approaches have been the primary means of understanding these variables. From surveys to focus groups, neuromonitoring to facial scanning, and data mining to predictive analytics, brands have had many tools available to understand and structure the process of insight generation.
To demonstrate the relationship between the needs of brands and the facets of social media analytics, let’s look at the advent of “integrated social marketing”. The suppliers of technology that enable this new model (social media analytics firms) typically analyze social media texts (e.g. tweets, Facebook status updates, and blog posts), with the goal of synthesizing huge amounts of data into actionable information for brand marketers. This information includes the analysis of themes, topics, and contexts. Having more information, particularly more accurate information, enables companies to make meaningful strategic and real-time engagement decisions. Specifically, brand marketers use this information to make recommendations, accurately predict marketing outcomes, and intelligently invest in marketing strategies.
For a deeper-dive analysis and series of case studies on how brands are using these technologies to reshape their insights organizations, check out the latest Gen2 Advisors report, “From Online Chatter To Meaningful Insights” (yes, I am a Principal in Gen2, but it’s a great resource I would recommend no matter who wrote it). Gen2 has also produced an excellent free “How To” guide, “Building Your Social Media Insights Program,” that is well worth the time to read. If you want to go ahead and buy the full report, use the code GB10 for a 10% discount; you can thank me later.
Broadly, the operational goals of these companies form two major tasks: (1) gather the information, and (2) make the information meaningful. The first task is computational, and the second involves a marriage of linguistic and cognitive resources. Social media analytics companies analyze data for information that may better direct the actions of their clients. In principle, the more data a provider has access to, the better the information it is likely to provide. However, the law of diminishing returns dictates that large amounts of data are only useful if a viable platform is available to power both the volume of the analysis and the range of sophisticated algorithms that constitute it. In simple terms, the intelligence of the marketing solution depends first and foremost on the power of the platform. Again, the announcement from Face is just one recent example of how the new players merging social media analytics and market research get this.
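As a purely illustrative sketch of those two tasks, the toy Python below “gathers” a hard-coded sample of posts and then makes them meaningful with an invented mini-lexicon. The posts, words, and weights are all fabricated for illustration; real platforms use far richer NLP models at vastly greater scale.

```python
# Toy sketch of the two operational tasks: gather, then make meaningful.
# All data and the lexicon are invented for illustration only.
from collections import Counter

# Task 1: gather the information (here, a hard-coded sample of posts;
# a real platform would stream these from social APIs at scale).
posts = [
    "Love the new phone, the camera is amazing",
    "Battery life is terrible, very disappointed",
    "Screen looks great but the price is awful",
]

# Task 2: make the information meaningful (a toy sentiment lexicon).
LEXICON = {"love": 1, "amazing": 1, "great": 1,
           "terrible": -1, "disappointed": -1, "awful": -1}

def score(post):
    """Return a net sentiment score for one post."""
    words = post.lower().replace(",", "").split()
    return sum(LEXICON.get(w, 0) for w in words)

# Synthesize the raw posts into a summary a marketer could act on.
summary = Counter(("positive" if score(p) > 0 else
                   "negative" if score(p) < 0 else "neutral")
                  for p in posts)
print(dict(summary))  # {'positive': 1, 'negative': 1, 'neutral': 1}
```

The point of the sketch is the shape of the pipeline, not the method: collection is cheap to parallelize, while the meaning-making step is where the linguistic and cognitive sophistication the article describes actually lives.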
Of course, not all social media analytics companies have powerful platforms. Many operate through data sampling procedures and hand-crafted analytics. Such approaches are desperately slow, and whatever is gained by way of a personal touch is lost to exhaustion, boredom, simple human error, and the costs of time and money.
But even if humans could maintain an indefinite level of expertise, they could never hope to operate at sufficient speed or with sufficient endurance to process the oceans of data that constitute the social media market. This market includes Facebook status updates, tweets, blogs, forums, online community panels, and verbatim responses to survey questions, with the collection of data they produce amounting to over two trillion gigabytes annually. The scale here is tremendous, and it has meant that even relatively well-developed early-entrant platforms have had to rely on tools and techniques that are adequate only for offline, sample-sized analysis. Naturally, the world doesn’t happen offline, and brands don’t move forward by hearing yesterday’s news. Businesses need to employ technology that meets the realities of today’s real-world problems.
With such realities in mind, Gartner estimates that by 2014, 30% of analytic applications will use in-memory functions to add scale and computational speed; 30% of analytic applications will use proactive, predictive and forecasting capabilities; and 15% of all behavior intelligence deployments will combine behavior intelligence with collaboration and social software to become decision-making environments. Thus, with the requirement for real-time assessment, the massive volume of data, and the exponential growth of client expectations, companies need to look beyond the status quo to advanced text analytics and high-performance computing techniques. That is the model that will drive the future.
What we envision is the emergence and rapid growth of new entrants that combine platform scale and efficiency with analytical breadth to process vast amounts of data in real time and deliver real insights to clients. These platforms will become centralized data engines, integrated with multiple data streams, serving as one hub where all content can be aggregated and processed. Such advancements allow sophisticated algorithms to run in real time on huge volumes of unstructured data, and will carry marketing insight companies (as well as interested academics) well into the next decade.
Recent technological advances have produced a notable leap in the power and availability of computational textual analysis tools. These are systems or approaches that can be applied to exceptionally large volumes of text (e.g., streaming tweets or self-contained corpora). The outputs may be values (suitable for statistical modeling) or entire narratives (suitable for teams of marketing staff to assess potential impact). What is needed is a textual analysis tool that offers specific lexical output (in addition to quantitative output), providing brand marketers with actionable insight based on an emotional understanding of the individual, not just the words used. The integration of “big data” scalability and sophisticated textual analysis with cognitive and behavioral psychology is the focus of the new class of players entering the marketplace. They will define what this space looks like for years to come and will continue to be a highly disruptive force for traditional research and legacy providers of business intelligence and social media analytics.
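To illustrate the distinction between quantitative and lexical output, here is a hypothetical toy analyzer that returns both a numeric score and the matched emotion words with category labels. The lexicon, categories, and thresholds are assumptions invented for this sketch, not any vendor’s actual method.

```python
# Toy analyzer returning both quantitative and lexical output.
# The emotion lexicon and its categories are invented for illustration.
EMOTION_LEXICON = {
    "thrilled": "joy", "delighted": "joy",
    "furious": "anger", "annoyed": "anger",
    "worried": "fear",
}

def analyze(text):
    """Return a numeric emotion density plus the matched words."""
    tokens = text.lower().split()
    hits = [(w, EMOTION_LEXICON[w]) for w in tokens if w in EMOTION_LEXICON]
    return {
        "score": len(hits) / max(len(tokens), 1),  # quantitative output
        "lexical": hits,                           # words + emotion labels
    }

result = analyze("furious that delivery is late but delighted with support")
print(result["lexical"])  # [('furious', 'anger'), ('delighted', 'joy')]
```

A marketer can feed the score into statistical models, while the lexical hits explain *why* the score is what it is; that pairing is the value of lexical output the paragraph above calls for.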
The good news is these firms are emerging right now and rapidly growing. There are two events coming up that will be a great way to get exposure to them first hand so you can judge for yourself: The Sentiment Analysis Symposium by Seth Grimes and our own Insight Innovation Exchange.
The May 8, 2013 Sentiment Analysis Symposium in New York will be the sixth edition of this premier business-focused conference, the only one focused exclusively on technology and solutions that help you discover business value in the opinions, emotions, and attitudes expressed in social media, news, and enterprise feedback.
The Insight Innovation Exchange, being held June 17-19, is a client-centric event focused on collaboratively exploring the frontiers of the future of insights and identifying new entrants into the broad space. If you want to get a vision of how new technologies like (but not limited to) social media analytics are reshaping the entire insights value chain, IIeX is the “must attend” event of the year.
We are likely to see these text-focused tools extended to other forms of content. Future work will likely consider facial feature analysis, of which YouTube submissions are a prime example. YouTube has become a premier site for individuals to share (or, as some say, “rant”) their opinions on various issues. These videos typically consist of a simple webcam pointed directly at an individual’s face while the person rants about a topic (e.g., products and services). Because they often consist of extreme close-ups of the face, they make a very suitable test-bed for analyzing consumer sentiment via facial feature tracking. Facial feature tracking involves detecting and tracking features such as the eyes, eyebrows, nose, and mouth, and using their spatial alignment to determine an individual’s affective state. This is an important goal to pursue, because analyzing YouTube videos for sentiment toward products and services is likely to be highly enlightening. Again, players in this space are emerging now (examples include Immersive Labs, Videntifier, Eye2D2 & Sticky, who will all be at IIeX), and we’ll soon see these technologies become a major part of the research wheelhouse, driven by the immense amount of video and photo data available for analysis.
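The idea behind facial feature tracking can be sketched, very loosely, as geometry over landmark coordinates. The toy below assumes a hypothetical upstream tracker has already produced normalized (x, y) landmarks per video frame; the landmark names, values, and thresholds are all invented for illustration, and real systems use trained models rather than hand-set rules.

```python
# Illustrative sketch only: map the spatial alignment of facial
# landmarks to a coarse affective label. A real tracker would detect
# these landmarks per frame; here one frame is fabricated by hand.

def affect_from_landmarks(landmarks):
    """landmarks: dict of feature name -> (x, y) in normalized face space."""
    # Simple geometric features derived from landmark alignment.
    mouth_open = landmarks["mouth_bottom"][1] - landmarks["mouth_top"][1]
    brow_raise = landmarks["left_eye"][1] - landmarks["left_brow"][1]
    # Hand-set thresholds, purely for the sketch.
    if mouth_open > 0.15 and brow_raise > 0.10:
        return "surprise"   # open mouth + raised brows
    if mouth_open > 0.15:
        return "joy"        # open-mouth smile, crude proxy
    if brow_raise < 0.05:
        return "anger"      # lowered brows
    return "neutral"

frame = {  # one fabricated frame of tracked landmarks
    "left_brow": (0.30, 0.25), "left_eye": (0.30, 0.40),
    "mouth_top": (0.50, 0.70), "mouth_bottom": (0.50, 0.90),
}
print(affect_from_landmarks(frame))  # surprise
```

The sketch compresses the pipeline the paragraph describes: detect features, measure their spatial relationships, and infer affect, which production systems would do with learned classifiers over thousands of frames.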
Obviously, the areas of research mentioned above do not encapsulate all the opportunities for advancement in the social media analytics space, but they do serve as a reasonable point of departure for future research aimed at understanding customer sentiment as conveyed in online social media. Other constructs to watch include impulsivity (the lasting traits or shifting states that influence customers’ impulsive buying behavior), brand loyalty (the affective experiences that cause customers to be loyal to a particular product or brand), and rewards (the incentives to which customers respond most favorably). Recognizing the importance of cognitive-affect assessment will be central to the next generation of analytical tools and holds great promise for the evolving role of these technologies in insights generation.
The future will be very interesting indeed.