Editor’s Intro: Michael’s post is eye-opening. Many have written about the potential benefits of combining and analyzing large data sets, but realizing this potential has been difficult because of the technical challenges involved. The software Michael reviews is an early entrant in the race to overcome these challenges, and there is no doubt that we will see other entrants soon. This all seems very exciting, although as Michael notes in his final paragraphs, there will likely be some unfortunate side effects. Still, I would argue that they will be less severe than he fears. New capabilities have a way of creating new opportunities.
I recently had the privilege of reviewing a technology called Sherlock. Sherlock is a Decision Support Platform from Penser Analytics, a company based in Bangalore, India. A major part of Sherlock is an Artificial Intelligence (AI) engine. Here I would like to review some of the cool things Sherlock can do, not just because it is pushing back the frontiers of advanced analytics, but also because the technology it represents has the potential to reshape how we do things and how many people we will need to do them. It raises some very important questions that we will all inevitably need to address.
Two Modules for AI
Back in my early career in the marketing research department at the Kellogg Company, Decision-Support Systems (DSS) were a cool and sexy subject. But basically, all those mainframe systems could do was generate reports and tables of Nielsen or SAMI sales data; any analytics required special code to export the data to SAS or SPSS. Today, however, platforms like Sherlock can integrate data from many different sources and platforms, and the analytic tools reside inside these systems. In Sherlock, an AI engine is built into both the data acquisition and analytics phases of the platform.
1. Data Acquisition Module
The first module of AI in Sherlock focuses on the somewhat mundane “data acquisition” phase. As illustrated in Exhibit 1, the AI engine in Sherlock automates the acquisition of many forms and formats of data, and it does so seamlessly.
The ability to standardize and harmonize disparate data forms “automatically” might not seem all that spectacular, but for the folks who build and manage large data warehouses of Big Data, it is a very big deal. Sherlock is designed to ingest and harmonize data from any source and has the built-in intelligence to perform these difficult tasks.
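Sherlock’s internals are proprietary, so as a rough illustration of what “standardize and harmonize” involves, here is a minimal sketch of one common approach: mapping each source’s column names onto a single canonical schema before merging. All field names, aliases, and records below are hypothetical, not Penser’s.

```python
# Minimal sketch of schema harmonization: rename each source's
# columns to one canonical schema, dropping fields we don't know.
# All names and data here are illustrative assumptions.

CANONICAL = {"store_id", "sku", "week", "units_sold"}

# Per-source column aliases (hypothetical).
ALIASES = {
    "outlet": "store_id", "shop_code": "store_id",
    "item": "sku", "product_code": "sku",
    "wk": "week", "period": "week",
    "qty": "units_sold", "sales_units": "units_sold",
}

def harmonize(record: dict) -> dict:
    """Rename known aliases to canonical fields; drop unknown columns."""
    out = {}
    for key, value in record.items():
        canon = ALIASES.get(key, key)
        if canon in CANONICAL:
            out[canon] = value
    return out

# Two sources reporting the same facts under different column names.
rows = [
    {"outlet": "S001", "item": "SKU-9", "wk": 12, "qty": 40},
    {"shop_code": "S002", "product_code": "SKU-9", "period": 12, "sales_units": 55},
]
harmonized = [harmonize(r) for r in rows]
```

In practice the hard part is building and maintaining that alias map across thousands of feeds, which is presumably where Sherlock’s “built-in intelligence” earns its keep.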
To illustrate, consider one Sherlock client in India. Every quarter, a team of a dozen analysts had to compile disparate sales and inventory data across thousands of retail stores. These data came from very different platforms and, frankly, it was a gigantic mess. It took the dozen analysts 13 weeks to compile all of these records into a complete and integrated portfolio of reports; by the time they could report on the prior quarter, they were already into the next one. Management obviously found this unacceptable. Challenged with the same chore, Sherlock completed the entire task in less than 45 minutes. In this instance, Sherlock saved over 6,000 hours of work and provided a net savings of almost 125,000 US dollars.
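The reported figures reconcile as a back-of-envelope calculation, if we assume a 40-hour work week and a blended labor rate of about US$20 per hour (both are my assumptions, not numbers from Penser):

```python
# Back-of-envelope check of the reported savings.
# The 40-hour week and US$20/hour blended rate are assumptions.
analysts = 12
weeks = 13
hours_per_week = 40          # assumed
blended_rate_usd = 20        # assumed

hours_saved = analysts * weeks * hours_per_week   # 6,240 hours
cost_saved = hours_saved * blended_rate_usd       # 124,800 US dollars
print(hours_saved, cost_saved)                    # prints: 6240 124800
```

That lands almost exactly on the “over 6,000 hours” and “almost 125,000 US dollars” cited above.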
2. Automated Analytics Module
The second module of AI is perhaps a little sexier: Sherlock’s “automated analytics applications”. There are more than a dozen of these, ranging from customer segmentation to price elasticity. Here, I would like to share a few.
Automated Campaign Sales Lift Analysis
Exhibit 2 shows a time-plot of the incremental sales lift from a marketing campaign. Sherlock has an automated marketing-mix modeling capability for these types of analyses. The automated models filter out and control for the other sales drivers, isolating the lift attributable to the specified campaign, as shown in this chart.
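Sherlock’s actual marketing-mix model is not documented publicly, but the core idea of “controlling for other drivers” can be sketched with a plain linear regression: model weekly sales as a baseline plus trend plus a campaign on/off dummy, and read the lift off the campaign coefficient. The data below are synthetic and the model is a deliberately minimal stand-in.

```python
# Minimal sketch of isolating campaign sales lift with a linear model.
# Synthetic data: baseline + trend + a known lift during an 8-week flight.
import numpy as np

rng = np.random.default_rng(0)
weeks = np.arange(52)
campaign = ((weeks >= 20) & (weeks < 28)).astype(float)   # 8-week flight
true_lift = 120.0
sales = 1000 + 2.0 * weeks + true_lift * campaign + rng.normal(0, 10, 52)

# Design matrix: intercept, trend, campaign dummy.
X = np.column_stack([
    np.ones_like(weeks, dtype=float),
    weeks.astype(float),
    campaign,
])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
estimated_lift = coef[2]   # should recover roughly the true 120-unit lift
```

A production marketing-mix model would add seasonality, pricing, distribution, competitive activity, and adstocked media terms, but the mechanism — attributing lift via a controlled regression — is the same.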
But Sherlock’s campaign analysis is also capable of evaluating online or e-commerce campaigns using its own digital attribution model. As shown in Exhibit 3, we can trace the flow of web traffic toward orders or checkouts of various products and campaigns, as affected by various online marketing touch-points and display ads.
This chart illustrates how online traffic flows through the website and online display ads, leading either to a custom order for pick-up at the store or an online checkout for direct mailing. In the end, the attributions and quantifications of the drivers of growth are summarized in the waterfall chart below.
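Sherlock’s attribution model is proprietary, so as context for what “attributing” a conversion to touch-points means, here is a sketch of two standard textbook baselines — last-touch and linear attribution — applied to hypothetical clickstream paths. Neither rule is claimed to be Penser’s method.

```python
# Two baseline digital-attribution rules applied to hypothetical
# touch-point paths, each of which ended in one converted order.
from collections import Counter

paths = [
    ["display_ad", "search", "email"],
    ["search", "email"],
    ["display_ad", "email"],
]

def last_touch(paths):
    """Credit the full conversion to the final touch-point."""
    credit = Counter()
    for p in paths:
        credit[p[-1]] += 1.0
    return credit

def linear(paths):
    """Split each conversion's credit equally across its touch-points."""
    credit = Counter()
    for p in paths:
        for t in p:
            credit[t] += 1.0 / len(p)
    return credit
```

On these three paths, last-touch gives email all three conversions, while linear attribution spreads the same three conversions across all the touch-points that appeared along the way; more sophisticated models (position-based, Markov, Shapley) sit on the same foundation.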
Automated Drivers Analysis of Panel Data
Another analytics module is perhaps more relevant to marketing insights folks. Exhibit 4 shows the results of an automated model of respondent-level consumer panel data; it also illustrates some very creative ways of displaying information and insights. In this analysis, for a single consumer brand of men’s shirts, Sherlock mined the respondent data along with various other data on advertising exposures. It determined and ranked the relative impact and influence of the various drivers in terms of their probabilities of predicting customer repeat purchase. The circular spiral graph on the right plots each repeat-purchase driver as a colored ball; the distance along the spiral represents the probability and relative influence of each driver. As the drivers spiral toward the center, they represent progressively greater influence or probability. All of these drivers were identified by Sherlock’s own automated, AI-driven analytics routines.
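Again, the actual model behind Exhibit 4 is Sherlock’s own. As a minimal illustration of driver ranking on respondent-level panel data, the sketch below compares repeat-purchase rates between panelists exposed and not exposed to each binary driver, and ranks drivers by that lift. The driver names and the tiny panel are invented for the example.

```python
# Minimal sketch of ranking repeat-purchase drivers in panel data:
# for each binary driver, compare repeat rates between exposed and
# unexposed respondents. Data and driver names are hypothetical.
def driver_lift(rows, driver):
    """P(repeat | driver present) - P(repeat | driver absent)."""
    exposed = [r["repeat"] for r in rows if r[driver]]
    unexposed = [r["repeat"] for r in rows if not r[driver]]
    rate = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return rate(exposed) - rate(unexposed)

rows = [
    {"saw_tv_ad": 1, "price_promo": 0, "repeat": 1},
    {"saw_tv_ad": 1, "price_promo": 1, "repeat": 1},
    {"saw_tv_ad": 0, "price_promo": 1, "repeat": 0},
    {"saw_tv_ad": 0, "price_promo": 0, "repeat": 0},
    {"saw_tv_ad": 1, "price_promo": 0, "repeat": 1},
    {"saw_tv_ad": 0, "price_promo": 1, "repeat": 1},
]

drivers = ["saw_tv_ad", "price_promo"]
ranked = sorted(drivers, key=lambda d: driver_lift(rows, d), reverse=True)
```

A real system would fit a proper classification model (e.g., logistic regression) over hundreds of drivers and thousands of respondents, but the output is the same kind of ranked-influence list that the spiral chart visualizes.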
AI and the Future of Marketing and Analytics
The technology I have reviewed here is probably among the first of a flood of tools that do similar things: they will provide seamless access to all forms of data and will automate “reasonably complex” analytic tasks. Like many other technological phenomena, this is likely to happen whether we like it or not, and we had better make the best of it. It is clear that tools like this can increase the market penetration of analytics, and that is a good thing. It is also clear, however, that such AI tools have the capability to displace people and jobs. For the present, there is very strong demand for analytics talent; the shortage is large and some open positions are going unfilled. Tools like Sherlock can help fill those gaps, but let’s not be fooled: jobs are likely to be lost. So we should be good scouts and “always be prepared”. That means educating ourselves on AI and machine learning and trying to manage the roles they play, so that they are truly a value-added capability and not just a cost-saver for reducing head counts. The focus needs to be on “systemizing the predictable while humanizing the exceptional”. Above all, the best advice is: even when it comes to your own job and profession, never take anything for granted.