
Information is free. Insight is expensive. Action is priceless.

How can market research position itself in the age of technological disintermediation? Here are a few thoughts by Jason Anderson.

Toward the end of my presentation at MRMW, there were three slides that seemed to either resonate or ruffle feathers:

The three hypotheses:

  • Information should be free. The marginal cost of collecting information is approaching zero. For a technologist, information is simply data. Data has storage costs and processing costs, but is not thought about in terms of acquisition costs. Nothing annoys an engineer more than suggesting that data should be purchased; we may pay for data for a time, if there are no other immediate solutions, but this becomes an engineering challenge: what do I need to build in order to collect that data on my own and convert this marginal cost into fixed cost?
  • Insight is expensive. Conversely, turning data into something meaningful and insightful is valuable. That’s because that is the core of what engineers and technologists do: they find scalable and innovative ways to transform raw data into something brilliant.
  • Action is priceless. Better yet is creating insights that lead directly to specific actions. This is of obvious value to the technologist, because it creates new data that can be used to build more advanced, real-time, responsive systems.

Shifting most of the perceived value into the “insight” and “action” categories doesn’t necessarily mean that that’s how the business model is written. For example, Google gives away enormous amounts of data, insights, and actionable services in exchange for equally enormous amounts of ad revenue. In games, “freemium” or “free to play” is another way of saying “give away the basic data, but charge for the good stuff.”

In a market research context:

  • “Free data” = “do it yourself data.” Using Zoomerang or SurveyGizmo or Google Consumer Surveys may not be literally $0, but compared to the thousands of dollars previously spent on such projects it’s an insignificant difference.
  • “Expensive insights” = “interpreted data.” The biggest issue with DIY research, of course, is that someone still needs to analyze the data and draw conclusions as to what it really means. Dropping survey data into a series of PowerPoint charts for each question isn’t insight, though — that’s just converting the data from one format to a different format. Distilling the data into a cohesive story uncovers insights.
  • “Priceless action” = “consultative engagement.” When the vendor can deliver specific recommendations for the business, that is priceless. When the discussion at the end of a project engagement focuses on structuring the business problem and the best possible solutions (instead of “here is the data you asked for”), that is high value.


5 responses to “Information is free. Insight is expensive. Action is priceless.”

  1. This blog post is wonderful and elegantly explains what I have been telling people for years.

    I advise customers about how they can increase their services revenue, and my tag line is “Actionable Insight to Grow Your Services Business”.


  2. I hadn’t seen this when I commented on Edward Appleton’s post ‘What is this thing called insight?’

    What Jason says is a more extreme and definitely more articulate version of what I was trying to say.

    I seek to uncover insights by studying data that exists already and/or by creating new data. For me the real value add comes from collating and interpreting this data in ways that my clients/colleagues couldn’t do for themselves and by identifying the best ways they can use it to achieve their business objectives. Insight doesn’t have to be big or game changing but it does have to win hearts and minds and inspire action within client organisations. This is easy to say but hard to do, which is why much great research just sits on the metaphorical shelf.

  3. Jason succinctly encapsulates what the role of the market analyst is in the 21st century.

    Businesses base their strategy on assumptions and biases about products, customers, and markets. These long-held beliefs prevent businesses from objectively viewing current market information.

    Some of my clients have expected a certain outcome from a market research project (“Customers will love our new product”) and thus failed to perceive what the data really said (“Customers think the new product won’t work for them”).

    A professional market analyst has the objectivity and experience to be able to ask the right questions of the right people, interpret the data, derive insights from the data, and then suggest action plans that are based on the data.

  4. Like the curate’s egg, Jason’s views, as outlined above, are good in parts but bad in others. The cost of SOME data collection is approaching zero; other data remains expensive. A survey of people on your mailing list can be valid and cheap; a survey of people with no credentials is cheap but of little value. However, conducting in-home ethnography with trained researchers is going to be expensive.

    In general, processes can be cheap but people are expensive. So, if a project requires skilled people to create it or to run it, or requires a lot of respondents’ time, it will be expensive. In most single-country consumer research projects, the fieldwork (Jason’s data) is already one of the cheapest components; online panel respondents are paid very little. There may be scope to reduce this further, but it won’t change the cost of running projects much.

    DIY is a growing phenomenon and one to be welcomed, but it has its limits. I cut my own toenails and fingernails, but I pay somebody to cut my hair – I could cut my own hair and save money, but the quality of the results would not meet my minimum criteria. Survey design systems, and in particular libraries and wizards, will improve, but there is a real skill in designing a study to meet business objectives. The more complex the need, the greater the skill. A simple customer satisfaction study should be capable of being designed in-house; a complex Discrete Choice Model is going to require a marketing scientist to design and to interpret.

    Jason seems to think that analysis starts once the data has been collected, but the analysis starts when designing the study; with the wrong data, the right analysis cannot be conducted.

    And finally, actionable insights are not priceless: first, they should have a very clearly calculated ROI; second, they are only useful if they are actually implemented – which brings us back to a point Edward has raised before, about whether the researcher should go further than spelling out the recommendations and try to persuade the organisation to implement them.
