
Innovation or Sales Pitch?

Silly or misleading claims and outright falsehoods can discourage adoption of useful new tools, and overhype risks a backlash. Nevertheless, something slickly marketed or hyped to an irritating degree may in fact work well and be worth its price tag.


By Kevin Gray

Openness to change is not blind acceptance of claims.

“What you have is old.  I have something new and better.”  This may well be true but it is also a tried-and-true sales pitch.  Talking about “new” is O-L-D.  Though I strongly feel we should keep our eyes, ears and minds open for things that will help us live happier, fuller and more productive lives, there is no need to believe everything we read or hear.  If it seems too good to be true, most likely it is.

The fact is things can be old and good, old and bad, new and good or new and bad.  What’s more, something positioned as innovative might actually be an old idea that has been repackaged and recycled.  (If you point this out, you may be greeted with a retort offering quite trivial modifications in support of the argument that this time it really is new.)

Some innovations don’t diffuse very far simply because they aren’t very good ideas.  On the other hand, new ideas can fail, not because they’re bad ideas, but because they are difficult to comprehend or put into practice.  New products or processes may address real and important needs but may be too complicated for the intended user.  Others fail because they have been poorly marketed.

As a marketing science and analytics person I am bombarded by sales pitches of various sorts, frequently pertaining to “new” or “innovative” methodologies or software.  I’d like to share a few thoughts about how to separate the wheat from the baloney, and I think they will apply generally, not just to my areas of specialization.

One tipoff that a claim is suspect is when the status quo is criticized…and the party leveling the criticism has gotten the status quo wrong.  How can you think outside the box if you don’t know what’s in it?  We shouldn’t take for granted that another person knows our job better than we do.  Generalizing from the exception is another tactic to watch out for, and sometimes very poor practice is presented as standard practice.  Clearly, many things will be an improvement over incompetence.  Consumer surveys, some of which are very badly designed and executed, are a case in point.   In general, though, they do work and are still essential even though they are “old.”

Be on the lookout for the words “advanced” and “world class.”  Like “innovative” and “revolutionary” they are hackneyed and have lost much of their meaning.  One example is “advanced analytics” software that, in reality, only offers cross tabulations and graphics, and another is software that mainly consists of standard routines wrapped in a flashy package.  They may be solid products, but no better than what you already have on hand.  Don’t allow yourself to be dazzled; this of course applies generally, not only to software.

“We are the only ones who can…” should get your guard up as much as “99% accurate.”  One is tempted to wonder if the reason no one else does it is because it doesn’t work.  Ostensible benefits of a new technology often are really camouflaged claims about its hypothetical potential, not what in fact it has been proven to deliver.  “Validated” is another word to be wary of.  How is valid defined?  Who did the validation and what process did they follow?  Has the validation been replicated?  More to the point, can the new product or system really do what the folks pitching it claim it can do?  I recall a rather caustic but telling comment made by another marketing science person: “This algorithm is fantastic!  Can it also forecast how many suckers will be born in the next hour?”
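
A toy numeric sketch (all figures hypothetical) of why a bare “99% accurate” claim proves little on its own: if the outcome being predicted is rare, a “model” that never predicts it at all can post the same impressive accuracy figure.

```python
# Toy illustration; every number here is hypothetical. If only 1% of
# prospects actually buy, a "model" that predicts "no sale" for every
# single prospect is 99% accurate, and 100% useless.

def accuracy(predictions, actuals):
    """Fraction of cases where the prediction matched the outcome."""
    correct = sum(p == a for p, a in zip(predictions, actuals))
    return correct / len(actuals)

# 1,000 prospects; only 10 of them (1%) actually buy.
actuals = [1] * 10 + [0] * 990

# The lazy baseline: predict "no sale" for everyone.
baseline = [0] * 1000

print(f"Baseline accuracy: {accuracy(baseline, actuals):.1%}")  # prints 99.0%
```

The right questions are the ones above: accurate compared to what baseline, measured how, and validated by whom.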

Dubious claims sometimes conceal themselves behind academic or scientific authority.  While endorsements from true experts are impressive, should any substantial investment be required please take it upon yourself to find out who the real experts really are and what they are really saying.  Also, don’t be taken in by impressive paper credentials; ethics are not correlated with mathematical prowess or programming skills.

Some pitches attempt to bewilder us with complexity, perhaps in the hope we won’t look too closely or ask too many questions.  What is being pushed apparently is not only new but so complex and sophisticated that the ordinary Joe will never be able to get his head around it.  (Therefore, if he buys it he joins an elite cadre!)  Those adopting this sales strategy typically lean heavily on jargon and tend to dodge specifics.  Don’t allow yourself to be intimidated; try to pinpoint concretely what this new product, service or process is supposed to be able to do and whether there is genuine evidence it can deliver on these promises.  Or just ignore it.  A classic Monty Python skit wonderfully parodies the chattering classes’ tendency to over-intellectualize.

Popular business media are another excellent source of nonsense. “Companies that do XYZ are more profitable than companies that don’t do XYZ” is not evidence that XYZ works.  It is merely a sentence written in English.  A few obvious questions should come to mind.  Specifically, how is XYZ defined?  How is “more profitable” defined?  How did the two groups of companies differ before it was adopted?  What about performance over time? Average profitability for companies doing XYZ might actually have decreased since they adopted it!
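
The selection problem in that claim can be made concrete with a toy simulation (all numbers hypothetical): if already-profitable firms are the ones that adopt XYZ, a naive comparison of adopters to non-adopters will flatter XYZ even when it does nothing at all.

```python
# Toy simulation; all numbers are hypothetical. Firms that are already
# profitable adopt XYZ, and XYZ itself has zero effect on profit. The
# naive adopter vs. non-adopter comparison still favors XYZ.
import random

random.seed(42)

# 1,000 firms with baseline profit margins centered on 10%.
margins = [random.gauss(10.0, 3.0) for _ in range(1000)]

# Selection effect: only firms already above a 12% margin adopt XYZ.
adopters = [m for m in margins if m > 12.0]
non_adopters = [m for m in margins if m <= 12.0]

def mean(xs):
    return sum(xs) / len(xs)

# XYZ changed nothing, yet adopters look far more profitable.
print(f"Adopters:     {mean(adopters):.1f}% average margin")
print(f"Non-adopters: {mean(non_adopters):.1f}% average margin")
```

Comparing each group with its own performance before adoption, rather than adopters against non-adopters, is exactly the kind of question the paragraph suggests asking.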

Some claims are self-repudiating almost to the point of farce, for instance, eloquently-written pieces asserting that humans cannot express themselves well verbally.  We may be told in-depth interviews don’t really work but, inexplicably, text-mining Twitter with their software does.  It seems we’ve been deceiving ourselves all these years.  Some pitches for biometrics make similar sorts of contentions, neglecting that their development may have required exhaustive interviews with test subjects.

Along similar lines, that humans are not perfectly rational is not news, nor was it when Sigmund Freud was a lad.  My reason for bringing this up is that every few years it seems we are informed, once again, that it has been discovered that humans do not always shop very scientifically and therefore that conventional marketing thinking and practice are wrong.  I suspect, however, that tail fins were not installed on automobiles in the 1950’s for purposes of aerodynamics.

Being open-minded means being open-minded, and our decision ultimately should boil down to: “Will I get what I’m expecting, and will our investment, including my time and my staff’s time, pay off under the constraints we work within?”

Look closely and ask hard questions.


14 responses to “Innovation or Sales Pitch?”

  1. Hi Kevin:

    You certainly have covered a lot of ground in this post. If there were any doubts about the ineffectiveness of mass marketing, they should be put to rest with this article. There have always been, and always will be, charlatans, but more often there are people who truly believe that their product is a solution, and for some audiences it is; otherwise it fails. The common denominator between the two is the zealous “over the top” pitch.

    That’s where listening and details become important. It’s easy to pick apart a pitch and make all of the assumptions that you outlined above; it’s harder to listen and find the diamond in the rough. Anybody with industry or product knowledge can identify a winner, but those types of solutions are almost always broad and, while profitable, they usually aren’t revolutionary beyond the tactical.

    In most cases the effectiveness of a product or service lies in the user’s ability to understand and maximize the value of the product. Therefore, the ability to connect with the buyer and explain how profit or time savings, etc. can be achieved with a product is critical. If the first 30 seconds of the pitch isn’t effective, then the opportunity is lost for both parties. That means the pitch has to be on target and resonate with the recipient. Often, paper pitches have several messages and the net result is to turn the potential buyer off. In cases where the pitch person is given the opportunity to research the client, he or she often knows their financials (as published) and their pain points and is much more effective.

    For small companies and start-ups, especially those rich with technical features, the owner is often the pitch person. They have such an investment in the product that they often think if it is built, it will sell. That results in low exposure and a lot of pressure on the people who are in talks with them. That’s why the Steve Jobs and Elon Musks of the world are so valuable. They are visionaries and business people. They can demonstrate what the product can or will do in terms of the application, and specifically to each buyer.

    The bottom line is that it’s always about what a product or service can do for you. Whether you are the seller or the buyer you have to see the benefit. All the rest is just finding ways to maximize opportunity or competitive advantage, on both sides. It really is that simple. That’s why the KISS principle endures.

  2. Not sure if this will be of comfort to marketing researchers (or anyone, for that matter), but FYI I shared this post with a contact in the scientific community (physics, mathematics, computer science) and the reaction was “Ah – the history of secrets of snake oil. I agree with every word – the same tendencies creep into 99% of the science proposals I see.”

  3. This is excellent Kevin. Science grows slowly and laboriously. Most progress is incremental. Validation needs substantial replication and takes time. Only occasionally do we have discontinuous progress. Fifty years after the introduction of conjoint, we are still refining and extending it. So discrediting what is “in the box” without understanding what is in it is a sign of low level literacy. It is amusing to me when someone (who has been in business for perhaps five years) implies that I got it all wrong when I have devoted over three decades of life in understanding methodological issues of research. I do understand that, when you work for a company, some sales pitch goes with it. But surely it can be moderated with some intelligence. I do see a lot of “innovative products” being offered without any serious validation accompanying it, what I call “models without facts”. Sometimes the mathematics behind them is dubious as well. Your piece is eloquent. Thank you for writing it.

  4. @Ellen – I think you’ve missed Kevin’s point, which is surprising given how well it was presented. A belief that “it’s always about what a product or service can do for you” is totally not the way to vet a new research product. Anyone can tell you that they’ve built the greatest tool since sliced bread and it will solve your problem(s) better than any other tool. The question Kevin is raising, and I’ve raised in my blog here recently, is whether this is true. When we developed products in the 1960s through the 1990s, the first question clients asked us was, “so how do we know it works”? Satisfying that, the conversation moved to, “how do we know it works better”. Whether you, the buyer, find it useful is not a valid measure of a tool’s validity or its reliability.

  5. I’ve written about these topics in bits and pieces here and there, but it was Steve’s excellent blog in this space a few weeks ago that motivated me to polish up a rough draft I’d back-burnered. Apologies to anyone if my main points were unclear, however. There are a lot of snake oil peddlers out there, as well as people who don’t really know themselves what they’re selling. Either way, we need to be very careful about what we are buying, and I hope some of the tips I’ve offered will be useful in this regard.

  6. @Steve – I think we have a semantics issue here. You can’t discuss how well something works unless somebody is interested in it. The initial pitch has to be a value proposition and then you can talk about measurement. Further if you simply try to measure quality and improvement value, it would be difficult to justify anything new or untested in the market.

    Having said that, I believe that Kevin’s point was basically not to judge a book by its cover.

    Value isn’t just one thing, and while something may be of little value to one buyer it is a great value to another. Simply measuring like we did in the 60s or 90s, when everything was measured with a mass market mentality, isn’t what buyers do today. They are savvy and they understand technology and how products improve their existing platforms and methodologies. It’s an ecosystem, and the buyer is looking for stronger, quicker and more efficient pathways to deeper analytic assessments. No research manager I know simply accepts claims at face value – but the real difference is that expansion of data points is no longer the goal and is sometimes not advantageous; rather it’s the ability the product or service brings that enables or creates better focus within existing data points. I, for one, have a lot of faith in buyers’ intelligence and the belief that most sellers are interested in providing a benefit.

    Snake oil providers target the least intelligent and they don’t like to think very deeply. That’s the value of education, it separates the thinkers from the bottom feeders. They may not always express it well, but most in this business are in it with good intentions and those who listen and adjust get to stay – on both sides.

  7. @Ellen – my point, and part of Kevin’s point, is that indeed there are a lot of people selling crap research out there. And of course you can discuss how well something works without someone being interested – to thrive or survive, you may have to convince them why they should be interested. You also way overestimate the sophistication of research buyers; they’re not dumb, but they often do not have the background to understand what’s really there below the glitter. Neuromarketing has its fair share of charlatans who are selling garbage, and when they start talking fMRIs and neocortex activation, they sound like they know what they are talking about, and it’s the rare client who has a degree in neuroscience or physiological psychology. How many of your clients can tell you what the different rotation methods in a factor analysis mean, or how utilities are computed in a conjoint experiment?

    Snake oil salespeople target everyone in the industry – go to the conferences and watch them in action – they are not discriminating. Sure, some good people are misguided in what they are selling and don’t mean to sell something bad. Their good intentions don’t make bad research any better.

  8. Wow – I think you pretty much covered the entire industry with that comment. If I read your comment correctly, good companies should just keep talking until someone listens, while those listening may not really understand a lot of what they are listening to, and the messages are not really accurate anyway, and the value of the techniques would be questionable even if those selling them understood them, and those listening probably could not properly evaluate or execute them anyway? Is there anything good left for us to hang our hat on? Hint: It’s not statistics based on stated response.

    That makes a pretty good case for CDOs and big data to bring the value (or at least the accuracy), doesn’t it? I think the real problem is not the knowledge gap (present in sellers and buyers), but rather that stat measurements don’t deliver well on small discrete samples, especially when they aren’t well represented.

    As for neuro testing, I am a biological science major; worked in healthcare for several years, and spent a lot of time around neurology labs and MRIs when my 4-month-old started having non-febrile seizures. Medicine has always diagnosed based on what something is NOT more so than what something is – so it’s not surprising that neuro testing is a tough call when it comes to consistent findings. The only medical discipline I can think of where definitive diagnoses are made is radiology. But even then they don’t treat the whole human or know how best to manage the condition, and most treatments don’t come with a guarantee – ah, that knowledge gap appears again.

    The encouraging thing about many newer marketing techniques is that they aren’t trying to boil the ocean but rather are supplemental to the bigger picture. Things like crowdsourcing, GPS mining, text analytics and many others don’t proclaim to be anything more than supplemental but using those techniques to avoid stated response is creating a new paradigm and like it or not it is the future. While there will be failures and fallacies, there will also be successes and new windows of opportunity. One great example is Jane McGonigal’s gamification work in the rehabilitation and stroke therapy arenas. While not strictly about research, her techniques are now being applied in a number of areas to increase participation and compliance. Lots to be learned there.

    For that matter, just look at how far the Internet has come in the last 20 years and think about those who have come and gone and failed, only to watch others build on their ideas and succeed. Remember how exciting “you’ve got mail” used to be? Smart people and good ideas find a way to succeed, even if it’s by putting one foot in front of the other rather than running a marathon. Smoke and mirrors, on the other hand, are taken by the wind and scattered into dust. The cream does indeed rise to the top; sometimes it just takes a while.

  9. Hi Kevin.

    I really do think this is spot on.

    I find that most of the battle is no longer resistance to analytics because the topic is too challenging for the economic buyer to grasp in the short amount of time they can dedicate to you; I think there has been a real awakening on that front. The uphill battle is often the all-too-quick response of “we tried that with another vendor and it doesn’t work”. The problem is that the last supplier clearly over-promised on a solution that was half-baked at best and clearly underwhelmed the client in delivery.

    Let’s face it. There is a lot of this out there, as others here repeatedly point out. It turns the conversation on its side when the truth of the matter is that you truly have a fully-baked solution and you know it can help address their issues, because very few problems are entirely new. We often have three or four examples of tackling the same exact issue (or something so close that differences are minor) over the years, with different names and different faces. Instead of making those connections, we are constantly trying to overcome the barriers that the previous vendor left for the entire marketing analytics community.

    Regardless, I really think this is great and showed a lot of courage to put together. Thanks for this. I am already quoting parts. : )



  10. Last year, as a lark, I posted the following on a LinkedIn analytics discussion group:

    “…I have written software that: 1) automatically selects the data you will need for any problem you will ever encounter; 2) automatically cleans the data for you; and 3) runs hundreds of thousands of models and automatically selects the one your boss will like best. (:-) ”

    Some people evidently thought the comment was serious, and a few even contacted me for more information. This is not a drill. I repeat, this is not a drill.

  11. Thank you for your kind words and thoughtful and extremely relevant observations. I can quote you as an authority on this subject! Many of the people I encounter in marketing and marketing research fall into one or the other of two clusters: 1) those who don’t believe anything; and 2) those who’ll believe anything. The latter seems to be growing, as is, fortunately, a third group, who are what I might call the intelligent skeptics segment. Some of these, though, have been burned, as in your examples. We need to be concerned that the bad will drive out the good, and some of my contacts in the Data Science community also are worried.
