By Rich Raquet
The recent New MR Virtual Festival on presenting data had a number of useful and interesting presentations. Mike Sherman’s presentation, “Less is More: Getting Value (Not Just Reams of Data) From Your Research,” led to an exchange that I think highlights the change in thinking that Market Research must make.
Mike reiterated the point that many have been making…we need to focus our reporting on the key things we learned and not waste executives’ time with a lot of superfluous information. In addition, the report should not just summarize the data, but rather it should synthesize it. He gave an example of a data set with these facts:
- Jim broke his knee
- A burglar broke Jim’s car window
- Jim got a speeding ticket
A summary of these data might be “Jim’s knee and car window were damaged and he got a speeding ticket”.
A synthesis of that data would be “Jim has been living dangerously”.
A researcher friend told me that such a synthesis was interesting enough, but it might not be accurate. Jim might have had his car vandalized in a good neighborhood, slipped on the broken glass, which resulted in his broken knee, and then gotten a ticket as he rushed to the hospital. If this were true, Jim isn’t living dangerously; he just had a run of bad luck. Given the potential for other explanations, researchers traditionally limit themselves to that which can be conclusively proven.
It got me thinking about an article I read about batteries. Batteries work on a simple chemical principle that has been known for over 100 years. What I didn’t know is that the chemical principles on which batteries are based don’t fully explain the amount of power a battery delivers. Despite this, batteries keep getting better (anyone who has carried a cell phone for 10 or more years can attest to that). How is this possible?
Well, developers improve on things they know and speculate on the rest. Kind of like the speculation about Jim living dangerously. I’m sure that sometimes the designers create batteries that don’t improve on what we already have, but because they are making informed choices, more often than not they do improve. If instead they acted like traditional researchers, we’d still be carrying cell phones with the weight of bricks. I think we do our clients a great service every time we use the knowledge we have to hypothesize answers for them.
Of course, it would be better if we could definitively prove any hypothesis we make. In the case of batteries, a physicist in Finland sought to do just that. For him, it wasn’t good enough to simply build better batteries…he needed to fully understand why they were better. In looking at the problem, he considered that another 100-year-old idea might explain it…namely the most famous formula of all time…E=mc². His theory, and so far experiments seem to back it up, is that the mass (the “m” in the famous formula) of various materials explains the difference in energy (the “E”)…in some respects it should have been obvious. This knowledge might lead to stunning breakthroughs in battery technology…Einstein strikes again!
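To give a sense of the scales E=mc² connects, here is a quick back-of-envelope sketch. The battery figure is my own assumption (a typical phone battery of roughly 10 Wh), not a number from the article; the point is simply how tiny the mass equivalent of a battery’s stored energy is.

```python
# Back-of-envelope illustration of E = m * c^2 at battery scale.
# The battery figure below is my own assumption, not from the article:
# a typical phone battery stores roughly 10 Wh (~36,000 J).

C = 299_792_458            # speed of light in m/s (exact, by definition)
battery_energy_j = 36_000  # assumed ~10 Wh phone battery, in joules

# Rearranging E = m * c^2 gives the mass equivalent of that stored energy.
delta_m_kg = battery_energy_j / C**2

print(f"Mass equivalent: {delta_m_kg:.2e} kg")  # on the order of 1e-13 kg
```

Measuring a difference that small is what makes experimentally testing such a theory so hard…and building better batteries without that proof so practical.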
As researchers, we too are not limited by what we are told or what has been done before. We can often find more quantitative data (other survey questions, client-owned, syndicated, census, web sources) and look to qualitative sources (experts, client knowledge, verbatims) to either back up our synthesis or save us from making the wrong assumption. On top of that, we can apply analytics (old, new and newly applied techniques) to discern things that a simple review of data won’t show. In the simple example we might look to police records (what is his overall driving record, what are the crime stats where his car was broken into), medical records (assuming we don’t violate HIPAA of course) or ask questions of those who know him best. We might compile all these data and use advanced analytics to better define what “living dangerously” means.
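The compile-and-score idea above can be sketched in a few lines. Every signal name, weight, and threshold here is hypothetical, invented purely for illustration…this is not a real analytics method, just the shape of one: raw events plus corroborating context feeding a single synthesized judgment.

```python
# Toy sketch: combining several data sources into one synthesized score.
# All signals, weights, and thresholds are hypothetical, for illustration only.

signals = {
    "broken_knee": 1,        # medical records (1 = event occurred)
    "car_break_in": 1,       # police records
    "speeding_ticket": 1,    # driving record
    "high_crime_area": 0,    # crime stats where the car was parked
    "prior_violations": 0,   # overall driving record
}

# Weight corroborating context more heavily than isolated events.
weights = {
    "broken_knee": 1.0,
    "car_break_in": 1.0,
    "speeding_ticket": 1.0,
    "high_crime_area": 2.0,
    "prior_violations": 2.0,
}

score = sum(signals[k] * weights[k] for k in signals)

# With no corroborating context, three isolated events look more like
# bad luck than a pattern of living dangerously.
verdict = "living dangerously" if score >= 5 else "run of bad luck"
print(score, verdict)  # 3.0 run of bad luck
```

Note that with the hypothetical weights above, Jim’s three events alone don’t clear the bar…exactly the point my researcher friend was making.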
There is certainly a lot of comfort in delivering a report where you know everything you say can be backed up with statistical precision, but if that data can’t be put into action, there is little point to the exercise. Personally, I’d rather take comfort in knowing we made a difference.