
The Questionnaire From Hell: A Journey Into Truly Terrible Research

When a survey is so bad that respondents are forced to make things up just to move on, what results from the “research” project is data full of lies, bogus answers, and useless information.

Sometimes I weep for this industry.

I just completed a survey (as a respondent).  I honestly have no idea how the client is going to get any valid data from the mish-mash of garbage I just had to wade through in order to complete the study.

To start, once I was already in the questionnaire, I was told (for the first time) to have my children with me when I answer the questionnaire.  Because, of course, all respondents answer questionnaires at a time when all of their kids are readily available.  Americans do nothing all day but sit around with their families, gazing longingly at their computer screens, hoping for a questionnaire to arrive that they can all answer as a good old-fashioned family get-together.

Not only was my daughter unavailable, but when I entered her age, the survey put her in the wrong age group.  I plowed ahead alone, my curiosity piqued.

I was asked how often I grocery shop.  I am not the main shopper for my household, but my only options were more than once a week, weekly, or 2–3 times a month.  They gave me no choice but to put in wrong information, which I did in order to move ahead.

They asked me what types of things I collect and gave me a list of possibilities.  I collect none of them, but that wasn’t an option.  I hit the “next” button without selecting any, and was told I had to answer the question in order to move on.  Apparently, they decided that every adult must collect at least one of the options, so I chose something at random, clicked the button to register the lie, and plunged ahead.

I was then asked what characters I collect.  This time, there was an “other” option, so I clicked that and typed in “none.”  The next task, of course, was for me to rank order the importance of the characters I collect.  Not only did I get to rank order “none,” but I got to decide whether that option should place first, first, first, or first.  Really?  You can’t program a skip pattern when the respondent only provides one option?
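The missing skip pattern is trivial to express in code.  Here is a minimal sketch — hypothetical survey-engine logic, not any particular platform’s API, with made-up question names — of the two rules this questionnaire ignored: treat “none” as a terminal answer, and never show a ranking task unless there are at least two items to rank.

```python
# Hypothetical skip-pattern logic for the "what do you collect" question.
# Question identifiers ("skip_collecting_section", etc.) are invented
# for illustration; real survey platforms express this in their own
# branching/display-logic settings.

def next_question(selected_items):
    """Decide what follows the collecting question.

    selected_items: list of options the respondent picked,
    e.g. ["Shiny Monsters"] or ["None"].
    """
    real_items = [item for item in selected_items if item != "None"]
    if not real_items:
        return "skip_collecting_section"   # nothing to rank or follow up on
    if len(real_items) == 1:
        return "skip_ranking"              # ranking a single item is meaningless
    return "rank_items"                    # two or more items: ranking is valid

print(next_question(["None"]))                      # skip_collecting_section
print(next_question(["Shiny Monsters"]))            # skip_ranking
print(next_question(["Shiny Monsters", "Trains"]))  # rank_items
```

Three lines of branching logic would have spared respondents from rank-ordering “none” against itself.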

Then I was asked about my awareness of a number of different characters that might be used as a promotional tie-in.  I gladly told them which ones I had heard of, which I liked, etc.  One of them – let’s make up a name and call it “Shiny Monsters” – I said I did not know.  Which is why I was then asked whether I would collect about ten different things emblazoned with the Shiny Monsters logo and characters.

I was also asked – and I quote – “What other characters, movies, or brands do you like?”  Huh?  Okay, I like Inspector Clouseau (character), Dr. Strangelove (movie), and Lexus (brand).  And that will help you how with this study about promotional tie-ins?

Next, I was supposed to gather all my children around me and repeat this whole useless and confusing exercise.  Since my one daughter didn’t fit the age group anyway, I just filled out the questionnaire without her (I know which characters she likes and dislikes).  But I was amazed that all of the questions asked about my children.  Let’s say I have three kids, and my five-year-old loves Shiny Monsters, my eight-year-old thinks they’re dumb, and my eleven-year-old has never heard of them.  How, exactly, should I answer all of these questions?

Finally, after answering repeatedly that I have no interest in promotional tie-ins as a way to choose which store I shop in (particularly since I don’t really do the shopping), I got a series of questions asking me whether I would switch stores in order to collect about ten different promotional products – including those with characters I had already answered that I have never heard of or do not like.

The research firm behind this primarily programs and hosts studies, so it was not clear whether some other research firm designed this mess, or whether the end client did.  Either way it’s a disaster.  If a research firm designed it, it royally ticks me off that good research companies out there lose business to these charlatans.  If the end client designed it, I cannot think of a better (or worse?) indictment of DIY research.

No matter who is to blame, what will result from this “research” project is data full of lies, bogus answers, and useless information.  I was forced to make up stuff in order to move on, and I will guarantee I am not the only one who did that.

Where is the quality control?  Where is the senior researcher who is mentoring the inexperienced ones and teaching them the very basics of questionnaire design?  (At least I’m hoping this thing was designed by someone inexperienced.)  Where is even a shred of the thought or logic that should have gone into this questionnaire?

As a respondent, I felt this was a frustrating abuse of my time.  As a researcher, I felt this was a frustrating abuse of our industry.  There is enough of this nonsense going on in research that it’s time we stand up and call out the worst offenders, and use them as examples of what not to do.  Because it’s hurting the end users of the research, and it’s hurting the professionals who actually know and care what they’re doing.


9 responses to “The Questionnaire From Hell: A Journey Into Truly Terrible Research”

  1. Good on you, Ron! No wonder this industry has such a poor reputation. The big question is how to prevent these kinds of abuses of what could have been perfectly good work. The bottom line is that you cannot cure stupidity. And that’s what drove this whole bus you took a ride on.

  2. Interesting post. I won’t argue whether this was a bad survey, but I will point out that just because you can’t imagine how the data will be used doesn’t mean it is a bad survey. Maybe the goal of the research was to see how many people would actually call out bad research? Of course I am joking, but whenever I argue in favour of credentials, somebody claims that is putting up barriers to entry… I never think of it this way, but maybe it is time to start raising the minimum entry-level requirements a tad?

  3. Rick, I agree that just because I can’t imagine how the data will be used doesn’t mean there is no use for it – I’ve seen plenty of questionnaires where I had trouble figuring out what they would learn from the data, but not knowing the objectives or the business situation, I would not judge the study to be a bad study.

    However, when respondents are forced to lie, asked completely irrelevant questions that contradict what they’ve already answered, and put into unanswerable situations, then it’s not just my lack of awareness about the end use, it’s simply bad research. No matter what the intentions, bad data has no practical business use.

    On another recent questionnaire, I input my gender as one of the first questions. Subsequent questions asked me whether I, personally, was currently pregnant and whether I was currently breast-feeding. That’s just bad research.

  4. Hi Ron,
    I am now curious what percentage of males answered yes to the breast-feeding question. I agree with you that bad research is bad; I am just not as convinced about the solution… maybe you are right. Pragmatically, how do we proceed? A list of bad surveys?

  5. Boy, I wish I had the answer to that, Rick. I don’t know of any blanket answer. We’re not licensed or certified the way dentists or accountants are. I think we can continue to try to point out bad efforts when we see them (to colleagues, on blogs, to clients, etc.). I know I try to explain to clients when I design a question a certain way, rather than just handing them a draft, so they’ll come to understand just how much the answer will depend on the question, and how many ways a question can go wrong.

    My hope is that some end client thinking about DIY research sees this blog and thinks, “Wow – maybe I need some professional help to avoid mistakes like these,” and that some research manager somewhere sees it and thinks, “Hmmm…I need to review the questionnaires my employees are designing to see just how well we’re doing.” I think it’s a problem we chip away at in a variety of ways, but one that won’t be solved. But any progress would be better than nothing…

  6. If I were running one of the online panels that I assume this survey was broadcast through, I would be seriously worried about the effect of bad surveys on respondents (who may come to think they need to lie all the time to get paid), as well as on any clients who see it (I think we all take online surveys from time to time just to be nosy!). While I understand that these panels are often simply selling a service (respondents) to others, if I ran one and saw a survey like your example, I’d hope I’d have the guts to turn to my client and offer some advice on designing a better survey, rather than letting my respondents face a terrible survey while the client concludes that my respondents were rubbish because the data was garbage!

  7. Samantha, you make a very valid point. Panels are in a tough spot – just how bad is a “bad” questionnaire? At what point should the panel simply refuse the business (knowing that someone else with lower standards will likely take it on)? I see bad research like this fairly often (although this one was particularly egregious), as well as absurd lengths (such as a 70-minute questionnaire, with the incentive being a sweepstakes entry). I do think that panel companies have some responsibility in all this to say “enough’s enough” when a project will be particularly abusive toward their panel members’ time and/or intelligence.

  8. No, it was most definitely a survey. Or an attempt at one, anyway. If it had come through my e-mail, I might argue that it was some horrific attempt at a marketing pitch, but this was run through a major panel company.
