10 Things I Hate About You, by Market Research R. E. Spondent

Wondering why participation rates in research are plummeting? Here are 10 reasons to consider.



Editor’s Note: Angry MR Client is back, this time speaking on behalf of her friend Mr. R. E. Spondent. More hard truths that we need to hear more often.


By Angry MR Client

1. You don’t understand me

You talk about me like I’m a load of stats. Yes, I am a 20-30 year old guy. Yes, I live in the city. Yes, I am single and no, I don’t floss every day. But I am much more than that. To minimize my existence to four or five things is not only upsetting, but also not true. What’s even more annoying is that you don’t think there’s anything wrong with that.

2. You give me weird names

I’m a young professional with a degree in engineering. I love Belgian beer and have a dog named Bandit. And I’m moving houses next month. Respondent? Digi-sumer? Aren’t those Tom Cruise films? Maybe you could call me Andy, or Participant, or even a Person. Thank you.

3. You forget about me

I know you’re really busy with your meetings and your emails and all the other things busy people do. But this is no excuse for you to not ask me what I think about your new idea. Don’t get me wrong, I do respect your “gut feeling” but sometimes I don’t even know what I want myself, so it would be quite difficult for anyone else to read my mind. At times I feel like I’d have to dye my hair purple and dance around your office in a tutu, playing the Finnish national anthem on an accordion, for you to remember I exist.

4. You bore me

You want me to answer this? 8 times???

[Image: a crappy conjoint grid]


5. You ask me things I can’t remember

No. I can’t remember the last time I ordered sushi.

6. You don’t care about me

You ask me all these questions about who I am (didn’t I already tell you I don’t have any kids?) and what I do, and how often I do this, or when was the last time I did that. And then you tell me I “don’t qualify for this survey”. Is this the Olympics?

7. You lie to me

You said this survey would only take 15 minutes, but here I am, 25 minutes later, still answering your questions, and by the look of it, I’m only two thirds of the way through. I am tempted to report your email as spam next time you contact me.

8. You speak a different language

When you try to quote me, you make me sound all grown up, or like my sister, or like a teenager. Or even worse, you make me sound like you. Why won’t you let me sound like me? You not only change my words, but you also speak in a way I don’t quite understand. I have no idea what an “integrated compacting system” is, but I will nod along so you don’t think I’m stupid.

9. You don’t give back

So I’ve given you this amazing idea about how to improve your product. I mean, I could have done all these other things, like play Angry Birds or go get some ice cream and watch Dexter. But hey, I thought you could really use some help with your product so I gave you 20 full minutes of my life to help you out. But now it feels like all my ideas went into an Intergalactic Black Hole, since you never told me what happened to all my suggestions. Is it too much to ask for you to let me know what you did with my ideas?

10. You think I’m a robot

Let me tell you a dirty little secret: I don’t always care about all your technical details and fancy features. Sometimes, I buy things just because they look cool. Or because my friend has them. Or because I couldn’t find your product and I can’t be bothered to go to another store. I don’t always buy the cheapest or the best product. I may be black or white, but I am definitely not black and white. Sometimes I say things I don’t mean and I do things I won’t admit to having done. I’m only human. Sorry!

Angriest regards,

R.E. Spondent


*All product categories have been made up, to protect identities


Disclaimer: I believe that clients are as responsible as agencies to take care of research participants. It’s neither easy, nor comfortable. It takes a lot of explaining and pushing back. But I’m 100% convinced it will ultimately help us get better, sharper insights – Angry


21 responses to “10 Things I Hate About You, by Market Research R. E. Spondent”

  1. 4 of these 10 things respondents hate about market research surveys could be avoided if #MRX would up their game in communicating insights, not just to clients but, to research participants as well. Communication in #MRX is sort of like bad handwriting: it’s null & void if your audience can’t read what you’re saying.


  2. Bravo, Angry MR Client! I firmly believe that if more of us actually tried taking our own surveys (and take more surveys in general) we’d have to be honest with ourselves and say, “WHAT ARE WE DOING HERE??”

    And, because we are doing this to R.E. Spondent, is it any wonder we’re slowly but surely losing the cooperation of the people who ultimately are the lifeblood of primary MR?

    Hopefully we’ll collectively give ourselves a facepalm, wake up, and start changing the status quo. And fast.

  3. Funnily enough, this is what I am going to be talking about at the Pecha Kucha at the MRS conference on March 20th. It is axiomatic in brand thinking that you can’t break a brand down into its constituent pieces and look at a part without the whole. If that is true for brands, then oughtn’t it to be even more true for human beings? However inconvenient that is for the dice’n’slice brigade who earn a living taking MR biopsies.

    Here’s the pitch for March: You First OR No really – what would YOU like to talk about?

    It’s so long since research took an interest in what customers cared about and wanted to talk about that the idea seems bizarre. Why would any company ask people to reflect on what mattered to them? Warm them up, ask them a couple of leading questions about holidays and telly, get the room buzzing then let’s move to business as usual – the clients’ agenda. Carefully filtered so respondents can internalise it a sip at a time and tell us what we need to know. Why would we do any different? Well if marketing is the satisfying of consumer needs –you might do well to grasp how the consumer bit fits in with the rest of their lives and what really matters to them. You might even find needs which marketers could satisfy profitably which don’t fall in with the standard classifications of products and services. People don’t get up in the morning with an ambition to consume any more than employees go to work worrying about whether they will deliver sufficient return to shareholders. So that’s what I want to talk about: Research as if we actually gave a damn what our customers wanted.

  4. Again, an entertaining post from @Angry. But what “reality” is this referring to?? Beating up on MR is something that plenty of other people do, so why in this forum focus on bad practice? There are indeed bad examples out there, but there are plenty of good examples out there too. And to @john’s point – there are absolutely loads of Researchers who passionately care about what customers think, so to make sweeping statements about “Research” is less than helpful. I have to question the strategic intent of this blog – and that leads me to the question of authorship – who is @Angry, and if you feel so strongly, isn’t it time you said who you were so we can interpret better? What’s the Agenda?

  5. @Randi Hunton: Indeed. In our quest to get as much as possible out of participants, we forget to give back to them. Love your comparison with bad handwriting.

    @Greg Heist: Agree! And as I said, I’m not blaming it completely on mrx agencies. We’re as responsible as they are to ensure the above complaints don’t happen too often. We’re in this together: clients, agencies, participants.

    @John Griffiths: Sounds like a really interesting presentation! Your comments are spot on. That’s one of the reasons why social media research has become so popular: it taps into unprompted discussions people have in the real world, about the topics they feel passionate about.

    @edward04: I don’t quite agree that pointing out “bad practice” is wrong. On the contrary, I feel that conferences, blogs, and webinars are heavily skewed towards sharing the latest, greatest and shiniest MRX innovations. And while I’m a huge fan of added-value mrx innovation and always curious to hear about “good mrx examples”, I feel that there is significantly less talk about the areas where the mrx industry could step up. I’m not saying that all agencies are guilty of the above. Nor that all 10 points always happen at the same time. However, while it’s absolutely critical to keep innovating, we need to make sure that we also get the basics in place. As to your other questions, hopefully my future tweets/blog posts will answer them.

  6. @angry – who said pointing out bad practice is wrong? My comment doesn’t. I do feel that meeting what you view as one-sided selling from Agency providers (concur) with polemic isn’t, to me at least, particularly useful. The one says “try this, it’s fantastic”, and someone else says (your blog) – “look here, this is terrible” – if I were an ethnographer I would say both viewpoints are extremely blinkered. Our industry – my view – is seriously perceptually on the back foot – we need to do more to promote our skill sets, the positive sides, and rather less self-laceration. Others bash us up enough anyway, and it’s pretty easy to find examples to support their case. I look forward to your future comments, blogs, and a debate that hopefully leads us all to a better positioning space.

  7. I plead guilty. Also it’s funny – I didn’t see the post as criticizing MR agencies alone; I thought it included self-criticism too. That’s because we meet plenty of clients who define the consumer in terms of dry demographics, who talk of them as if they are an extra-terrestrial species. Who among you has not winced at the words ‘consumer safari’ being used to describe immersions? We do a lot of home care product research in India, and to hear marketing people talking about the housewife can be funny – ‘housewives love to scrub dirty toilets as they believe it is their duty’. Who likes that? And the classic one – ‘have a happy period’ – a researcher didn’t write that. Love your posts @angry_mr_client. Good solid advice that we are working to incorporate in our work. Are you up for some rebuttals too re client behavior? 🙂

  8. Thanks for this Angry MR Client. I lead a team of people at our organization who are focused on the care and well-being of our respondents (and who, as my colleague Greg pointed out, are the lifeblood of our industry), and too often we have to remind our internal and external clients that these are fellow humans we are dealing with and must be treated with respect. I implore the client-side researchers out there to focus on #9 and give something back to the respondents – this is by far the most effective engagement strategy and is worth its weight in gold (or at least incentives saved)!

  9. Some of these are just weird. The only one I fully agree with is #4. Boring, overly complicated, and repetitive survey questions are a real bane and need to be done away with.

    But #2? What on earth? How many respondents even know they are referred to as “respondents”? And even if they did, why would they object to that when they know it is being done to protect their confidentiality? Also, what’s the difference between referring to them as “participants” vs. referring to them as “respondents”?

    What is the context here? Are you talking about referring to him as Andy while the research is going on, or when the report is written? Actually, it makes no sense in either context. I’ve never heard a moderator or interviewer refer to a guy named Andy as “respondent” during a group or interview (i.e., “Hey Respondent, what’s your opinion on that?”). And if you’re talking about the report, I would never, ever in a million years think of referring to a respondent by his or her name in a formal report. This complaint makes zero sense to me.

    #5 is an interesting point, but you offer no suggestions for how else it could or should be handled.

    #6 isn’t an issue of not caring about respondents, it’s an issue of bad screener design. Perhaps you are arguing these are one and the same.

    #9, do respondents ever actually make this complaint? I’ve been in this business for 30 years and have never heard that from anyone. I’ve heard respondents joke that if their ideas are used they want some of the profits, but this idea that all respondents are itching to find out what happens after the research is concluded is very weird. I can just imagine the kind of logistical and legal problems this would cause for client companies, too. “Hey Andy, glad we finally found you again, we’ve been looking for you for AGES. Just wanted to let you know we decided to use that cool idea you gave us.” “Great, when will my check be arriving? What, no check? You’ll hear from my lawyer in the morning.”

    #10 is not just an issue for the research supplier. If the ultimate client makes decisions on the product based on interpreting the research as though the respondents are robots, that’s on them, as well.

  10. Another interesting post by Leonard. I picked up some interesting points about how respondents feel when we ask them questions we have asked before. On one side, this may be a compulsory part of the survey; on the other, we may already have that information in the database.

    This can potentially spark a discussion on a unified approach to minimizing the number of questions and the time spent on the survey.

    Another interesting point was about the LOI. I completely agree with this. There are several channels involved in the data collection, and LOI / IR are the two most overclocked 🙂 points when finalizing costs and delivery timelines.

    Slipping from interesting to boring is one of the crucial factors in a panel’s dropping response rate. While other industries are adopting the latest methods to make surveys more interesting, the majority of us still like to go the old-fashioned way, using grids and tons of radio buttons. Including the latest web 2.0-based interactive controls can come in handy, as most respondents are used to looking at those.

    Now, here comes the most interesting part – THE FLOW. We tend to design the survey for easy data processing and often overlook the sequence of questions. The flow should be designed as a smooth transition from one question to another, keeping in mind the respondent’s mindset while giving the interview.

    There are many other points which I would write about separately. But again, a very interesting article from Leonard.

  11. It’s OK, you can call me a consumer-digividual all you want, and describe me as a C2 white male 25-34. The only thing that annoys me is when there are too many compulsory questions, because the fewer clicks I can make for my £1 the better. After all, I’ve got surveys to complete for 16 other panel providers today.

  12. Unfortunately this post is a mix of relevant complaints respondents can have, and those which I have trouble seeing the respondent caring much about. Yes, respondents can legitimately complain about answering the same question multiple times, or being asked to respond to boring grids and overly long surveys.

    But how many respondents know or care that they are classified as Upwardly Mobile Immigrants or Blue Jean Blues or whatever in some geodemographic clustering system? How many really expect Sony to call them back in four months and let them know what name they actually chose for the new product on which the respondent completed a survey? How many care if they are talked about “as a load of stats” in a corporate board room somewhere (as if they would even know that)?

    The research industry is too frequently doing a poor job at treating respondents well. We have enough tasks ahead of us without worrying about made-up problems such as the fact that “we don’t give back.” Inclusion of non-issues here really weakens a very essential argument: that research needs to treat respondents better if we expect people to continue being willing to be respondents.

  13. I believe that there is some truth in these, but agree that some might be more relevant than others. The main one for me is questionnaire length and flow, as well as asking boring & unnecessary grids etc. And this is more often than not requested by the client (“yes, we do need all these questions….”). Both MR agencies and clients have a responsibility here to our research participants!

  14. #5 (You ask me things I can’t remember) is my personal pet peeve. As a respondent, I have endured hotel chains asking me how many nights I traveled on business in the past 12 months, how many personal nights, and how many of each of them were spent at that particular hotel chain. Then, the follow up question is projected for each of these for the next 12 months. When you travel as much as I do, those questions cannot be answered reliably at this level of sensitivity!

    There really is no excuse for this. I am being invited through my participation in their loyalty programs — so they should already know my number of nights at their hotels! Regarding my number of nights at other hotels — there are more reliable ways to ask those questions. The same goes for the personal/business split. I strongly suspect the format of the questions is being driven by forecasting geeks who want ratio data. Sorry geeks — sometimes ratio data is just not possible. You may think you are getting ratio data, but recall the old adage: garbage in, garbage out.

    In addition to better questions, there are now better methodologies for avoiding questions a respondent cannot recall. Mobile and other methods now enable “in the moment” research.

  15. With the exception of a couple of points, which are valid for a discussion of best practices, this post is pretty ridiculous. I must say that I don’t really care what a respondent thinks of my asking questions about past behavior. We typically ask that they respond as best they can, and we realize that the reliability of memory is a strong bias on the data. That said, it can be useful for data modeling, so I’m going to ask it. Sometimes respondents do think that questionnaires ask questions that aren’t helpful, but they’re not the analysts… they don’t know how it will be used.

    Overall, this post seems to have one moral — we need to stop doing market research. It really inconveniences the subjects of the study. Or rather, we should only conduct research that the subjects will like, regardless of the kinds of marketing questions we can answer with it.

    I’m all for making the experience a more positive one for participants in market research, and lying about the length of the survey is just as unethical as lying to anyone in person. But I’m not going to discount the value of conjoint analysis simply because this fictional respondent doesn’t see the point.

    Further, I don’t think this post comes close to representing the opinions of research participants. If it did, then we wouldn’t have participants. In my experience, respondents are actually very interested in giving their opinion so we give them an open forum at the end of each survey to say anything else they would like. I always love reading those for additional thoughts on the topic that we may have missed, but I’m amazed by the number of positive comments that we receive on the design of the survey itself.

    Lastly, I appreciate how thought-provoking this blog post has been. I thought about it all day yesterday before deciding to comment on it. However, I think that it is actually dangerous for us as an industry to take these comments seriously. Rather, we should use it as a stern reminder to review best practices for data quality, reduced bias, and ethics in research.

    1. @Dale, your last point was why AMRC wrote this: to remind us all that we do need to be the guardians of the consumer experience when interacting with them, and the conveyors of human understanding during our research. Although some of the points should be taken literally, I’m pretty sure most were intentionally inflammatory simply to rattle us out of the complacent group think that may limit our ability to fulfill those two roles.

  16. I’d be interested to know how prevalent Angry M.R. thinks this lack of empathy for the respondent is, and in which area(s) of research? I don’t really see these issues in the qualitative research side at all. Perhaps the Angry M.R. person is referring more to quantitative researchers who need to number-crunch “people” and their opinions? As a qualitative researcher, I consider myself to be the “Voice of the Consumer,” and a liaison between the “respondent” and the client: I “protect” my participants; I form mini relationships with each and every person that I interview. I love interviewing people, listening to THEIR individual story and hearing their concerns and successes alike! I have interviewed thousands of people – from all walks of life. I oftentimes have respondents HUG me at the end of a focus group or interview!

    Two important things that I have learned about people are: (1) Don’t judge a book by its cover (i.e., don’t stereotype – you will always be surprised); and (2) Never assume.

    Lastly, I know that there are a lot of researchers out there who are just like me, who really do care and whose “respondents” DON’T hate them! Perhaps I should write a “10 things respondents LOVE about Market Research” 🙂

  17. Susan, as a field researcher who has 5 offices in 4 major US cities and sees tens of thousands of research participants annually, I unfortunately have to state that this blog is completely accurate overall for the qualitative research industry. Cooperation rates are dwindling in part because we as an industry do not value or show appreciation for consumers’ time, do not respect them, and in many ways do not treat them as partners in the process.

    20-minute-plus screeners with algorithms and lengthy interviews that often result in “sorry, you don’t qualify, hopefully next time” (the table example above is something we see daily); add-on work after screening because someone in creative decided a homework assignment is now necessary (and hey, we’ll just give them an extra $15 for the extra half hour of time, that’s fair…); odd times of research because the research team doesn’t want to speak with participants when it’s convenient for the participant; re-screening participants and then going into full interrogation mode (which I’ve seen border on harassment) if they changed from liking a product a lot to a little (despite not asking, or caring, what’s happened in the past two weeks that may have changed that attitude… or perhaps probing further to realize that the participant simply may have been fatigued after 20 minutes of questions on the phone); and then, when it’s over… never sharing with participants how the research impacted the product/service, and what the company planned to do about it moving forward. These all happen regularly… and need to change. We are killing the golden goose!

    Unlike most, I’ve actually done research with participants…on the research experience itself (i.e. what was liked/not liked, and how it could be made better). The findings:

    1. Frustration with long screeners/questionnaires that result in “thanks…no.”
    2. Desiring better communication: if they don’t qualify, tell them, but don’t put them on hold for days/weeks! If we answer questions online, tell us what are the next steps!
    3. Most frequently used words by participants: trust and professionalism. They agreed to the research because they trust the professionalism of our company, and that they are treated well (number 1 reason people sign up: referral from someone who did it and spoke positively about it). For passionate consumers…money is the “thank you,” the topic of the research (i.e. the product or service they are passionate about) and the interest to see that improve for their needs is why they really want to be there, and then once there, they appreciate being treated courteously, professionally, and together, the research is a collaboration of minds to make what they are passionate about…better.
    4. “I don’t know what came from the research…I’d really like to! I didn’t come for $100…I came because I love that product and want to know if you’re going to change it!” Money being #1 is the small minority…most, and the ones clients rave about the most (newbies typically)…it was that someone cared enough to ask them their opinion, and they appreciate that. Further, they would really like to know…what will happen next after the research?

    I’ve said this for many years, and said it in every forum I can find: declining cooperation rates are the biggest threat to our industry. We have now invested 7 figures in technology to make recruiting better, and we’re trying to improve the process and reduce participant fatigue, as well as invest in the communities where we have offices, to engage them and see more people engage us (and thus sign up to participate) – but if we continue to turn off consumers, then we destroy our industry.

    At the forefront, we need better marketing/education of our industry: the vast majority of society does not know about qualitative research. Educating them on how it works, the value it brings, and why their participation is crucial has got to be our number one priority. Then… screening needs to get better/shorter; we must identify better ways to find the right people beyond lengthy screeners that are self-defeating. Then… ensure during the research that they are treated as the customers they are… respected, trusted, with professionalism and care… and most importantly, appreciation for their time. Then finally… share with them what was learned, and what you intend to do with the data you gathered/what you learned. In our social media world, think how positive that would be, and how viral brands could become, simply by the MR department telling customers “thank you for your time, we’re listening, and here’s what we’re going to do to make the product you love better.” We talk of ROI in MR… that’s ROI! In my years of customer satisfaction research, this was always the #1 warning to clients: if you collect the data but don’t use it, customers will not only stop giving you the data, their satisfaction will decline, as they told you what was wrong and you didn’t do anything about it. Customer sat is a holistic, circular process… and it includes sharing with your customers what you intend to do to make your services better, to meet their needs. All MR needs to be more this way… I strongly believe this drop in cooperation is in part due to this lack of a holistic approach, and to forgetting that we need these people; they do NOT need us.

    I am a very glass-half-full person… and there are positive examples of our industry I could share without question… but unfortunately this blog’s stereotype is more accurate than not. It is not my intent to lay the cross upon QRCs and end clients… field researchers have their own crosses to bear on this subject, with years of neglect and shoddy processes that have in part fueled some of these outcomes… it’s high time all three sides come together to address these issues, and create solutions to eliminate these problems, so we reverse the cooperation rate percentage.
