Interrogation

How often do you poll your current clients or customers on why they chose you? How good of a grasp do you have on what your ideal customers want? Market research is a broad and vital part of business that too many people ignore because:

  • They’re afraid of what they might find (though you’ll have a hard time getting them to admit this)
  • They just don’t have the time
  • They don’t know how to ask

Not a good reason in the bunch. Think of it this way:

Everything you do in business can only be as efficient as your knowledge of the market. If you only know your customers 70%, everything you do is at most 70% efficient.

If you don't spend time getting to know your audience, you're wasting a lot more time elsewhere. Do the research - end of story. Now, it’s that last question of “how to ask” that we want to address here.

As part of the board of the American Marketing Association here in Portland, I helped organize a professional development event a couple of weeks ago with speaker Darrin Helsel of Distill Research. Darrin gave a great interactive presentation on how to ask the right questions in the right way, and I wanted to share the three of his points about survey questions that stood out most to me.

Don’t Confuse Your Respondents

We have all been frustrated by a survey at one time or another, and we know the obvious consequence: we abandon the survey. I think I’m especially susceptible – I abandon surveys with gusto. I probably have an abandoned-to-completed ratio of 5:1.

But there is another, more insidious and evil result: the respondent is confused enough to be frustrated, but for whatever reason feels obligated to complete the survey (it’s required of them, there’s a prize offer at the end, they want to finish what they started, etc.). This is what makes people say, “Screw it, I’m just gonna fill in the rest of the questions as quickly as possible and be done!” If this happens on your survey, it’s worse than abandonment: it corrupts all the data you have, and it’s much harder to detect.

To avoid this, make sure that each question you ask has an appropriate answer for everyone taking the survey. People get frustrated when there is not an option that they feel represents them.

To illustrate this, take a look at the image below. During his presentation, Darrin gave the same survey twice – the first time breaking some of his rules, the second time fixing the issues. The audience responded in real time using Dialsmith technology, which allowed Darrin to show us the results of the survey immediately, and that is what you are seeing here.

[Slide: Question 1 results]

You can see the questions in the graphic, but you can’t see the choices. They were:

  1. Employed
  2. Employed part-time
  3. Unemployed
  4. A student
  5. A stay-at-home parent
  6. Retired
  7. None of the above

Since respondents could only choose one answer, it was a very frustrating question. Clearly someone can be a student and unemployed, or employed part-time and a stay-at-home parent, etc. This shows in the data: the original (confusing) question returned nearly 7% “None of the above.” Really? Were 7% of people neither employed nor unemployed? Obviously, this poorly formed question produced bad data.

Clarification for a question like this can be done in three ways:

  • Make the answers mutually exclusive, so that there is no possibility of a person fitting two or more choices
  • Allow respondents to select multiple answers
  • Give the respondents instructions on what to do in case of a conflict

Darrin went with the third option, and you can see how effective it was. If you look at the data for the corrected question – answers from the same people – there are no “None of the above” responses. Now that is clear, uncorrupted information!

Collect Rich Data

Even if you are getting accurate, uncorrupted data, you want to make sure you are getting the whole story. The example question used here was:

Which activity do you perform on your mobile phone most often?

  1. Placing phone calls
  2. Browsing the internet
  3. Using apps
  4. Listening to music
  5. Texting
  6. Playing games
  7. Sending emails
  8. Internet shopping

That’s the original, and here was the data:

[Slide: Question 2.1 results]

There’s nothing wrong with the question itself, but if it’s the only question asked, you are missing a big chunk of the picture. Just because someone spends most of their time texting doesn’t mean they don’t spend a ton of time browsing the internet, and if you are trying to get a picture of the way mobile phones are used, that’s important info.

Darrin’s solution in this case is to repeat the original question and then ask about the next two layers: the second and third most frequent activities. Here are the results:

[Slide: Question 2.2 results]

[Slide: Question 2.3 results]

You can see that in the first instance the question was exactly the same, and the data barely changed. That’s good; it shows reproducibility. More interestingly, look at the next question, about the second most popular use of phones:

  • Phone calls jumped from 14% to 34% compared to the first question
  • Texting stayed the same, around 33%
  • Emails dropped from 25% to 7%

Doesn’t that paint a much richer picture than the first question alone? If you compare the third question in the same way, you get yet another layer of detail. Of course, you don’t want to drag the survey out too long, but these questions don’t require much more brainpower once someone has thought through their phone use.
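
If your survey tool lets you export the raw responses, one quick way to surface that richer picture is to tally how often each activity appears anywhere in a respondent’s top three, not just in first place. Here is a minimal Python sketch of that idea; the response data and activity names below are hypothetical stand-ins, not the actual results from Darrin’s session.

    from collections import Counter

    # Hypothetical export: each respondent's first, second, and third most
    # frequent mobile phone activities (stand-ins for real survey data).
    responses = [
        ("Texting", "Placing phone calls", "Using apps"),
        ("Browsing the internet", "Texting", "Sending emails"),
        ("Texting", "Browsing the internet", "Placing phone calls"),
        ("Using apps", "Texting", "Playing games"),
    ]

    # Share of first-choice answers only (what a single question captures).
    first_choice = Counter(top3[0] for top3 in responses)

    # How often each activity shows up anywhere in the top three (the richer picture).
    any_top_three = Counter(activity for top3 in responses for activity in top3)

    print("First choice only:", first_choice.most_common())
    print("Anywhere in top three:", any_top_three.most_common())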

Provide Context to Questions

While the two rules above may seem somewhat intuitive, at least once they’ve been explained, this next one is not. I’ve seen it done the right way and the wrong way, and I’d say the mistake is probably more common than the right way. This time I’ll give you both questions at once; see if you can guess which one is better:

  1. When you purchase new digital music for your mobile phone, how many songs do you purchase on average?
  2. Over the past 14 days, how many songs have you purchased from your mobile phone?

The problem is that thinking back to average out something you do on a semi-regular basis is very difficult. First, your memory is probably not all that accurate, and second, you might not be that great at averaging over a long period anyway. With the first question, you’re not likely to get good results.

The second question, however, is very clear. It has a definite time frame, and you’re being asked for a direct count – no calculation is involved. Check out the difference:

[Slide: Question 3.1 results]

[Slide: Question 3.2 results]

Find a way to apply these principles - NOW

You’ve learned; now apply it! This isn’t just about how to do market research, it’s about actually doing market research. I challenge you to think of some things you’d like to find out about your business or yourself and structure a survey around them using the rules above. Then go out and collect your data, regardless of your sample size. It’s a great way to make something you’ve read really take hold, rather than just clicking over to the next article.

If you’d like more tips, contact Darrin at Distill Research for his whitepaper, which includes all 10. If you’d like to read more articles like this, follow us on LinkedIn, Twitter, or Google+.

If you have any experience running surveys yourself, I’d love to hear about it.