Have you ever sent a survey and realized, after the fact, that your questions were confusing or misleading? It happens to everyone at some point. And while bad survey questions aren’t the end of the world, they can easily muddy your data and derail your business decisions.

Let’s take a look at the most common pitfalls and how you can avoid them.

Examples of Bad Survey Questions

1. The Leading Question

Leading questions use biased language that nudges the survey-taker toward a particular selection. The problem with a leading question is that it can seem innocuous while actually fishing for a certain answer.

It’s not only leading questions you have to watch out for, but also imbalanced response options. If you populate the response options with bias, you won’t get the data you need.

When you use leading questions, you’re not going to get actionable, accurate data that can help guide business decisions. Instead, you may be encouraging customers to answer in a certain way.

Examples of leading questions:

  • Did you enjoy our amazing new product offering?
  • Are you excited about what we’ll do next?

What to do instead: Make sure that questions are clear and simple—and refrain from using adjectives like “amazing” that are highly subjective and likely to influence responses.

2. The Assumptive Question

Assumptive questions make assumptions about what the survey-taker knows and feels without taking a step back and considering where the survey-taker actually stands. For example, an assumptive question might ask someone which email marketing software they use, even though many of the survey-takers don’t use email marketing software at all.

Assumptive questions leave out essential information that is necessary to understand the survey-taker. They are similar to leading questions in that they inadvertently encourage survey-takers to respond in a certain way.

Examples of assumptive questions:

  • When you drink scotch, do you like it on the rocks?
  • Do you go to the park when you’re stressed at work?

What to do instead: Don’t assume anything about your survey-taker. You don’t know whether they drink alcohol, work out, or are native English speakers. Be sure to create big-picture questions that set context.

3. The Pushy Question

A pushy question forces survey-takers to make a choice, steering respondents toward particular favored answers. Usually these questions require a respondent to choose from a list with too few categories, preventing them from answering accurately.

For example, if you present survey-takers with the statement “Ice cream is good on a cold day” and offer only the options “agree” or “disagree,” you force respondents to answer, whether or not that answer reflects their true feelings.

Examples of pushy questions:

  • Ice cream is good on a cold day. Agree or disagree?
  • When choosing between software, what do you look for?

What to do instead: As much as possible, try to put yourself in your respondent’s shoes. Consider whether the question is genuinely open-ended or may push the survey-taker in a particular direction.

4. The Confusing Question

Without meaning to, you may be creating questions that are confusing to survey-takers. Confusing questions may have poorly worded questions and/or response options, be illogically formatted, or use the wrong question type for the matter at hand.

Examples:

  • Do you think our sales team was not unhelpful, or were they helpful?
  • True or false?: Your sales consultant was not equipped for the job.

What to do instead: Avoid double negatives at all costs. In general, using negatives can be confusing. Try to use clear, concise statements and questions to get your meaning across.

5. The Random Question

Random questions are out of context and don’t focus on topics that are important or relevant to the survey-taker. Usually, random questions are clearly driven by the survey maker’s interests.

Examples:

  • Do you consider yourself physically fit?
  • How much would you spend on a water bottle?

What to do instead: Make sure that all questions you ask are relevant to the survey and its goals.

6. The Double-Barreled Question

Double-barreled questions squeeze too much into one question, making it difficult for a survey-taker to answer accurately. Sometimes, these double-barreled questions ask respondents to rate/rank two or more things in one question, or combine two different ideas into one question.

Examples of double-barreled questions:

  • How long did it take you to complete the process and on what day of the week did you do it?
  • Agree or disagree?: The onboarding was easy to understand and very comprehensive.

What to do instead: Make sure your questions are asking for one answer to one main idea.

7. The Ambiguous Question

Ambiguous questions are far too broad. The questions and/or responses leave room for interpretation, which leads survey-takers to guess or default to whatever answer makes the most sense.

Examples of ambiguous questions:

  •  Do you think your friends would like our jackets?
  •  Are we better than other software companies?

What to do instead: Consider adding an “other” field to multiple choice survey questions to capture survey-takers’ true feelings. Yes, the feedback will be qualitative, but it will be more accurate.

Wrap-Up

When it comes to constructing a survey, there’s a lot to consider. But, if you don’t have questions that are easy to understand and make sense to the survey-taker, you won’t be able to gain information that can impact your business. To improve customer experience, you need accurate survey responses that lead you to take action.
