The 7 Deadly Survey Questions

Bad survey questions can easily muddy your data and derail your business decisions. Here are 7 examples of questions to avoid in your next survey.


Jana Barrett

January 2, 2020


Have you ever sent a survey and realized, after the fact, that your questions were confusing or misleading? It happens to everyone at some point. And while bad survey questions aren’t the end of the world, they can easily muddy your data and derail your business decisions.

Consequences of bad survey questions 

Having poorly written or confusing questions in your survey can cause many issues. 

If your questions are written in a misleading way, or a way that invites misunderstanding, your survey will yield biased results and inaccurate data. Survey respondents may also be hesitant to answer questions they don’t fully understand. Worse, poor survey questions can force respondents to answer inaccurately through no fault of their own. 

Avoiding these negative consequences is essential if you want to create a survey that is easy for respondents to answer and offers you accurate data. 

Let’s take a look at the most common pitfalls and how you can avoid them.

Examples of Bad Survey Questions

1. The Leading Question

Leading questions are those that use biased language. This language influences the survey taker’s selection. The problem with a leading question is that it can seem innocuous, but actually be fishing for a certain answer.

It’s not only leading questions you have to watch out for; imbalanced response options can skew answers too. If you populate the response options with bias, you’re not going to get the data you need.

When you use leading questions, you’re not going to get actionable, accurate data that can help guide business decisions. Instead, you may be encouraging customers to answer in a certain way.

How to identify leading questions 

Leading questions often use subjective adjectives, such as “great” or “hard-working”. They also may frame the question in a way that’s positive or negative, instead of neutral. 

Examples of leading questions:

  • Did you enjoy our amazing new product offering?

Try phrasing the question as: “How would you describe your experience with our new product offering?” 

  • Are you excited about what we’ll do next?

Try phrasing this question as: “How do you feel about our future potential?”

The first question uses the subjective adjective “amazing” to lead the respondent to a positive answer. The respondent may not share this judgement, and may feel uncomfortable answering and abandon the survey. 

The second question assumes that the respondent is in an excited emotional state about the future of the company, which is likely inaccurate. The respondent, if they’re not excited about what the company will do next, could get confused or irritated and stop answering questions. 

The alternative questions are more objective and they don’t assume a respondent feels a certain way. Instead, they ask how a respondent feels in a neutral way to get a more accurate answer. 

Leading questions can lead to a higher survey drop-off rate because respondents feel confused about how to answer accurately if they don’t agree with your assumptions. The answers you do receive could be severely biased and inaccurate. Leading questions can even lead to a negative perception of your company if the respondent feels the questions are trying to manipulate them into providing a positive answer. 

What to do instead: Make sure that questions are clear and simple—and refrain from using adjectives like “amazing” that are highly subjective and likely to influence responses.


2. The Assumptive Question

Assumptive questions make assumptions about what the survey-taker knows and feels, without stepping back to consider where the survey-taker actually stands. For example, an assumptive question might ask someone which email marketing software they use, even though many survey-takers don’t use email marketing software at all.

Assumptive questions leave out essential information that is necessary to understand the survey-taker. They are similar to leading questions in that they inadvertently encourage survey-takers to respond in a certain way.

How to identify assumptive questions 

Identifying assumptive questions can be difficult, as the person writing the survey tends to have a high level of knowledge about the topic that they don’t realize respondents may lack. It can help to read through the entire survey to see if the context for the assumption you’re making comes from a previous question, or a customer database. 

You can also ask someone who is not an SME on the topic to read through your survey before you publish it to check that your questions are not making inaccurate assumptions. 

Examples of assumptive questions:

  • When you drink scotch, do you like it on the rocks?

Try phrasing this question as: “If you drink scotch, do you drink it on the rocks?” and include a response option that indicates they don’t drink scotch. 

  • Do you go to the park when you’re stressed at work?

Try phrasing this question as: “If you feel stressed at work, do you go to the park?” and include a response option that indicates they don’t experience stress at work. 

The assumptive questions above rest on assumptions about respondents that aren’t backed up by fact - that they drink scotch or experience stress at work. You may assume that most people have these experiences, but that may not be true. Questions built on untested assumptions can make it difficult (or impossible) for respondents to answer accurately. 

Rephrasing questions can be helpful, but there’s another way to avoid asking assumptive questions - use your customer data. If you’re really only curious about the habits of scotch drinkers, use the data you have on hand to determine who is a scotch drinker and only send your survey to them. 

You can also use skip logic in your surveys to avoid making assumptions. For the scotch question above, for example, you could begin with a question asking the respondent if they drink scotch. If they respond no, you can set up your survey to skip the “on the rocks” question so the only people answering that one are scotch drinkers. 
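If your survey tool supports branching, the scotch example above boils down to a simple conditional. Here's a minimal sketch in plain Python, assuming a hypothetical `Question` class and an answer dictionary (real survey tools configure skip logic in their interface, not in code):

```python
# Minimal sketch of skip logic for the scotch example.
# The Question class and run_survey function are hypothetical,
# for illustration only.

class Question:
    def __init__(self, text, options):
        self.text = text
        self.options = options

def run_survey(answers):
    """Walk the survey, skipping follow-ups that don't apply."""
    asked = []
    screener = Question("Do you drink scotch?", ["Yes", "No"])
    asked.append(screener.text)
    # Only scotch drinkers ever see the "on the rocks" question.
    if answers.get(screener.text) == "Yes":
        follow_up = Question("Do you drink it on the rocks?", ["Yes", "No"])
        asked.append(follow_up.text)
    return asked

# A non-scotch-drinker never sees the follow-up question:
print(run_survey({"Do you drink scotch?": "No"}))
# A scotch drinker does:
print(run_survey({"Do you drink scotch?": "Yes"}))
```

The design point is the branch itself: the follow-up question only exists for respondents whose earlier answer makes it applicable.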

The answer options in your survey should also allow people to opt out of questions that don’t apply to them in order to ensure the integrity of your survey data. This can mean providing an “other” or “does not apply” option in questions that might not apply to everyone taking your survey. 

What to do instead: Don’t assume anything about your survey-taker. You don’t know if they drink alcohol, work, or are a native English speaker. Be sure to create big-picture questions that set context.

3. The Pushy Question

A pushy question forces survey-takers to make a choice and pushes respondents toward particular favored answers. Usually these questions require a respondent to choose from a list with too few categories, preventing them from answering accurately.

For example, if you present survey-takers with “Ice cream is good on a cold day” and offer only the ability to “agree” or “disagree,” you force respondents to answer, whether or not their answer reflects their true feelings.

How to identify pushy questions

Examples of pushy questions:

  • Ice cream is good on a cold day. Agree or disagree?

Try phrasing this question as: “How do you feel about eating ice cream on a cold day?”

  • When choosing between software, what do you look for?

Try phrasing this question as: “What features are important to you when choosing a software product?”

These pushy questions are bad because, to survey respondents, they can feel as though they’re being pushed towards an answer - ice cream is great on a cold day, or whatever software options you have are the best ones. 

To business owners, these questions might not feel pushy - of course you’re excited about your software product, and perhaps you love ice cream on a cold day. But to get accurate data from your survey, you need to phrase questions as neutrally as possible. The goal is not to get people to agree with you - it’s to gather accurate, actionable data from your survey. 

What to do instead: As much as possible, try to put yourself in your respondent’s shoes. Consider whether the question is open-ended, or may push the survey-taker in a certain, favorable direction.

4. The Confusing Question

Without meaning to, you may be creating questions that are confusing to survey-takers. These confusing questions may be poorly worded questions and/or responses, be illogically formatted, or be the wrong question type for the matter at hand.

How to identify confusing questions 


Examples of confusing questions:

  • Do you think our sales team was not unhelpful, or were they helpful?

Try rephrasing this question as: “How helpful did you find our sales team?” 

  • True or false?: Your sales consultant was not equipped for the job.

Try rephrasing this question as: “True or false: Your sales consultant was equipped for the job.”

The first example question is worded in a way that’s difficult to answer accurately. If a respondent says your sales team was “not unhelpful,” are they saying the team was actually helpful, or just not actively unhelpful? That answer is not going to give you good data. 

The second example question is also worded in a confusing way - it’s giving a double negative as an option, which is sure to confuse most respondents. People answering your surveys are busy and skimming your content - so keep that in mind when writing your questions.

To prevent double negatives from slipping into your survey questions, check for any instances of “no” or “not” paired with the un- prefix, negative adverbs, or exceptions like unless or except. Read your survey questions and answer options out loud to be sure you catch any of these sneaky mistakes - they can happen when you’re in a hurry to launch your survey. 
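The check described above can even be roughed out programmatically. This is a crude heuristic sketch, assuming a hand-picked list of negative markers (it will over-flag words like “under” and miss subtler phrasing), not a substitute for reading your questions aloud:

```python
import re

# Rough heuristic for flagging possible double negatives in survey copy.
# The marker list is illustrative and deliberately broad: "un\w+" catches
# un- prefixed words like "unhelpful" but also false positives like "under".
NEGATIVES = re.compile(
    r"\b(no|not|never|neither|nor|un\w+|unless|except)\b", re.IGNORECASE
)

def flag_double_negative(question):
    """Return True if a question contains two or more negative markers."""
    return len(NEGATIVES.findall(question)) >= 2

print(flag_double_negative("Was our sales team not unhelpful?"))        # True
print(flag_double_negative("How helpful did you find our sales team?")) # False
```

A check like this works best as a pre-launch lint pass over your question list, with a human reviewing anything it flags.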

What to do instead: Avoid double negatives at all costs. In general, using negatives can be confusing. Try to use clear, concise statements and questions to get your meaning across.

5. The Random Question

Random questions are out of context and don’t focus on topics that are important or relevant to the survey-taker. Usually, random questions are clearly driven by the survey maker’s own interests.

These questions might slip into your survey when you decide that, while you’re surveying customers on one topic, you might as well slip in a question or two on another topic you’re curious about - right? 

Wrong. You might know the logic behind the random question, but your respondents won’t, and they’ll become confused and possibly even abandon the survey. 

How to identify random questions

When you’ve finished writing your survey, read it carefully to ensure all the questions cover the same topics - save any tangentially related ones for a separate survey. You should also have someone who is not involved in the survey process take a look to be sure all the questions clearly relate to each other. 


Examples of random questions:

  • Do you consider yourself physically fit? 

If you’re surveying people about their scotch drinking habits and slip this question in, they will wonder what physical fitness has to do with how they take their scotch. And rightfully so - those topics are unrelated in any obvious way. 

  • How much would you spend on a water bottle?

When your survey is focused on a customer experience and you decide to pop in a market research question, customers and respondents will question why the topic of the survey shifted. 

What to do instead: Make sure that all questions you ask are relevant to the survey and its goals.

6. The Double-Barreled Question

Double-barreled questions squeeze too much into one question, making it difficult for a survey-taker to answer accurately. Sometimes, these double-barreled questions ask respondents to rate/rank two or more things in one question, or combine two different ideas into one question.

How to identify double-barreled questions 

This is one of the most common survey question mistakes. You should always write your questions in a way that you’re only measuring one thing at a time. 

Examples of double-barreled questions:

  • How long did it take you to complete the process and on what day of the week did you do it?

Try rephrasing this question into two questions: “How long did it take you to complete the process?” and “On what day of the week did you complete the process?” 

  • Agree or disagree: The onboarding was easy to understand and very comprehensive.

Try rephrasing this question into two questions: “Agree or disagree? The onboarding was easy to understand.” and “Agree or disagree? The onboarding was comprehensive.” 

Double-barreled questions are confusing to your survey respondents. If they thought your onboarding process was easy to understand but not very comprehensive, how will they know whether to agree or disagree with your survey question? And how will you know what they really thought about each component?

Well, they won’t know - and neither will you. 

Fixing double-barreled questions is a matter of splitting them into two separate questions. This will ensure your respondents know clearly what you’re asking in each one. And it ensures your survey data is more accurate, because you’ll know exactly what respondents were agreeing or disagreeing with. 

Don’t try to cram too much into one survey question - ask only about the priorities you’re concerned with in the survey. If your survey is too long, remove questions that don’t align to your goals instead of jamming questions together. 

What to do instead: Make sure your questions are asking for one answer to one main idea.

7. The Ambiguous Question

Ambiguous questions are far too broad. The questions and/or responses leave room for interpretation, which leads survey-takers to guess or default to whatever answer makes the most sense.

How to identify ambiguous questions 

Examples of ambiguous questions:

  • Do you think your friends would like our jackets?

Try rephrasing this question as: “Would you recommend our jackets to your friends?” 

  • Are we better than other software companies?

Try rephrasing this question as: “Would you recommend our software products to your friends and colleagues?” 

These ambiguous question examples are confusing for respondents to answer because they lack specificity. For example, your respondents likely have many friends, and they have different styles and preferences - asking if all of them would like your jackets is too vague. 

The word “better” in the second question has the same problem. Software companies and products have many elements - their price points, features, customer support, user interface, and more. Not to mention, there are lots of software companies in a huge variety of industries. It’s not possible to say if your software company is “better” than all others.

Always ensure your survey questions are written as concisely as possible to minimize possible confusion. Respondents should know exactly what you’re asking the first time they read the question. 

What to do instead: Consider adding an “other” field to multiple choice survey questions to capture survey-takers’ true feelings. Yes, the feedback will be qualitative, but it will be more accurate.


When it comes to constructing a survey, there’s a lot to consider. But, if you don’t have questions that are easy to understand and make sense to the survey-taker, you won’t be able to gain information that can impact your business. To improve customer experience, you need accurate survey responses that lead you to take action.

Looking for a customer experience platform that will help you to craft effective surveys and analyze the results with accuracy? Try GetFeedback! And review our extensive resources to learn more about creating the perfect survey for your needs. 

