Response bias is a problem for anyone undertaking market research. When you conduct market research or customer surveys, you want truthful answers from your respondents.
When your respondents bend the truth, they skew the results of the survey, and the insights you get don't provide the information you need. At best, this is a minor inconvenience. At worst, it can jeopardize your business, resulting in lost customers or capital invested in a business plan that is destined to fail.
While response bias is never going to be completely avoidable, there are ways of minimizing the risk. In this article, I'll walk you through what response bias is and give examples of the different kinds of response bias, before showing you how to mitigate the risks.
What is response bias?
Response bias occurs when some factor skews the answers people give you, so their responses don't reflect their true opinions. For example, if you were running a café, you might ask a customer: "Did you enjoy the coffee?"
There is a good chance that the person responding, especially if they are British (I would know—being British myself), will say it was tasty even if it wasn’t. This little white lie, to avoid confrontation, is response bias.
Response bias degrades the quality of survey results. Continuing with the example, if the café owner doesn't know that their coffee tastes bad, the café will get fewer return customers without ever learning what customers really think. In this case, response bias directly impacts business operations.
Common types of response bias
To minimize the risk of response bias, you need to understand why bias occurs. There are seven types of response bias that commonly impact survey results.
We'll quickly review each of these response biases below.
Demand bias
Demand bias is when a respondent changes their opinions and behavior as a result of taking part in a study or survey. Common causes of demand bias include:
- Advance knowledge of the survey: If a person has prior knowledge of the survey, they might research the topic and prepare answers.
- Assumptions about the purpose: Thinking that you understand the purpose of the survey can result in the respondent giving you the answer they think you want to hear.
- Interaction between researcher & respondent: Respondents are likely to give a more critical response to a third-party company than to the business itself.
- Anonymous versus non-anonymous: The answer a person gives could be different for an anonymous survey versus a non-anonymous survey.
Demand bias is impossible to avoid entirely, but you can reduce its impact by accounting for the factors listed above.
Social desirability bias
Social desirability bias is when a respondent answers your survey question with the answer they think is socially or morally correct. Let's take an example where most people are likely to respond the same way: "Should you be punished if you break the law?"
If this were a yes or no question, you would expect the majority of people to choose yes.
There are times when you need to ask a question where social desirability bias will be an issue. For example, if you were trying to develop a way to increase revenue from your site, you might want to find out how many people buy backlinks for their websites.
Instead of directly asking if the person buys backlinks, you could ask: “Do you know people who buy backlinks?”
By framing the question differently, you are more likely to get better results from your research.
Acquiescence bias & dissent bias
Acquiescence bias is where the person answering your survey selects yes, or responds positively to your questions. The opposite of acquiescence bias is dissent bias. This is where a person chooses no or answers all your questions negatively.
Acquiescence bias and dissent bias are likely to be an issue if the majority of your survey questions are multiple-choice. Rather obviously, the best way to avoid this response bias is to use a mixture of multiple-choice and open-ended questions.
Neutral response & extreme response
The Likert Scale is a type of multiple-choice question. You might have five answers to choose from for a question. At one end you could have Strongly agree. At the other, you might have Strongly disagree.
Of course, it doesn't need to be words; you could use a star rating scale instead.
A person consistently choosing the option at either end might have an extreme response bias. Culture, level of education, and the wording of a question have all been shown to influence extreme response bias.
One of the most frequent causes of extreme responses is the use of emotive language in a question. More on the impact emotive language can have on respondent answers later.
The opposite of extreme response is where a respondent selects a neutral answer every time. This often happens when a person isn’t interested in the survey. I’m sure you’ll have seen one or two examples of this if you are a teacher.
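Both patterns, all-neutral and all-extreme answering, show up in survey data as "straight-lining": almost no variation across a respondent's answers. A minimal sketch of how you might screen for it, assuming a 1–5 Likert coding; the function name and spread threshold are my own choices, not a standard.

```python
from statistics import pstdev

# Screen for straight-lining: a respondent whose answers barely vary is
# likely exhibiting neutral-response or extreme-response bias.
# Assumes a 1-5 Likert coding; the 0.5 threshold is illustrative.

def is_straight_liner(answers, min_spread=0.5):
    """Flag respondents whose answers have very low spread."""
    return pstdev(answers) < min_spread

print(is_straight_liner([3, 3, 3, 3, 3]))  # True  (all neutral)
print(is_straight_liner([5, 5, 5, 5, 5]))  # True  (all extreme)
print(is_straight_liner([2, 4, 3, 5, 1]))  # False (varied answers)
```

A low spread isn't proof of bias on its own: a respondent may genuinely hold uniform views, so treat flagged rows as candidates for review rather than automatic deletions.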
Personal bias
We all have biases and they impact the way we interact with the world. Personal bias is something that you will have to consider when choosing how to word your questions and analyzing the responses.
Let me give you a classic example of how personal bias can impact survey response.
In 1980 a Swedish psychologist asked 81 Americans to rate their driving ability among their peers. Of the 81 people who completed the survey, 93% of respondents said they were better than average. That's statistically implausible: by definition, only half of any group can rank in the top half.
Just like the other response biases in this list, it is impossible to eliminate the impact of your respondents' personal biases. However, you need to account for them both in the type of questions you ask and in the people you ask them to.
General tips for avoiding response bias
In the previous sections, we covered what response bias is and the different types of response bias you are likely to encounter. In this section, we will cover how to phrase your survey questions.
One of the most common causes of response bias is a vaguely worded question. I'll provide some examples, both good and bad, from real surveys, covering open-ended questions, yes/no questions, and multiple-choice questions.
How to frame open-ended questions
Open-ended survey questions are the easiest to mess up. With an open question, a respondent can say pretty much anything. Unfortunately, they often do when the question is vaguely worded.
An excellent way to avoid response bias when asking an open-ended question is to start with a prompt. You can then use a couple of linguistic tricks to ensure you’re getting a truthful answer.
You can do this by phrasing your questions objectively. The question below, taken from a fitness brand's survey, is a good example.
The question above avoids leading the respondent by focusing on their own experiences.
Another effective method for posing an open-ended question is a story prompt. In this example, that could look something like, “In my opinion, the greatest benefits of the classes I attend are…”
Don't make the mistake of asking a leading question, for example: "What makes our product/employees/service so awesome?" A question posed in this way is likely to generate poor-quality data riddled with response bias.
How to frame yes/no questions
The most common type of closed question is yes/no. Even though the respondent only has two choices, things can still go wrong.
When things go wrong, there is usually an issue with clarity.
If you ask a yes/no question in an unclear or ambiguous way, the truthfulness of your responses will suffer. Take the question "Have you flown through Dublin Airport recently?" The word "recently" means different things to different people.
Ask this question to ten people who flew through Dublin Airport six months ago, and I guarantee some will answer yes and others no. The word "recently" will degrade the quality of your survey results.
A better strategy is to ditch "recently" and ask about a specific timeframe, for example: "Have you flown through Dublin Airport in the last six months?"
How to frame Likert Scale questions
All of the points I raised in this article apply to multiple-choice questions. However, there is one final point I’d like to cover before wrapping up this guide.
A double-barreled question is when a survey writer combines two separate issues into a single question. This often happens when the writer tries to clarify the question the way you would pose a follow-up in normal conversation.
This example from a DIY brand nicely illustrates my point.
The two questions are perfectly valid when used independently.
However, anyone reading this question will struggle to understand what answer they should provide. As a result, the data generated from this question lacks value.
Wrapping things up
In this guide to getting truthful answers to your survey questions, I've tried to cover everything you need to know about response bias. In the first half, I defined response bias and illustrated the different types you are likely to encounter.
The second half looked at how to pose your survey questions. We discussed the different types of survey questions, and also highlighted the kinds of problems people frequently encounter.
Hopefully, you now have a better idea of how to pose questions and run a successful survey that minimizes the impact of response bias.
Editor’s note: This article reflects the personal opinions of our guest author.
Learn how GetFeedback can help you exceed customers’ expectations—start your free trial today.
About the guest author
Nico Prins is an online marketer and the founder of Launch Space. He’s worked with everyone from Fortune 500 companies to startups helping them develop content marketing strategies that align with their business goals. You can contact him at firstname.lastname@example.org.