If you're collecting information about attitudes, reactions, or opinions in your research, it's essential to construct your survey in a way that will yield accurate information.
A carefully constructed survey produces consistent, replicable answers. But when researchers make survey question errors, whether deliberately to skew results or out of ignorance, the reliability and validity of the research come into question.
So when you're constructing a survey, it's important to make sure that you avoid making common survey question errors like including double-barreled questions.
Learn what double-barreled questions are, how to prevent them, and what other survey question pitfalls you can encounter when constructing your survey.
A double-barreled question asks two questions at once. It reads as a single question but combines two issues with a conjunction, while allowing only one answer, leaving the respondent unsure how to respond.
The respondent may agree with the first part of the question but not the second. As a result, the researcher can't tell which issue the answer applies to, or how to factor the answer into the research.
This type of question can also be called a double-direct or double-ended question. These terms describe a question that includes more than one topic and asks about two different issues while only allowing one answer.
"Compound question" is primarily a legal term, but compound questions are constructed like double-barreled questions and create the same confusion, so the two terms are often used interchangeably.
Leading questions are another common survey question error. Although both leading and double-barreled questions confuse respondents and make research results ambiguous, they are constructed differently.
Leading questions are designed to nudge the respondent into giving a particular answer, often by offering limited choices. A leading question can bias the respondent by implying a correct answer or using suggestive language.
Double-barreled questions can make respondents feel they're being tricked into agreeing with something they don't fully endorse. This adverse impression may lead them to drop out of the survey altogether.
Furthermore, even if they continue participating in the survey, they may agree with part of the question but not the other, which will skew the survey results.
Two examples of double-barreled questions may help you see why this type of question is undesirable for respondents and research results alike. Splitting each into single-issue questions will get you the results you need.
Consider a question like "How satisfied are you with our software and customer support?" Do you see how this might conflict for the respondent? They may think your software is excellent while finding your customer support below standard. Keeping these issues separate will bring more precise results.
Similarly, a question like "How would you rate your remote working conditions and your manager's cooperation?" would confuse the respondent about what information you hope to gather. They could readily rate their remote working conditions but would struggle to see what their manager's cooperation has to do with those conditions.
Sometimes you don't realize you've included double-barreled questions in a survey, so you need a way to catch any that slipped in.
Go back and read every question carefully. It also helps to have someone with fresh eyes look over the questions; if you miss a double-barreled question after a second or third read-through, they may well catch it.
It's best to test your survey on a small number of respondents from your sample population. This will help you evaluate the reliability and validity that your survey answers will bring to your research. You can then collect data on how effective your survey will be by:
Interviewing the respondents
Including evaluation questions at the end of a pilot survey
Having experts evaluate your survey
Analyzing response data
You can then edit your survey according to the feedback from your test respondents, focus groups, and experts, getting the best results for your efforts.
Double-barreled questions are only some of the pitfalls a researcher can experience with survey questions. Many different types of questions can skew the research results because of confusion and bias.
In addition, if questions are unclear or insufficiently specific, the survey can become too complicated and the dropout rate may rise. Knowing these pitfalls and how to avoid them can make your survey efforts successful.
An example of a leading question you can ask in a survey is, "What activity feature would you like added to your local park—a basketball court or a pickleball court?"
Of course, the respondent may want something other than these things at their local park. Instead, they may want walking paths. But because the question contains limited choices, the respondent must choose one instead of answering truthfully about what they want.
Another example of a leading question would be, "You'd like a pickleball court in your local park, wouldn't you?" This question suggests that the respondent should agree that pickleball courts are an excellent addition to their local park.
The correct way to form these questions is, "What activity feature would you like to see added to your local park?" This open-ended question gathers accurate information about what the respondent wants to see in their local park.
Researchers can construct survey questions with poor wording, irrational formatting, or the wrong question type for the information they're trying to extract.
Consider a question formatted like, "Do you not want a pickleball court, or do you want a pickleball court at your local park?" If the respondent answers yes, is that a yes to wanting one or to not wanting one? Forming the question as "Do you want a pickleball court in your local park?" makes a yes or no answer unambiguous.
Another example of a confusing question is "True or false: Do you like to play pickleball, basketball, or walk along the trails when at the park?" You should format this question as a multiple-choice question rather than a true or false question.
Negative and double-negative questions can give an adverse impression of the researcher's intent. They also raise the risk of respondents quitting, because untangling the wording to give an accurate response takes too long.
An example of a negative question is "True or false: I don't think pickleball courts should not be in the local park." A clearer version is "Do you think pickleball courts should be in the local park?" which is straightforward and easy to understand.
Absolute questions ask respondents to answer yes or no to questions containing words like rarely, never, or always.
For example, "Do you always play pickleball at your local park?" can confuse a respondent who doesn't always play pickleball at the park but does play whenever they have willing partners. Removing "always" from the question yields a clear answer about whether they play or not.
Respondents struggle to answer ambiguous and vague questions because there are too many components of the question to consider at once.
For example, "Do you like the activities provided in the local park?" This question should be more specific. Narrowing down the question makes it easier for the respondent to answer, as some individuals may enjoy certain activities while others may not even participate.
Forming the question this way, "If you go to your local park, what activity do you enjoy most?" will result in a specific answer.
Assumptions about your respondent's knowledge or habits can also make answering questions confusing.
For example, you might assume your respondent goes to the park and ask, "When you go to your local park, what activity do you enjoy the most?" However, the respondent may not go to the park at all and may not know what there is to do there.
Asking the question, "If you go to the park, what activity do you enjoy most?" and adding a response option that allows the respondents to answer that they don't go to the park will eliminate an assumptive question.
Some of the question types mentioned above can encourage bias. Whether it happens intentionally or through inexperience, this common survey mistake can push results toward a specific outcome or limit the perspective to only one type of customer.
Either way, a survey shaped by bias, prejudice, or favoritism will render your research invalid and waste your time.
Including questions on the survey that may not give you the results you want wastes your time and the time of the people who took the survey. Instead, take the time to consider questions that will benefit your purposes. Read through the survey once you've completed it and test it for reliability and validity.