You might think that any survey question is a good one because it helps you gain valuable insight and data from your respondents. But it’s important to ask the right questions. When respondents take a survey filled with good questions, they are more likely to have an enjoyable experience and finish the survey. When they come across one or more bad questions, the experience can put them off and ultimately lead them to abandon your survey.
This article explains the difference between good and bad survey questions. Read on to learn how to recognize real examples of bad surveys, see which questions to avoid, and create an engaging, relevant study.
1. Leading Questions
When examples of bad survey questions are the topic, leading questions are usually the first culprit. Leading questions use biased language that steers participants toward a particular answer. The problem is that they can seem innocuous while actually fishing for a specific response. If objectivity is your goal, strive for unbiased questions and unbiased answer options alike. A biased survey won’t give you accurate data to work with when making your next business decision.
Here are some bad survey examples that use leading questions:
- Did you enjoy our delicious new milkshake?
- Are you upset with how long shipping takes?
- How much did you enjoy our excellent customer service today?
Why it’s harmful:
Leading questions distort your results. They don’t truly measure the respondent’s perspective but rather reinforce a narrative embedded in the question.
How to avoid:
Avoid using adjectives to describe the subject of your question. Use neutral language. Instead of making assumptions, ask:
- What did you think of our new milkshake?
- What is your opinion of the delivery time for your recent purchase?
- How would you rate the customer service you received today?
This allows for a full range of responses.
2. Loaded Questions
Loaded questions contain built-in assumptions that may not apply to all respondents. These assumptions can make respondents feel confused, alienated, or even offended. For example, the question “Where do you like to go swimming?” assumes everyone likes to swim and knows how. Respondents it doesn’t apply to have no choice but to abandon the survey or give inaccurate answers, so avoid writing questions like these to keep misinformation out of your data.
The only time a loaded question is appropriate is when a preliminary question comes first, so that logical branching can skip respondents the question doesn’t apply to. For example, ask “Do you like to swim?” and include “I don’t know how to swim” among the answer options. If the person answers “yes,” you can go ahead and ask where they like to swim. Using conditional logic in your surveys shows respect for your audience and produces better data; a sketch of this branching pattern follows below.
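To make the idea concrete, here is a minimal sketch of skip logic using a simple in-memory model. The `Question` class, its fields, and the `next_question` helper are hypothetical illustrations, not any particular survey tool’s API:

```python
# A minimal sketch of survey skip logic using a simple in-memory model.
# The Question class and next_question helper are hypothetical, not any
# particular survey tool's API.
from dataclasses import dataclass, field


@dataclass
class Question:
    text: str
    options: list[str]
    # Maps an answer to a follow-up question; None means "skip ahead."
    branches: dict[str, "Question | None"] = field(default_factory=dict)


follow_up = Question(
    text="Where do you like to swim?",
    options=["Pool", "Lake", "Ocean", "Other"],
)

screener = Question(
    text="Do you like to swim?",
    options=["Yes", "No", "I don't know how to swim"],
    branches={
        "Yes": follow_up,  # only swimmers see the loaded question
        "No": None,
        "I don't know how to swim": None,
    },
)


def next_question(current: Question, answer: str) -> "Question | None":
    """Return the follow-up for the given answer, or None to skip it."""
    return current.branches.get(answer)
```

With this structure, `next_question(screener, "No")` returns `None`, so respondents who don’t swim never see the loaded question.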
3. Double-Barreled Questions
One of the most common survey mistakes is asking a double-barreled question. This type of bad survey question forces respondents to answer multiple questions simultaneously, which isn’t the best way to acquire usable data. Ask one question at a time, with a single clear answer for each. If a question bundles two things together, you won’t know which one respondents were answering or why they answered the way they did. Double-barreled wording is especially damaging in Likert scale questions (which we explain here).
Here are some examples of double-barreled survey questions and how to fix them:
Poorly written survey questions:
- How satisfied or dissatisfied were you with your hotel room and dinner on your trip?
- How satisfied are you with our website navigation and checkout process?
How to avoid:
Split them into two questions:
- How satisfied are you with the website navigation?
- How satisfied are you with the checkout process?
By isolating each element, you gain more precise insights—and a better understanding of what needs improvement.
4. Absolute Questions
An absolute question does not allow the survey participant to provide useful feedback. An example of a poorly worded survey question is “Do you always watch TV?” It pushes respondents into a corner: they might feel like no answer fits their reality, leading to guesswork or frustration. With only a yes or no option, you’re likely to receive “no” from nearly every participant. A better way to ask this question is to remove “always” and offer a selection of answers.
An example of a good survey question using frequency scales:
- “How often do you watch TV?”
- Every day
- A few times a week
- Once a week
- Rarely
- Never
Scales help you understand usage patterns without forcing extreme answers. Avoiding absolute questions will not only make your survey participants happier but also give you better data to work with.
5. Unclear Questions
A good survey question is a simple one that is easy to understand. If a respondent has to think about your question for longer than a few seconds, it’s probably too difficult to understand, and they won’t be able to answer it accurately. Use simple language and avoid clichés, abbreviations, slang, catchphrases, and colloquialisms. You should also avoid any words that could be perceived as offensive.
Examples of unclear survey questions:
- Does your Medicare plan include OTC benefits?
- How do you feel about our new system?
The first question uses an abbreviation that not everyone will understand. As a best practice, spell out the full term and add the abbreviation in parentheses for reference. The second question doesn’t specify which system or what aspect (speed, usability, design) is under review, so respondents may interpret it in different ways. One might comment on design, another on functionality, and another on speed, making it impossible to analyze the data effectively.
How to fix it:
Be specific and targeted:
- How satisfied are you with the speed of our new ticketing system?
- How easy was it to navigate the new dashboard?
Precise questions produce more focused, actionable feedback.
6. Poor Answer Scale Options
Even the most thoughtfully written survey question can fall short if the answer choices are poorly constructed, unbalanced, or unclear. The response scale is just as important as the question itself—because that’s where the respondent expresses their opinion.
Example of a poor question and answer options:
- How satisfied are you with our service?
- Good
- Okay
- Bad
At first glance, this might seem usable. But in reality, this scale is deeply flawed:
- “Good” and “Okay” are both somewhat positive but not clearly different.
- There’s no option for “Very good” or “Very bad,” which compresses all extremes into a single vague choice.
- The wording is informal and lacks emotional or intensity gradation.
This setup forces users to choose between unclear options, leading to ambiguous data that can’t be reliably interpreted or tracked over time.
A well-constructed Likert scale ensures that all perspectives—from extremely positive to extremely negative—are represented equally:
- Very satisfied
- Satisfied
- Neutral
- Dissatisfied
- Very dissatisfied
This scale:
- Covers both ends of the emotional spectrum
- Includes a neutral midpoint
- Uses consistent, professional language
- Allows easy analysis and comparison (see the sketch below)
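To illustrate that last point, here is a minimal sketch of how a balanced scale supports analysis, assuming the labels are coded 1–5; the sample responses are made up:

```python
# A minimal sketch of coding a balanced Likert scale for analysis.
# The responses list is made-up sample data.
SCALE = {
    "Very satisfied": 5,
    "Satisfied": 4,
    "Neutral": 3,
    "Dissatisfied": 2,
    "Very dissatisfied": 1,
}

responses = ["Satisfied", "Very satisfied", "Neutral", "Satisfied", "Dissatisfied"]
scores = [SCALE[r] for r in responses]

# Average score across all respondents.
mean_score = sum(scores) / len(scores)

# Share of respondents answering "Satisfied" or better (top-two-box).
top_two_box = sum(s >= 4 for s in scores) / len(scores)

print(f"Mean satisfaction: {mean_score:.2f}")  # Mean satisfaction: 3.60
print(f"Top-two-box: {top_two_box:.0%}")       # Top-two-box: 60%
```

Because every label maps to a distinct, evenly spaced value, you can track the mean over time and compare results across surveys, something the “Good / Okay / Bad” scale above can’t support.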
7. Jargon and Technical Language
One of the most common—and most overlooked—mistakes in survey design is using language your audience doesn’t understand. Jargon, industry terms, acronyms, and technical expressions may feel natural to your internal team, but they can be confusing.
Bad example (B2B tech survey):
“How would you rate the clarity of the RESTful API endpoint documentation?”
Unless you’re surveying experienced developers, terms like “RESTful,” “API,” and “endpoint” will make little sense to the average respondent. This can result in skipped questions, random answers, or total survey abandonment.
Bad example (e-commerce survey):
“Were the PDP and cart UX consistent with your expectations?”
Here, “PDP” (product detail page) and “UX” (user experience) are terms common in internal teams—but meaningless to many everyday shoppers.
Consequences of using jargon:
- Lower completion rates – Confused users often abandon the survey.
- Inaccurate responses – People may guess or misinterpret the question.
- Eroded trust – Your survey feels like it’s “not for them,” reducing engagement.
- Skewed data – Responses become unreliable and difficult to analyze.
How to write for your audience:
- Know who you’re surveying - A developer audience might expect technical language, but customers, partners, or vendors probably won’t.
- Use plain language - Replace jargon with accessible, everyday words.
- Spell out acronyms - The first time you mention something like “NPS” or “UX,” write: “How likely are you to recommend our service (Net Promoter Score – NPS)?”
8. Random Questions
If your survey is all about a customer’s experience on your website, don’t throw in a random question about their favorite food. This is one of the most common examples of bad survey questions: the question is out of context and irrelevant to the survey’s purpose. If you want to know a respondent’s favorite food so that you can offer a related product, give some background context before including such a question in your survey.
Light personal questions can be helpful at the end of long surveys to boost engagement—but only if clearly marked as optional and relevant. Revisit your survey objective. Ask yourself:
“Does this question help me make a decision or understand my audience better?”
If not—cut it.
9. Double Negative Questions
Double negative questions are one of the most confusing—and surprisingly common—mistakes in survey writing. They occur when a question uses two forms of negation, forcing the respondent to interpret what’s essentially a logic puzzle. Instead of providing thoughtful answers, users spend their time trying to decipher what the question is actually asking.
Bad example:
- Do you disagree that the product isn’t effective?
This question is extremely difficult to process. Does “yes” mean:
- They think the product is effective?
- Or that they disagree with the idea that it’s ineffective?
This type of structure forces respondents to pause, reread, and overthink. Many will either answer incorrectly or abandon the survey in frustration.
The key to fixing double negatives is simplicity and clarity:
- Do you think the product is effective?
- How satisfied are you with the product’s performance?
- What improvements would make the product more effective?
10. Net Promoter Score (NPS)
This is one of the most common mistakes in survey design. NPS is a metric that asks customers how likely they are to recommend a product or service to others, typically on a scale of 0-10.
One of the most popular customer survey questions is “How likely are you to recommend our company?” It’s commonly used to gauge customer satisfaction.
This can be useful if your business has more than one product or service and you want to know which ones perform better than others. But if you only have one product or service, it doesn’t make sense to ask this question. If someone uses your product or service and likes it, they’ll give you a high score anyway. You don’t need to ask for their opinion about whether they’d recommend it—just look at your conversion rates.
However, some professionals suggest that it’s best not to ask questions requiring respondents to make predictions; instead, focus on what happened in the past (or what’s happening now). So, for example, instead of asking about the likelihood of recommending your company, ask about the likelihood of recommending specific products or services within your company.
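For background, the standard NPS calculation groups scores of 9-10 as promoters, 7-8 as passives, and 0-6 as detractors, then subtracts the percentage of detractors from the percentage of promoters. Here is a minimal sketch; the sample scores are made up:

```python
# A minimal sketch of the standard NPS calculation.
# The sample scores below are made-up data.
def nps(scores: list[int]) -> float:
    """Percentage of promoters (9-10) minus percentage of detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)


print(nps([10, 9, 8, 7, 6, 10, 3, 9, 8, 5]))  # 4 promoters - 3 detractors -> 10.0
```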
Why Bad Survey Questions Hurt Your Data
Poorly worded questions don’t just annoy people—they actively destroy data quality. When respondents are confused, led, or alienated, your insights become unreliable. Even worse, you could take action based on false conclusions.
Consider the impact of a survey where:
- Leading questions inflate customer satisfaction
- Unclear phrasing creates mismatched interpretations
- Poor answer options hide valuable feedback
Data like this can push you in the wrong direction, wasting time and resources.
Good data starts with good questions. That’s why investing in clear, thoughtful survey design is critical.
The Cost of Poorly Worded Surveys
Whether you’re paying for a tool, hiring a research firm, or spending internal resources, poorly designed surveys cost you money—and can damage your reputation.
Common costs include:
- Low response rates
- Incorrect targeting
- Wasted marketing or product investments
- Loss of customer trust
Respondents who encounter a confusing survey are unlikely to finish it. Worse, they may associate the frustration with your brand. That’s a missed opportunity to build relationships and collect insights.
A well-designed survey builds trust. It shows that you value your customers’ time and opinions.
Avoid Bad Survey Questions Using SurveyPlanet
Now that you’ve seen examples of good and bad survey questions, we hope you can confidently tell the difference in your own questionnaires.
Explore our Dos & Don’ts infographic below to see more examples and gain even more insight into this topic. SurveyPlanet makes it easy to create your first survey. Upgrade to SurveyPlanet Pro for added features, including question branching, export tools, and image uploads. Check out our plans & pricing, explore our beautiful themes, browse survey examples, and find other options when you log in or create an account.
Photo by John Jennings on Unsplash