Your Ultimate Guide To Writing Amazing Survey & Poll Questions
Whether it is your first time creating a survey on your own, or you are a seasoned research professional, one thing is for sure: if you can’t write good survey questions, you won’t get helpful consumer insights.
But just what is a “good” survey question anyway? In this guide, we break down the four key elements that you need to master in order to write better survey questions.
A “good” survey question is:
- One that achieves your survey and data goals.
- One that is clear and direct.
- One that carefully considers user experience.
- One that is free from bias.
Throughout this guide, we will show you the tactics you can use to make sure you are delivering quality survey questions. To help you along, we will also show you what not to do, so you can easily spot the difference between a good question and one that needs some work.
For example, let’s say we want to avoid bias. Which of these is the better choice?
This is an example of a direct, straightforward question that has eliminated bias. Nice work!
This is an example of Prestige Bias. By letting the respondent know that an authoritative source says smoking causes cancer, you are biasing the respondent to agree with the source.
Try out these question quizzes at each section to test your knowledge as you go. Let’s get started.
Set (and Achieve) Your Survey Goals
Everyone wants to write good survey questions: clear, direct, engaging questions that get respondents excited about answering.
But these are not the most important reasons to take care when writing survey questions.
The most important reason is this: if you build a survey with the wrong questions or question types, you could get stuck with a bunch of data you can’t use.
That’s why it is important to set data goals in advance and tailor your survey to achieve those goals from the very beginning.
How To Set Survey Data Goals
On the surface, this may seem simple: just state what you want to learn from your respondents and go from there.
But different surveys deliver different types of data. So you need to know how you plan to use the data you get from your questions first.
For example, let’s say your goal with your survey data is to create a map so you can continuously survey consumers every month and map their responses over time.
For this to work, you will need to combine demographic data with raw numbers of respondents and their answers.
You also want to ensure that your questions are simple and not prone to shifts in understanding based on geographic location or time passing.
Knowing these data goals, which of these two questions would be better for your survey?
By adding a specific type of summer location, you assume that your respondents go to the beach, or have been recently enough to have a favorite activity there. This will bias your survey towards people who live closer to a beach. Also, by allowing an open-ended answer choice at the end, you make that answer choice unusable for your map.
By keeping responses general, you avoid bias towards one geographic area. And by having a fully quantitative survey but also including a “Something Else” answer choice, you don’t force your respondents to give an answer that doesn’t apply to them, while still ensuring you only get data you can use for your map.
In order to discover the question types that may work best for your survey goals, check out this sample results page. Do an export of the data so you get a sense of what you will have to work with once your survey is completed.
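To make the data goal concrete, here is a minimal sketch (in Python) of how an exported results file might be tallied for the map-over-time goal described above. The column names (month, region, answer) are assumptions for illustration only; check your actual export for the real field names.

```python
# Minimal sketch: tally answer choices per region per month from an
# exported CSV, the kind of summary a "map responses over time" goal needs.
# Column names ("month", "region", "answer") are hypothetical.
import csv
from collections import Counter, defaultdict

def summarize_by_region(path):
    counts = defaultdict(Counter)  # (month, region) -> tally of answer choices
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[(row["month"], row["region"])][row["answer"]] += 1
    return counts

# Usage (assuming a file named results_export.csv):
# for (month, region), tally in summarize_by_region("results_export.csv").items():
#     total = sum(tally.values())
#     print(month, region, {a: f"{n / total:.0%}" for a, n in tally.items()})
```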
Next, you’ll want to check with your stakeholders to ensure that you are aligned on the questions you want answered, how you want to use and ingest the survey data, and what you hope to deliver at the end.
How To Ensure You Meet Your Goals
With Pollfish, you have 11 different question types to choose from. All of these question types deliver data slightly differently.
It’s also critical to deliver these questions in a structural order that makes sense to the respondent and reduces bias from their answers (more on that later).
Depending on your survey goals, you’ll want to choose these questions wisely to ensure you have a data set you can use.
The most important difference between survey question types is the difference between Quantitative and Qualitative research.
From Qualitative vs Quantitative survey questions
Quantitative research is about collecting information that can be expressed numerically. Researchers often use it to correlate data about specific demographics, such as Gen Z being more likely to focus on their finances than Millennials. Quantitative research is usually conducted through surveys or web analytics, often including large volumes of people to ensure trends are statistically representative.
Even when the survey audience is very large, quantitative research can be targeted towards a specific audience, usually determined by demographic information such as age, gender, and geographic location.
Qualitative research focuses on personalized behavior, such as a respondent’s habits or the motivations behind their decisions. This can be gathered through contextual inquiries or interviews to learn more about feelings, attitudes, and habits that are harder to quantify but offer important additional context to support statistical data.
When quantitative and qualitative research are paired, a complete set of data can be gathered about the target audience’s demographics, experience, attitudes, behaviors, wants and needs.
Based on this description, can you tell which is an example of a qualitative question type?
Qualitative surveys don’t have set answer choices. Instead, they allow respondents to speak in their own words about experiences with your product or service.
Qualitative survey questions allow respondents to speak in their own words about your product or service without selecting from pre-set answer choices. You can use these questions to gain a deeper understanding of your consumers and their feelings towards your product, beyond what you’d normally ask them. Feel free to ask broader questions and let respondents open up.
But differences between quantitative and qualitative are not the only differences between question types. You can still gather a broader range of emotions from respondents without going to a fully qualitative survey.
Question types like matrix questions, where respondents can rank the importance of different product or service features, can provide more quantifiable data while still capturing a broader range of sentiment than a simple one-by-one ranking.
Lastly, some researchers choose to combine qualitative and quantitative research. By adding an open-ended “Other” field to single-selection questions, for example, researchers collect quantifiable data while still allowing the option for further context.
Based on your research goals, this added field can either provide needed context for deeper understanding of your audience, or create unnecessary noise that can’t be quantified and throw off the data set.
It all depends on you. So think it all the way through and have a plan in place before you begin creating your questionnaire.
Question Writing 101: Clear, Direct, Well-Tested
When writing anything, clear, direct communication should be the ultimate goal. You want to ensure that, regardless of any flowery language or interactive content features, the core of what you are trying to say is communicated loud and clear.
With survey questions, this becomes even more important.
Once you have selected your survey goals and you have a good idea of what kind of data you want to collect, the goal of your survey questionnaire is to get respondents to completely fill out your survey as accurately as possible.
Market research best practices show that it is best to let respondents decide if they want to finish your survey or not. With Pollfish, respondents can escape the survey at any time. Therefore, all responses are voluntary and of higher quality than if respondents were forced to finish the survey.
What that means for you as the researcher is you have to ensure that your questions don’t drive respondents away.
Don’t worry, Pollfish is here to help you with that. First of all, each questionnaire is reviewed by our research experts prior to launch. But you can help by following these three steps to survey question writing: be clear, be direct, and test on real people.
Be Clear
In order to get the types of answers you expect, you must ensure that your target audience will fully understand what you want from them.
What does that mean? Below are a few common mistakes researchers make when writing their first questionnaire.
Too Much Jargon
When writing for a niche audience, industry jargon can be a welcome addition. Things like common abbreviations and loaded terms help build rapport and allow you to establish a personal connection with your readers.
Not so in market research.
Imagine you are reading an article and you come to some jargon you don’t recognize. All it takes is a quick Google search and you have learned some new terminology.
But if you come to a survey question that contains an acronym or abbreviation you don’t recognize, not only will you feel like you can’t answer that question, you will feel like maybe you aren’t qualified to answer the rest of the survey. You may attempt to guess at the meaning, or escape the survey altogether.
Even if your audience is screened to only be in your industry, don’t assume everyone uses the same terminology you do. Take care when using any kind of jargon, acronym or abbreviation in your survey questions.
Extra Explanation
Most survey question types are fairly self-explanatory. If there are 5 answer choices and the instruction is “Select One”, there isn’t much room for interpretation there.
But when questions get more complicated, extra explanation is essential.
For complicated concepts that respondents may be only partly familiar with, a “description” question type can help clarify which part of the idea a respondent should focus on.
Matrix questions are one question type where you have to be careful of your labeling and explanation. Let’s take a look at two Matrix questions and see which is the most clear.
How are your respondents supposed to know if 1 is good or bad? When you use matrix questions, ratings scales and any other rating or ranking question, be sure to label questions clearly.
As you can see, everything here is labeled so it is very clear for each amenity what you are choosing when you select each answer. When you use matrix questions, labels are very important. From the row titles to the answer choices, make sure it is very clear what respondents are selecting.
Questions that contain media like images, audio files and videos also often require further explanation. Let’s take a look at the video questions below and see which is the most clear.
With questions containing required media, it is always a good idea to give explicit instructions as to what to do with that media. Once you have done that, you want to be very specific with what you ask respondents to deliver. If your ask is too vague, you are likely to get a lot of short, unthoughtful responses.
When you use Pollfish for video questions, respondents are required to watch the entire video before answering. Still, you want to be clear about what you want. If you want them to watch the entire video, say so. Also, be specific with what you want them to pay attention to. Otherwise, you are likely to get a lot of vague, unthoughtful responses.
If it is necessary to consume all media to understand the question being asked, make sure your instructions clearly state this. You can even include your media as a separate question.
Create non-leading questions
It can be tempting to write questions in a way that seems obvious to you, but survey questions should not follow the same patterns we have when speaking to each other.
Although plain language is advised, inserting your own opinions into the survey question, even subtly, can make respondents more inclined to agree with the answer you want.
The second question remains neutral towards either show. Since the goal of your research is to uncover the most accurate data, you want to avoid pushing respondents towards one answer over another.
Make Answer Choices Clear & Distinct
Once you have made sure your questions and instructions are clear and direct, make sure your answer choices are distinct from one another.
One common reason respondents bounce from a survey is they cannot decide between two similar-seeming answer choices.
This can also reduce data quality because it may result in respondents offering answers they don’t really mean.
Although “yes” and “no” can seem like two distinct answer choices, they offer no gradient for respondents who don’t feel strongly or are unsure of their answer.
We recommend avoiding “yes/no” questions in favor of a Likert scale, or writing answer choices out in full, to reduce ambiguity and bias.
Which of these questions seems correct?
This question not only assumes the respondent drinks beer, but drinks beer enough to know the difference between these varieties. As a researcher, you don’t even know at this point if your respondent drinks at all. Answer choices need to be clearly distinct to all potential respondents, unless you have narrowly filtered your audience to ensure their knowledge.
While this question is not as specific as its counterpart, it allows all possible respondents a chance to answer, while even making concessions for those who don’t drink at all. Once respondents answer here, you can get more specific (while ensuring respondents are able to participate) by using skip logic. Have the respondent answer and, for those who select beer, send them on to a question about which variety they prefer. That way, you can be more assured they will know the difference.
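If it helps to picture how that routing works, here is a minimal sketch of the skip-logic idea in Python. The question IDs, wording, and routing function are hypothetical, not a Pollfish API; in practice you configure skip logic in the questionnaire builder.

```python
# Minimal sketch of skip logic: a follow-up question is only shown to
# respondents whose earlier answer fits its premise. Everything here
# (IDs, wording, routing rule) is hypothetical.

QUESTIONS = {
    "drink_type": {
        "text": "Which of these drinks do you prefer?",
        "choices": ["Beer", "Wine", "Spirits", "I don't drink"],
    },
    "beer_variety": {
        "text": "Which variety of beer do you prefer?",
        "choices": ["Lager", "IPA", "Stout", "Pilsner"],
    },
}

def next_question(current_id, answer):
    """Route only beer drinkers to the beer-variety follow-up."""
    if current_id == "drink_type" and answer == "Beer":
        return "beer_variety"
    return None  # everyone else skips the follow-up

follow_up = next_question("drink_type", "Beer")
if follow_up:
    print(QUESTIONS[follow_up]["text"])               # shown to beer drinkers
print(next_question("drink_type", "I don't drink"))   # None: question skipped
```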
Be Direct
Good survey questions contain two parts: the question and the instructions.
For many Pollfish survey question types, some level of instruction is built in. But you want to be sure that when you deliver instructions, you are speaking directly, in clear, declarative statements.
Using too much passive language and indirect phrasing can create unnecessary confusion. Let’s take a look at the two examples below. Select which you think is the most direct.
While these two questions may seem to say the same thing, using softer, more imprecise language opens up the potential for confusion. Using very direct, plain-spoken language removes any such ambiguity. Or to put it more simply, it makes questions (nearly) idiot-proof.
While these two questions may seem identical, the indirect language here leaves room for confusion. For example, who am I choosing for? Without a direct subject, I am left to wonder why I might be inclined to choose one flavor over another. Maybe when I buy ice cream, I only buy it for my husband because I am on a diet. If you want your respondent’s personal preference, ask for it, as clearly and directly as possible.
Always keep your Survey Goals in mind
As we learned in part 1, survey goals are at the core of all research projects. To make sure you’re writing the right survey questions to meet the right goals, consider the flow and structure of the survey and how the respondent will be presented with each question. Your goals may also dictate the type of question respondents should be given.
Example: Survey goal is brand awareness
It’s best to begin a brand awareness survey with an open-ended question for respondents to recall brands that are top-of-mind, prior to being presented with a series of questions that might trigger their recall.
If the survey goal is press coverage for a specific cereal brand, ranking questions can help uncover consumer preferences.
Ranking questions ask respondents to order answer choices, forcing them to choose one over another. These are best for establishing stronger opinions or validating a strong preference.
The data from a ranking question could be “85% of consumers believe that Trix is the best children’s cereal”, providing a good headline for a research story.
Matrix questions allow respondents to apply a ranking scale towards each cereal, but don’t force them to choose one over another or compare them. These are a better question choice to measure other sentiments.
The possible data to come from a matrix question could be that nearly three-quarters of ’90s kids recall Frosted Flakes’ “Tony the Tiger,” while the Cheerios bee has largely been forgotten.
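To see why the two question types support different kinds of headlines, here is a minimal sketch of how each might be tallied. The response data below is invented purely for illustration.

```python
# Minimal sketch: ranking data vs. matrix data. All responses are made up.

# Ranking: each respondent orders the brands, so you can report the share
# who put one brand first ("X% say Trix is the best children's cereal").
ranking_responses = [
    ["Trix", "Cheerios", "Frosted Flakes"],
    ["Trix", "Frosted Flakes", "Cheerios"],
    ["Cheerios", "Trix", "Frosted Flakes"],
    ["Trix", "Cheerios", "Frosted Flakes"],
]
first_place = sum(r[0] == "Trix" for r in ranking_responses)
print(f"{first_place / len(ranking_responses):.0%} ranked Trix first")

# Matrix: each brand gets its own rating, so you report per-brand recall
# rather than a head-to-head winner.
matrix_responses = [
    {"Frosted Flakes": 5, "Cheerios": 2},
    {"Frosted Flakes": 4, "Cheerios": 1},
    {"Frosted Flakes": 5, "Cheerios": 3},
]
for brand in ("Frosted Flakes", "Cheerios"):
    avg = sum(r[brand] for r in matrix_responses) / len(matrix_responses)
    print(f"{brand}: average recall rating {avg:.1f} / 5")
```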
Test On Real People
While it is important to align with all stakeholders internally on survey goals, you also want to test out your final questionnaire on some real people.
This can mean either sharing it with co-workers who are not familiar with the research project, or emailing your questions to a friend.
Getting outside help can be essential in identifying unclear directions that cause confusion for your respondents.
Respondent Experience: Think Like An App User
Because nearly everyone has an internet-enabled device with them at all times, survey companies have identified the opportunity to use this technology to reach respondents.
Unfortunately, not all surveys are created equal when it comes to offering a good respondent experience on mobile devices.
Poor respondent experience can have disastrous consequences, impacting your survey response and completion rates, and even impacting the data quality of your study.
So how can you follow best practices for mobile surveys?
Build on a Mobile-First platform
By using Pollfish, you are already off to a great start. That is because every Pollfish survey question type is optimized for mobile out of the box.
Our surveys are not only optimized to fit your mobile screen, they are delivered exclusively inside mobile apps, meaning each question type is designed by app developers to mirror the experience of an app.
Pollfish also provides a mobile previewer for each question in your survey, so you can test how your question will look on a phone in real time, right inside the questionnaire builder.
Shorten your questionnaire
Because so many users prefer to take surveys on mobile devices, it is important to think like a mobile user.
Think about your app usage–do you stay in one place on your phone for a long time, or do you jump quickly from place to place? Do you often have your phone open for hours or do you use your phone in quick bursts?
Mobile phone users want to move fast. They could be on the go. They could only have a few minutes to kill while waiting in line at the bank.
That’s why it is important, when you can, to keep surveys short.
Shorter surveys get faster results and ensure full engagement with your questions from start to finish.
Our survey experts review every survey before it is published as well, so they may recommend you shorten parts of your study if they feel respondent experience may be impacted.
Keep a mobile mindset
While all our survey questions will look great on mobile, there are a few things you can do to make your questionnaire more engaging to the mobile user.
Start by favoring question types that are easier to answer on mobile phones. Multiple-choice questions that users can tap, rather than open-ended questions that require typing, make answering easier. Question types like rating stars and video questions can keep users engaged and motivated throughout your survey.
Keep in mind that while many survey tools that offer a “mobile survey” experience also offer matrix questions, you’ll want to verify that these have been designed for mobile distribution.
Matrix questions in a typical survey present as a table where respondents are asked to rank a series of answers. On mobile, this doesn’t translate into a good experience for respondents.
Ensure that your mobile matrix question utilizes vertical scrolling and doesn’t require respondents to zoom in or drag the table horizontally to see all the answer possibilities.
While more qualitative questions can help capture additional context and emotion from survey respondents, mobile respondents would rather tap or swipe than type.
If you were a mobile user, which of these would you prefer to answer?
Prevent Survey Bias
One of the most important parts of building the perfect survey is reducing survey bias.
From Your Ultimate Guide To Survey Bias
Put simply, survey bias impacts data quality. By eradicating survey bias (as best you can), you build trust in the data you’ve collected.
That’s no small thing, as companies across the world are using consumer insights data to inform everything from content creation to product timelines to strategic planning.
So how do you remove bias from your questionnaire?
Well, for starters, it is important to remember that preventing all survey bias is impossible.
That’s right. We are all biased. We have unconscious thoughts that cause us to subtly indicate our true feelings to those around us. And that applies to writing survey questions as well.
The key is to do everything you can to remove the most pernicious forms of bias from your surveys so you can ensure top-quality data.
In this section, we’ll take you through a few of the more common ways to avoid bias in your questionnaire.
Remove Double-barreled questions
Double-barreled questions happen when researchers blend two questions into one, and then allow for only one answer. This forces a false choice on the respondent and often results in incomplete or biased answers.
Can you tell which question is biased here?
While this question may not appear to cover as much ground as its counterpart, the other question here doesn’t allow the respondent to choose different answers for the Pool and Bar. It is always better to split double-barreled questions into two separate questions to ensure specificity.
This question doesn’t allow respondents the ability to select different ratings for the pool and bar. Even if you consider the pool and bar to be the same, a respondent may not. Double-barreled questions provide incomplete answers. You’ll risk losing critical consumer insights. Make sure you split them in two.
Avoid Assumptions
Depending on how you write a question, you could be assuming things of the respondent, or even forcing them to give an answer they don’t believe.
One way researchers fall into this trap is by asking loaded questions. Loaded questions make multiple assumptions about a person and then force them to answer a question based on those assumptions.
Can you spot the Loaded question?
This question presupposes the respondent likes attending baseball games, because they have to fit the premise of the question in order to answer. Before you can ask this question, you must ensure your respondents like attending baseball games. You can do this by asking a screening question first and then adding skip logic so that only those who have previously attended baseball games will be shown this question.
Before a respondent can answer a question about snack preferences at a ballpark, you have to first establish that your respondents attend or have attended baseball games. You can ask the above question and apply skip logic so only those who fit the premise of your intended question are shown it.
Don’t Use Biased Language
Similar to Loaded questions, biased language (sometimes called Leading questions) pushes respondents towards one response or the other.
This removes the impartial nature of a survey and influences respondents, leading to poor data quality.
Can you spot the Leading question?
The words “new” and “old” bias the respondent towards saying the newer design is easier to use. You need to ask questions from a neutral position and do your best to avoid biased language that leads respondents towards one answer or another.
By asking, simply and directly, which design is easier to use, without letting the respondent know which is newer or older, you ensure that you remove bias. By adding responses like Not Sure or No Difference, you open up the question to additional answer possibilities for a broader range of respondents.
Do Your Best To Include All Available Choices
As a general rule, you want to avoid forcing respondents to give an answer they don’t really mean.
So when you are writing a closed-ended survey question with a finite answer pool, make sure you think hard about additional answer possibilities that may not come immediately to mind.
Once you have done that, there are two additional ways you can reduce bias on these kinds of questions.
The first is to add an open-ended “Other” field, where respondents can include any additional choice you may have forgotten. If there is enough consistency in responses, you may even be able to manually quantify the data. If not, you have still provided some much-needed context to the current answer choices.
If you want to stick with your closed-ended question type, you can make it clear in your instructions that respondents are to select the answer closest to what they think. This is imperfect and provides a tough-to-quantify variable to your data. But it also will keep respondent experience and completion rates up.
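If you take the first option and the “Other” write-ins show some consistency, a quick normalization pass can turn them into countable categories. Below is a minimal sketch in Python; the responses and the alias map are made up for illustration.

```python
# Minimal sketch: manually quantifying open-ended "Other" responses by
# normalizing the text and tallying recurring answers. All data is invented.
from collections import Counter

other_responses = [
    "Hiking", "hiking!", "going on hikes", "Kayaking", "kayak", "Reading",
]

# Map common variants to a single category label.
ALIASES = {"going on hikes": "hiking", "kayak": "kayaking"}

def normalize(text):
    cleaned = text.strip().lower().rstrip("!.")
    return ALIASES.get(cleaned, cleaned)

tally = Counter(normalize(r) for r in other_responses)
print(tally.most_common())  # e.g. [('hiking', 3), ('kayaking', 2), ('reading', 1)]
```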
Do you want to distribute your survey? Pollfish offers you access to millions of targeted consumers to get survey responses from $0.95 per complete. Launch your survey today.