Contents

Section 1: Sampling Bias

Section 2: Questionnaire Bias

Section 3: Survey Fraud

Section 4: Pollfish Can Help!


Survey Bias: Your Ultimate Guide to Reducing Bias and Increasing Data Quality

Survey bias is a broad term. It encompasses all parts of a survey that can have bias. And, in case you didn’t know, there are a lot of trouble spots in the average survey. 

But not to worry. Once you know what to look for, and have the right tools to root it out, survey bias will quickly become a thing of the past for your market research team. 

But first things first: why should you care? 

Put simply, survey bias impacts data quality. By eradicating survey bias (as best you can), you build trust in the data you’ve collected. 

That’s no small thing, as companies across the world are using consumer insights data to inform everything from content creation to product timelines to strategic planning.

But where to begin?

Disclaimer: Preventing All Bias Is Impossible

Before we go any further, we should clear something up: no survey is 100% free of bias. Things like the incentives or influences of the questionnaire writer, the wording of individual questions, or various forms of sampling bias can negatively affect results when a questionnaire is distributed to hundreds of users.

Not all bias is preventable, but there are a few key areas and best practices you can focus on to reduce bias in your survey.

Once you have done everything you can to ensure your survey is taken from a representative sample, has a variety of unbiased questions, and is delivered to real consumers, you will be ready to get great consumer insights.

Let’s start with the basics.

The Different Types of Survey Bias

Survey bias falls into three categories: Sampling Bias, Questionnaire Bias, and Survey Fraud. There are other kinds of bias out there (some of which we will get into at the end), but most forms you need to guard against fall into these three categories.

We’ll start with sampling bias.

What is Sampling Bias (and How to Avoid It)?

Sampling bias, quite simply, refers to bias that occurs in the selection of your survey respondents, also known as your sample.

There are many different types of sampling bias to watch out for. 

Representation Bias

If you select your sample of a town based on who happens to be at the mall, you could be excluding key members of the local population. For example, you may be over-representing healthy people of a certain age who get out of the house, people with the disposable income to shop, etc.

How to avoid it

Digital sampling methods help with this. By collecting responses from users on their phones, through social media or by email, you avoid the types of sampling bias that an in-person surveyor would contend with. If you want to take it one step further, Pollfish Survey Stratification can help (more on that later). 


Post-stratification allows results to be weighted against census data in a single click, reducing sampling bias.

Exclusion Bias

Removing trial participants can create sampling bias. If, for example, your survey requires follow-ups but excludes original participants who have since moved out of the study area, you could skew the results.

How to avoid it

Keep initial and follow-up samples the same. 

Pre-Screening Bias

This occurs when you advertise for volunteers within particular groups. 

How to avoid it

One way to avoid this is to reduce targeting in your survey collection method of choice, and instead use screening questions in your survey to filter out people who don't have the qualifications you are looking for.

Survivorship Bias

An over-emphasis on “surviving” respondents can create sampling bias. For example, an economic study of the business climate that only includes businesses still operating, while leaving out the ones that failed, will skew the results toward success.

How to avoid it

When doing customer satisfaction surveys, for example, it is important to get feedback not just from your current customers (who are likely satisfied) and potential customers (who don’t know how they feel about you), but also from former customers, whom you likely failed to satisfy. This way, you can truly know how satisfying your product or service actually is.


 More On This…

How to weight your survey results in Pollfish

 


Other Ways to Avoid Sampling Bias

Stratified sampling

Stratified sampling is the process of dividing a population into subgroups and selecting a balanced sample audience from those groups to decrease sampling error. These populations are usually large and diverse. Because a random sample from a given audience can be unbalanced, stratified samples apply weights to audience characteristics against a reference population (such as census data) to ensure that the responses are representative of the population as a whole (see the sketch after the examples below).

Examples

  1. If you wanted to survey the population of Dallas, TX, you could weight your survey sample audience demographics against census data for Dallas to create a more accurate reflection of the demographic makeup of that city.
  2. If you were conducting a political survey and needed representation of various minority groups (such as race or religion) that may not appear in a random sample, stratified sampling could ensure that some members of those groups are included, and their responses adjusted to present a proportional segment of the population as a whole.
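
To make the weighting step concrete, here is a minimal post-stratification sketch in Python (using pandas). All of the data, column names, and census shares are hypothetical, and a real study would typically stratify on several variables at once; the point is simply that each respondent's weight is their group's population share divided by that group's share of the sample.

```python
import pandas as pd

# Hypothetical respondent data for a single stratification variable (age group).
responses = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4, 5, 6],
    "age_group": ["18-34", "18-34", "18-34", "35-54", "35-54", "55+"],
    "answer": [4, 5, 3, 2, 4, 5],
})

# Target proportions, e.g. taken from census data for the area surveyed.
census_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Post-stratification weight = population share / sample share, per group.
sample_share = responses["age_group"].value_counts(normalize=True)
responses["weight"] = responses["age_group"].map(
    lambda g: census_share[g] / sample_share[g]
)

# A weighted mean now reflects the census makeup rather than the raw sample.
weighted_mean = (responses["answer"] * responses["weight"]).sum() / responses["weight"].sum()
print(responses[["respondent_id", "age_group", "weight"]])
print("Weighted mean answer:", round(weighted_mean, 2))
```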

Quotas

Quotas are another way to weight an audience and reduce sampling bias, but instead of applying weights to audience characteristics to estimate proportionality to the population, quotas fix the proportions of an audience so that responses are only collected from people who fit the exact criteria. While they typically take longer to fill, quotas offer more precise data. They’re best used for a smaller sample population where the target demographics are known (see the sketch after the examples below).

Examples

    1. In a survey for a magazine where readership was known to be 80% female, quotas to survey an 80% female audience would be used.
    2. If a product is geared towards millennials, age quotas might be used to ensure that the survey is only completed by a randomized sample of those between the ages of 25-40.
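
As a rough illustration of how quotas differ from weighting, here is a small Python sketch that fills fixed quota cells and screens out respondents once a cell is full. The quota sizes and the single gender variable are hypothetical.

```python
from collections import Counter

# Hypothetical quotas for a 200-person sample of a magazine readership
# known to be 80% female (the numbers are illustrative).
quotas = {"female": 160, "male": 40}
collected = Counter()

def accept_respondent(gender: str) -> bool:
    """Accept a respondent only if their quota cell still has room."""
    if collected[gender] < quotas.get(gender, 0):
        collected[gender] += 1
        return True
    return False  # quota full: screen this respondent out

# As responses arrive, groups that hit their quota stop being accepted.
for g in ["female", "male", "female", "male", "female"]:
    print(g, "accepted" if accept_respondent(g) else "screened out")
```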

What is Questionnaire Bias (and How to Avoid It)?

When building your survey questionnaire, the way you ask a question matters just as much as, if not more than, what you are asking. 

Questionnaire Bias (also known as Response Bias) can come in many forms. Where most researchers get into trouble is by biasing respondents unintentionally, through how they ask a question, which question type they use, or how they administer the questionnaire.

 

Self-Selection Bias

Also known as Non-Response Bias, Self-Selection Bias can occur when respondents are allowed, through whatever means, to opt out of a survey due to its content. For example, not everyone who eats at a restaurant is forced to leave a review. Therefore, most restaurant reviews are biased toward people who had a particularly good or unpleasant experience. You also see self-selection bias when there is a knowledge gap on a topic. If the topic or subject of the survey is known and people can opt out, the survey will attract more participants with advanced knowledge of or interest in the topic.

How to avoid it

You can avoid this type of bias in two ways: either remove the ability for participants to opt out once they’ve seen the brand or topic of the survey, or keep the brand or topic hidden throughout the survey.

Acquiescence Bias 

Also referred to as “yea-saying,” Acquiescence bias occurs when there are too many questions asking participants to agree or disagree. Respondents tend to overly agree, even when this causes contradictory responses. 

How to avoid it

Using a variety of question types will remove this type of bias. Don’t rely too heavily on any one question style. And if you can, review responses and throw out any with deeply contradictory “yea-saying.”

Order Bias

The order in which you list selections in a single-selection or multiple-selection question biases respondents towards choices higher on the list. Similarly, responses change based on the order survey questions appear. 

How to avoid it

If your survey platform can shuffle questions and/or responses, take advantage of this feature. You will instantly remove this type of bias from your survey.


Shuffling answer order reduces bias
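
If your platform does not shuffle for you, the logic is simple to approximate. The sketch below (in Python) randomizes answer order per respondent; the option labels and the choice to keep a “None of the above” option in last place are assumptions for illustration, not rules from this guide.

```python
import random

# Hypothetical answer options for a single-selection question.
options = ["Brand A", "Brand B", "Brand C", "Brand D", "None of the above"]

def shuffled_options(options, keep_last_fixed=True):
    """Return a per-respondent random ordering.

    Keeping options like "None of the above" anchored at the end is an
    assumption made here for readability, not a requirement.
    """
    if keep_last_fixed:
        head, tail = options[:-1], options[-1:]
        return random.sample(head, len(head)) + tail
    return random.sample(options, len(options))

# Each respondent sees a different order, which averages out order bias.
print(shuffled_options(options))
print(shuffled_options(options))
```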

Extreme Responding

Some respondents just like extremes. So, if given the choice, they will always select the most extreme option. 

How to avoid it

Like acquiescence bias, Extreme Responding is the result of too many of the same type of question. If you are going to use scales or sliders that welcome extreme responses, use other types of questions to gauge user feeling as well. This will give you a more nuanced understanding.

Social Desirability Bias

There are different forms of this bias. For example, participants sometimes believe there is a way they are “supposed to” act when taking part in a survey. Typically, this comes from respondents who believe in the importance of research and want to be good subjects, so they try to discern what the researchers hope will happen in the study and give them the answers they want. Or the respondent will over-report behaviors they believe the researcher wants to hear about for other reasons. The most famous example of this is political polling in the US in 2016, where subjects hid their intention to vote for Donald Trump from pollsters.

How to avoid it

You can start by keeping the brand or topic hidden from participants, and just asking them to respond so they don’t know who the answers are for. Secondly, omit leading or loaded questions that may convey an intent or interest.

Question Types to Avoid

Loaded Questions

Loaded questions make assumptions. For example, “What is your favorite snack when you go to the ballpark?” is a loaded question, as it assumes the respondent likes sports, has been to a ballpark and has eaten snacks at a ballpark. 

How to avoid them

Assume nothing. If you want to survey people who enjoy snacks at the ballpark, try a screening question or two. This will ensure you are only asking people who fit the profile. 


Set targeting first and screen respondents from a qualified audience.

Leading Questions

Leading questions use biased language to push respondents toward a response. 

How to avoid them

Keep questions simple and remove any language that may imply intent. Complimenting or insulting a product or service should be off-limits. If you can, have a friend who is unfamiliar with the subject review your questionnaire.

Double-Barreled Questions

Double-barreled questions happen when researchers blend two questions into one and then allow for only one answer. This creates a fallacy for the respondent and often results in incomplete or biased answers. For example, “Did you find the product interesting and helpful?”

How to avoid them

Make sure each question asks for a single answer. If you have double-barreled questions, they should be split into two questions, i.e. “Did you find the product interesting?” and “Did you find the product helpful?”


 More On This…

How to write good survey questions

 


Prestige Questions

These questions insert knowledge from widely accepted experts into questions, which biases respondents to agree with the expert. For example, “Doctors say smoking causes cancer. Do you agree?” or “Do you support the President’s policy on Zimbabwe?”

How to avoid them

Avoid asking questions beyond the cognitive grasp of the average person. If your question assumes a level of knowledge, use screening questions to remove people who don’t consider themselves knowledgeable. If you provide information on a topic you are asking about, be sure to remove bias from your explanation and only ask participants to respond to the information included. When possible, present the information with multimedia (video, audio, images), as this can carry less prestige bias than attributing it to an expert.

Negative Questions

These questions force respondents to remember a traumatic event or think about something negative. Asking about health issues, car accidents, money woes, or even divorce all qualify.

How to avoid them

If these types of questions exist in your survey, you can remove them (as they will bias respondents towards responding more negatively), or you can move them to the end.

Stereotype Questions

Researchers sometimes unintentionally stereotype their respondents. For example, reminding respondents that racial stereotypes exist around driving or that gender stereotypes exist around math ability may bias respondents towards acting more stereotypical, or trying to defy those stereotypes instead of delivering honest responses. 

How to avoid them

Remove this kind of language from your survey, and don’t ask respondents to describe themselves by race, ethnicity, age, and the like, unless you have to. 

Absolute / Pushy Questions

Questions with Yes or No answers can bias respondents because they don’t allow for any range in between. This may cause respondents to not answer, or to answer Yes when they mean Only Sometimes. For example: “Do you eat breakfast in the morning, yes or no?” Someone who doesn’t eat breakfast 100% of the time may answer “No,” whereas another person may answer “Yes” because they do sometimes eat breakfast. The result is imprecise and biased responses. Taking this even further, some researchers will make a declarative statement and ask the respondent to agree or disagree (“Spain is too hot in the summer, agree or disagree?”). This introduces bias because survey takers who don’t have a strong opinion will be more inclined to agree with the statement.

How to avoid them

Remove Yes or No questions from your survey. Then, read your questionnaire from a user perspective and remove all statements that may lead the user toward one response or another. Remove all declarative statements, and simply ask questions.

Ambiguous / Confusing Questions

Unclear wording, referring to earlier questions, and using indirect language are just some of the ways researchers confuse respondents. For example, “Was your breakfast not incorrect when it arrived at your table?”

How to avoid them

Read your survey questions aloud. This will allow you to hear how they may sound to a reader. You should be able to spot examples of confusing language. 

What is Survey Fraud (and How to Avoid It)?

Traditionally, market researchers would assemble panels: groups of people who have agreed to take surveys. In most cases, panelists were convened in person and offered some sort of semi-lucrative incentive. Think airline miles, gift cards, and even cold, hard cash. Convening panelists in person also prevented problems like one panelist taking the same survey multiple times.

However, researchers eventually sought more randomized sampling. Enter random digit dialing, where an auto-dialer machine would call random people to collect responses for surveys. This reduced bias as respondents were selected at random and screened after the fact.

Then came the internet. 


 More On This…

Random Device Engagement and Organic Sampling

 


Suddenly, research panels re-emerged. Now online, backed by the global reach of the internet and offering respondents all over the world the ability to take surveys for pay, panels were back with a vengeance.

This incentive structure, combined with the scale of these efforts, created an opportunity for fraud. Survey takers, looking to accumulate as many rewards as possible in as short a period as possible, would take as many surveys as possible, barely reading the questions before providing answers. 

While most reputable panels seek to remove these fraudulent responses, panel participation has been on the decline, and panels often fall short of the number of respondents necessary to create a representative sample for their customers. 

This forces panels to bridge the gap by buying samples from wherever they can find respondents. When they lose control of the in-house vetting process, they have less information about the respondents, how they were recruited, or whether they are a suitable fit for the survey’s target audience.

But who are these problematic survey takers and how can you spot them? 

At Pollfish, we created AI and Machine Learning algorithms to find and remove these fraudulent responses (more on that later).

If you see these types of responses popping up in your surveys, make sure you get rid of them as they don’t contribute to a representative sample. 

Professionals

Sometimes companies pay people to take surveys. The amount per survey is often low, so professionals take a ton of surveys to make money as part of their overall income. These respondents will often have accounts on multiple sites, multiple aliases, etc. Their fast, haphazard movement through your survey can be disastrous to data quality.    

How to avoid them

Don’t use traditional panels. Survey platforms like Pollfish deliver your survey inside mobile apps, to active consumers. Respondents receive in-app benefits, not monetary incentives. 

Rule-Breakers

Rule-Breakers skip key instructions, offer incomplete responses, or contradict themselves. 

How to avoid them

It can be hard to know if this is due to speeding through towards a reward or just a mistake, but either way, if there is too much rule-breaking, it is best to filter them out. 

Speeders

Much like professionals, speeders can’t possibly have completed your survey accurately in the time they took. But if you aren’t tracking for this, how would you know? 

How to avoid them

Set a time threshold and timestamp respondents. If you find that they moved through your survey too quickly to successfully complete it, filter them out. 
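
As a rough illustration, the sketch below flags respondents who finished faster than a hard time floor. The records and the 60-second threshold are hypothetical; in practice you would calibrate the cutoff to your questionnaire’s length (for example, some fraction of the median completion time).

```python
from datetime import datetime, timedelta

# Hypothetical completion records: (respondent_id, started_at, finished_at).
records = [
    ("r1", datetime(2023, 5, 1, 10, 0, 0), datetime(2023, 5, 1, 10, 0, 35)),
    ("r2", datetime(2023, 5, 1, 10, 0, 0), datetime(2023, 5, 1, 10, 4, 10)),
]

# Illustrative hard floor: nobody can answer this survey thoughtfully in
# under 60 seconds.
MIN_SECONDS = 60

def is_speeder(started_at, finished_at, min_seconds=MIN_SECONDS):
    return (finished_at - started_at) < timedelta(seconds=min_seconds)

speeders = [rid for rid, start, end in records if is_speeder(start, end)]
print("Flagged as speeders:", speeders)  # ['r1']
```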

Straight-Liners

Straight-liners are some of the easiest fraudulent respondents to catch. These respondents give the same or similar answers to every question. Whatever the first choice is, that’s what they select. 

How to avoid them

If you spot this in your responses, or your platform can flag it, remove these respondents as there is a high likelihood their responses are fraudulent or unhelpful to your study.

Near-Straight-Liners

Sometimes survey takers know they may be filtered out and not get a reward if they answer like a Straight-Liner. So near-straight-liners will answer all but 1 or 2 questions in a straight line.

How to avoid them

Same as above. Remove. 
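
For a sense of how a simple flag might work (a rule-of-thumb sketch, not the Pollfish AI approach), you can compute the share of each respondent’s answers that match their most common answer and flag anyone above a cutoff; lowering the cutoff also catches near-straight-liners. The data and threshold below are hypothetical.

```python
# Hypothetical response matrix: each respondent's answers to a block of
# 1-5 scale questions.
responses = {
    "r1": [3, 3, 3, 3, 3, 3],   # straight-liner
    "r2": [4, 2, 5, 3, 4, 1],
    "r3": [1, 1, 1, 1, 2, 1],   # near-straight-liner
}

def straight_line_share(answers):
    """Share of answers equal to the respondent's most common answer."""
    most_common = max(set(answers), key=answers.count)
    return answers.count(most_common) / len(answers)

THRESHOLD = 0.9  # illustrative cutoff; lower it to catch near-straight-liners
flagged = [rid for rid, a in responses.items() if straight_line_share(a) >= THRESHOLD]
print("Flagged:", flagged)  # ['r1'] at 0.9; ['r1', 'r3'] at 0.8
```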


 

Straight-Liners, Near-Straight-Liners, and Alternators will select their answers in familiar patterns to complete surveys quickly. The Pollfish Survey Methodology uses AI and Machine Learning to screen out these low-quality responses.

 


Alternators

Remember the kid in class that didn’t study for the test, so they would just fill in the bubbles on the scan sheet into diagonal lines (think A, B, C, D, C, B, A) all the way down? That kid was an alternator.

How to avoid them

These can be a little harder to spot. It would help if your survey platform could do this work for you. 
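
One crude heuristic, sketched below with hypothetical data, is to flag respondents whose answers always move exactly one option up or down from the previous answer, the classic zig-zag. It only catches strict patterns, which is why platform-level detection is preferable.

```python
# Hypothetical answer sequences, recorded as option indexes (0 = A, 1 = B, ...).
responses = {
    "r1": [0, 1, 2, 3, 2, 1, 0, 1, 2, 3],  # "A, B, C, D, C, B, A" zig-zag
    "r2": [2, 0, 3, 1, 1, 2, 0, 3, 2, 1],
}

def looks_like_alternator(answers):
    """Crude heuristic: every step moves exactly one option up or down."""
    steps = [b - a for a, b in zip(answers, answers[1:])]
    return all(abs(s) == 1 for s in steps)

flagged = [rid for rid, a in responses.items() if looks_like_alternator(a)]
print("Flagged as alternators:", flagged)  # ['r1']
```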

Bots

For the more computer-savvy fraudster, creating bots to take surveys is the easiest way to game the system. While most popular panels have installed protections against bots and have abandoned particularly vulnerable methodologies like River Sampling (using banner ads as a recruitment process without verifiable participant info), these panels may still buy respondents from companies that don’t do enough to protect against bots.

How to avoid them

Find out where your survey respondents are coming from. Make sure none of the providers in your sample are using outdated methodologies like River Sampling. If your survey platform has bot detectors and can throw out common bots, all the better. 

Fake Accounts

Just like bots, survey takers will sometimes create fake accounts, allowing them to take the same survey multiple times. Most survey panels check IP addresses to protect against this; however, these protections are easily circumvented.

How to avoid them

Check whether your survey platform allows accounts to take a survey more than once, or to repeat or restart a survey once they have started. Make sure to have a mechanism in place to filter by IP address or look for other patterns that may indicate fake accounts. Surveys that allow link sharing are more likely to suffer from repeated attempts, while those that use a randomization method such as RDD or RDE approach respondents directly and use a unique identifier to prevent repeats.
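
A very basic version of that mechanism is sketched below: flag submissions that repeat the same IP and device pair. The log and field names are hypothetical, and shared or rotated IPs make this a weak signal on its own, which is why a per-respondent unique identifier is the stronger protection.

```python
# Hypothetical submission log: (response_id, ip_address, device_id).
submissions = [
    ("s1", "203.0.113.10", "device-a"),
    ("s2", "203.0.113.10", "device-a"),   # same IP + device: likely a repeat
    ("s3", "198.51.100.7", "device-b"),
]

seen = set()
duplicates = []
for response_id, ip, device in submissions:
    key = (ip, device)
    if key in seen:
        duplicates.append(response_id)  # flag for review rather than auto-delete
    else:
        seen.add(key)

print("Possible repeat submissions:", duplicates)  # ['s2']
```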

Biased Respondents

This last one is impossible to prevent if you are working with a traditional online panel. Forums and other online meeting places allow biased respondents to find ready-made answers to your survey shared by other respondents. This allows them to quickly complete surveys without adding their own, natural feedback.

How to avoid them

The internet is a big place and you, as a researcher, cannot easily protect against this. You can prevent survey takers from cutting and pasting inputs, block screenshots, and take other measures that make sharing responses harder. But if you want to prevent this problem entirely, it may be best to avoid survey panels altogether.

Pollfish Can Help

AI / Machine Learning

Pollfish AI and Machine Learning remove bad survey takers. Read more about that here.

Individual Review

Pollfish individual review (by a person) ensures your survey follows best practices. Each survey you submit is subject to a brief (30-minute) review to ensure you have not egregiously biased your study.  

Survey Stratification

Recently, Pollfish launched Survey Stratification, which allows you to tie the weighting of your survey sample to the US Census. With the push of a button, you can weight your respondents to match the Census demographics. This data is available on a national level in the US, as well as in all 50 states. It is also available across the EU and UK, as well as Iceland, Liechtenstein, Norway, and Switzerland. This feature can reduce sampling bias. Read more about this here.