This post will highlight six types of survey biases to be cautious of when creating a market research questionnaire or when conducting any form of survey research.
Table of Contents:
- What is survey bias?
- Types of bias in surveys
- Sampling bias
- Non-response bias
- Acquiescence bias
- Social desirability bias
- Question order bias
- Interviewer bias
- Summary
What is survey bias?
Survey bias is any aspect of a survey that skews its results. It essentially means that some element of your survey could have swayed respondents into answering a certain way or providing certain feedback - which you never want to do. You always want to aim for the most authentic, genuine, and unbiased feedback from your research study, as this will be the most useful for informing your brand's decision-making.
Biases often occur subconsciously, meaning a researcher can introduce one without knowing or intending to, which makes bias an important element of your study to consider before sending a survey out for fieldwork. In this article, we'll look at the most common types of survey bias and how to avoid them.
Back to Table of Contents
Types of bias in surveys
Survey biases can be caused by the way respondents are sampled, how a questionnaire is designed, or how an interviewer poses a question.
One way to avoid researcher bias when designing survey questions or analyzing results is through the use of AI.
Below let's explore some specific types of survey biases you might encounter so you can recognize them when they appear:
Sampling bias
Sampling bias is a type of bias in which a researcher gathers a sample of respondents that does not accurately represent the intended population (e.g. only surveying older generations about TikTok perceptions). As such, the survey results cannot accurately be used to make claims about the general consumer base. This form of bias could be a result of limited consumer accessibility to a survey, failing to account for a specialized/niche topic, or the location of those sampled.
How to avoid sampling bias
- Set quotas for your target sample
This ensures you are not sourcing from one specific demographic or group of people (e.g. all females, all millennials). Enforcing survey quotas means your survey respondents come from many different backgrounds, with varying perspectives.
- Keep online surveys short and accessible to all
For example, ensure that surveys are mobile-friendly for respondents who may not have access to a computer, and short enough that even those with busy schedules or other jobs have the option to participate in your questionnaire.
- Ensure your sample is randomized
This means all qualifying participants have an equal chance of being selected. To do this, use a random number generator or another form of random selection. Many panel providers use randomized sample selection when sourcing respondents to account for selection bias (a minimal sketch after this list shows how quotas and random selection can work together).
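To make the quota and randomization advice concrete, here is a minimal Python sketch of drawing a quota-controlled, randomized sample from a respondent pool. The respondent fields, quota targets, and the draw_sample helper are illustrative assumptions, not any particular panel provider's workflow:

```python
import random

# Hypothetical respondent pool - in practice this would come from your
# panel provider or customer list.
respondent_pool = [
    {"id": 1, "age_group": "18-34"},
    {"id": 2, "age_group": "35-54"},
    {"id": 3, "age_group": "55+"},
    # ... more qualifying respondents
]

# Illustrative quotas: how many completes to target per age group.
quotas = {"18-34": 100, "35-54": 100, "55+": 100}

def draw_sample(pool, quotas, seed=42):
    """Randomly select respondents until each quota group is filled."""
    rng = random.Random(seed)
    shuffled = list(pool)
    rng.shuffle(shuffled)  # every qualifying respondent has an equal chance of selection
    filled = {group: 0 for group in quotas}
    sample = []
    for person in shuffled:
        group = person["age_group"]
        if group in quotas and filled[group] < quotas[group]:
            sample.append(person)
            filled[group] += 1
    return sample

selected = draw_sample(respondent_pool, quotas)
print(f"Selected {len(selected)} respondents across {len(quotas)} quota groups")
```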
Back to Table of Contents
Non-response bias
Non-response bias is a type of bias that happens when those who do not respond to a survey hold opinions that vary drastically from the opinions of those who do respond. This results in alternative opinions being missed, so the findings give the wrong impression of how the total population views an issue.
A common example of non-response bias is seen during elections. Those who can’t make it to the polls due to polling location availability, work schedules, or childcare, may have opinions that are meaningfully different from those who show up to the polls - leading to a biased skew in the results. An example of non-response bias in the online survey world might be seen with lengthy surveys and incentives. Let’s say for example younger generations are open to taking longer surveys for the incentives and perks; when they realize a survey doesn’t offer one, they might drop out. Meanwhile, older generations may take surveys simply because they enjoy them, and don’t necessarily rely on an incentive. In this case, you are missing out on valuable insights from younger generations whose perceptions could vary drastically from older respondents.
Non-response bias can also be seen with sensitive information. If respondents feel they are being asked questions that are too personal, without the option to skip, they may just drop out of the survey altogether.
How to avoid non-response bias
- Use targeting
If you’re looking to capture a specific audience of survey takers, make sure those potential respondents are the ones receiving your survey. For example, college students are probably not going to click on a survey about work-life balance, just as homebodies are not as likely to click on a survey about traveling.
Panel companies are useful for this purpose, as they often have pre-identified targets for commonly sought survey audiences and can also make sure you have the appropriate screening criteria.
- Keep it short
Lengthy surveys tend to lead respondents to quit before they're finished. If your survey needs to be longer, consider adding section breaks throughout, letting the respondent know how many sections or questions they have left.
- Provide options
When it comes to sensitive questions like political affiliation or income, always give respondents the option to skip that question rather than leading them to quit the entire survey because they don't feel comfortable providing this information.
- Ensure survey delivery
Before sending your survey out to participants, send a test link to yourself or a colleague. Test it on multiple devices and browsers to emulate what a participant is going to be seeing on their end and to determine if anything needs to be adjusted to enrich the experience.
Once you confirm the links are working properly, keep track of the surveys you are sending out and the response rate you’re getting back. Online companies can often do this with download data or click-rate data while mail-in survey companies can monitor returned surveys that don’t reach their intended destination. Online surveys will also benefit from being mobile-friendly, to ensure those who are always on the go can easily respond.
In addition to monitoring the delivery rate of your survey, you can also send follow-up reminders. If you send your survey out and someone happens to be traveling, they may forget about it even though they are interested in responding to the questionnaire. A reminder will move the survey back up in their inbox and give them another opportunity to participate - thus improving your response rate.
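As a simple illustration of tracking response rates and flagging non-responders for a reminder, here is a minimal Python sketch, assuming you can export who was invited and who has completed the survey; the email addresses and the three-day reminder threshold are illustrative assumptions:

```python
from datetime import datetime, timedelta

# Hypothetical export: invitation send times and the set of completed respondents.
invitations = {
    "alice@example.com": datetime(2024, 5, 1),
    "bob@example.com": datetime(2024, 5, 1),
    "carol@example.com": datetime(2024, 5, 2),
}
completed = {"alice@example.com"}

# Monitor the response rate as results come in.
response_rate = len(completed) / len(invitations)
print(f"Response rate: {response_rate:.0%}")

# Flag non-responders invited more than three days ago for a follow-up reminder.
now = datetime.now()
needs_reminder = [
    email for email, sent_at in invitations.items()
    if email not in completed and now - sent_at >= timedelta(days=3)
]
print("Send reminders to:", needs_reminder)
```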
Back to Table of Contents
Acquiescence bias
Acquiescence bias is the tendency of respondents to lean toward positive responses more often than negative ones. It's also known as agreement bias.
For example, acquiescence bias might appear when a respondent feels indifferent toward a topic but they select ‘strongly agree’ because they may feel that’s the ‘right’ answer - even though it doesn’t actually reflect their sentiment.
The reasons for acquiescence bias are varied: some respondents carry their ‘always aim to please’ mantra with them into surveys, while others may try to game the system, thinking that responding favorably will keep them qualified for the survey and avoid getting screened out. A question’s phrasing can also affect whether a respondent exhibits acquiescence bias.
How to avoid acquiescence bias
- Avoid leading questions
When you start a question with “How much do you agree...”, you could be priming a respondent to feel they need to take a strong position, when in fact they may feel neutral about the topic. This can result in 'extreme responding' (selecting the extreme ends of a Likert scale).
As an alternative, consider phrasing questions such as: “How do you feel about the following statement: ...”
- Emphasize anonymity
When respondents know their data is not going to be tied to their names or any part of their identities, they may be more inclined to answer honestly rather than favorably. Using a disclaimer is especially useful in work settings, where you want honest and accurate feedback from your employees.
- Include red herrings
While this won’t avoid acquiescence bias, it is a way to identify when this sort of bias has taken place within your survey data. For example, say you have a matrix question with a Likert scale across the top and statements down the side; if a survey taker selects that they ‘agree’ with both of the following statements, it’s likely they are exhibiting acquiescence bias and you can flag them in your data set (a minimal sketch of this check follows the example):
“I love to drink coffee”
“I hate to drink coffee”
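As an illustration, here is a minimal Python sketch of flagging that pattern in exported survey data; the column names and the exact Likert labels are assumptions about how your data happens to be structured:

```python
# Hypothetical survey export: each row holds one respondent's answers to the
# two contradictory red-herring statements from the matrix question above.
responses = [
    {"id": "r1", "love_coffee": "agree", "hate_coffee": "disagree"},
    {"id": "r2", "love_coffee": "agree", "hate_coffee": "agree"},  # contradictory answers
    {"id": "r3", "love_coffee": "strongly disagree", "hate_coffee": "strongly agree"},
]

AGREE = {"agree", "strongly agree"}

# Flag respondents who agree with both contradictory statements -
# a likely sign of acquiescence (or straight-lining) behavior.
flagged = [
    row["id"]
    for row in responses
    if row["love_coffee"] in AGREE and row["hate_coffee"] in AGREE
]

print("Respondents to review or exclude:", flagged)  # -> ['r2']
```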
Back to Table of Contents
Social desirability bias
Social desirability bias is a form of survey bias in which respondents answer questions in ways they think will be viewed favorably by others. It is similar to acquiescence bias in that respondents give answers that don’t necessarily reflect their true sentiments, but for a different reason. While acquiescence bias is typically limited to agreement, social desirability bias is broader - it is not limited to agree/disagree or yes/no questions.
For example, survey respondents may underreport their alcohol intake or smoking frequency because society views high volumes of these activities negatively. Or, survey takers may give inaccurate answers about how frequently they work out at the gym. When social desirability bias occurs, it can lead to a problematic skew in your survey data.
How to avoid social desirability bias
- Consider survey response options
Rather than having a respondent pick from a defined scale, have them type in a value or a phrase themselves. For example, have respondents type in the number of days they work out per week in an open numeric field, rather than choosing from a list of options. This open-ended format can lead to more accurate data.
- Keep it anonymous
As with acquiescence bias, respondents are more likely to respond authentically if they know their identities are not tied in any way to their survey responses. Provide respondents with a disclaimer at the beginning of your survey stating that their data will only be measured in aggregate, never on a personal level.
- Ask neutral questions
If a respondent gets the idea that one end of a scaled question would be considered more favorable, they may lean toward that one when answering. For example, instead of asking ‘How much do you like cats?’, asking ‘Which animals do you like best?’ is a more neutral way to understand how people feel about cats.
Also be sure to avoid extreme wording in your questions that could be read as positive or negative, such as ‘How much time do you waste on your phone?’. The word ‘waste’ has a negative connotation, which could easily sway how a respondent reacts. Instead, pose the question as: ‘How much time do you spend on your phone?’.
Back to Table of Contents
Question order bias
Question order bias is when the flow of survey questions influences how a respondent answers. Asking certain questions early in your survey can sway how a respondent answers later questions. As a simple example, asking respondents about Netflix and then, in a later question, asking them to name streaming platforms could bias responses toward mentions of Netflix.
You can think of question order bias almost like a leading question: it primes a respondent so they already have something top of mind, rather than capturing their unprompted thoughts.
How to avoid question order bias
- Start with broad survey questions
When homing in on a topic, always start general and then narrow down. For example, if you’re surveying respondents about breakfast cereals, first ask if they typically eat breakfast, then ask what kinds of breakfast foods they eat, then ask about cereal brands they may consume.
Conversely, if you started by asking respondents which cereals they consume from a list of cereal brands, and then later asked what they typically eat for breakfast, they will likely say cereal - because it’s on their mind from the earlier question; that’s question order bias.
- Use randomization
Randomization is helpful when it comes to choice questions (e.g. multiple choice). Randomizing a list of answer options helps mitigate the risk of respondents choosing an option simply because of its position (e.g. first or last in the list).
Randomization is also useful for entire questions; this is especially helpful during concept testing where you are showing visual stimuli. If you’re testing multiple images or videos and respondents always see a certain concept first, that could influence how they feel about the second one. By using randomization, your respondents won’t all see the same concepts first, giving your findings more validity and removing potential question order bias.
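As a small illustration, here is a minimal Python sketch of per-respondent randomization for both answer options and concept order; the option labels, file names, and seeding scheme are illustrative assumptions rather than any specific survey platform's behavior:

```python
import random

answer_options = ["Netflix", "Hulu", "Disney+", "Prime Video", "Other"]
concepts = ["concept_A.png", "concept_B.png"]

def build_respondent_survey(respondent_id):
    """Return a per-respondent ordering of answer options and test concepts."""
    rng = random.Random(respondent_id)  # seed per respondent so the ordering is reproducible
    options = list(answer_options)
    rng.shuffle(options)        # mitigates position effects (first/last in the list)
    concept_order = list(concepts)
    rng.shuffle(concept_order)  # not every respondent sees the same concept first
    return {"options": options, "concepts": concept_order}

print(build_respondent_survey("resp-001"))
print(build_respondent_survey("resp-002"))
```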
Back to Table of Contents
Interviewer bias
Interviewer bias is a form of survey bias in which a moderator’s own opinions interfere with the feedback from a respondent during qualitative survey interviews (e.g. video surveys, focus groups). This type of bias can be positive or negative, intentional or not, but regardless it’s a form of bias to be aware of.
Interviewer bias can come in many forms - stereotypes, demographic profiling, confirmation bias (in which the interviewer seeks to confirm a pre-conceived notion they have about a respondent), or recency bias (in which the interviewer gives more weight to the most recent respondents they interviewed rather than considering the full pool). These are just a few examples of interviewer bias. Below are some tips on how to avoid it:
How to avoid interviewer bias
- Switch things up
Use a pool of moderators or interviewers (rather than just one) in qualitative research environments to diversify opinions, personalities, and potential implicit biases.
- Use an interview guide
By providing interviewers with a guided set of questions, there’s less risk of bias creeping into a conversation. In addition to a set list of questions, a guide might also include neutral responses/reactions to keep the conversation flowing in a natural way.
- Schedule peer reviews
In addition to switching up interviewers and sticking to a guide/script, it can be beneficial to periodically have peer review sessions in which a colleague sits in on another moderator’s interview process and shares notes afterward indicating where potential biases may have come to light; this is helpful for reflection so that the moderator is more aware of such biases during their next interview.
Back to Table of Contents
Summary
As humans, we all have our own biases; being aware of them, especially in market research, helps ensure valid, truthful responses from your survey sample. quantilope prides itself on high-quality data, which includes a focus on minimizing survey bias. One way to reduce the risk of survey bias is through the use of quantilope’s automated market research survey templates. These templates have been thoughtfully designed by quantilope’s data science team - taking into consideration question order, question phrasing, methodology, and respondent experience.
Clients can leverage these various types of survey templates to feel confident their survey research is set up correctly and in an unbiased manner. To learn more about quantilope’s automated survey templates, or about how to limit types of bias in survey research, get in touch below: