
It's not rocket science.

August 1, 2024

Propelling rockets to the moon and designing surveys: two actions not commonly referenced in the same sentence. One relies heavily on engineering and technology to design, construct, and operate rockets and spacecraft. The other is an artful use of psychology and sociology focused on gathering and analyzing data about human behavior, opinions, and experiences. Both require careful planning, precision, and attention to detail to ensure success.

Sending a rocket to the moon involves incredible complexity. Clear mission objectives, detailed mission planning, and the proper trajectory are critical components to executing a successful launch and return. After all, there’s a reason rocket science is used as a comparative barometer indicating degree of difficulty for any given task.

Similar to planning a mission to the moon, effective survey design starts with clear objectives: defining what information is needed, who the target population is, and which methodology will be used to obtain it.

Hats off to all the rocket scientists out there, but the good news is, when it comes to survey design it isn’t rocket science. It’s behavior science! Grab your moon boots (remember those?) and come with us on an exploration of how to use behavior science to guide survey design that’s out of this world.

Providing an ideal user experience in survey design.

Precision is mission critical when it comes to survey design. As with the level of detail required to engineer a rocket, precision plays a major role in every element of a proper survey.

Bias in survey design can set you up for failure before you've even fired the ignition. Precision in wording questions and response options is crucial: poorly worded questions can lead to misinterpretation and biased responses, compromising the survey's validity. Being mindful of the various ways bias can be introduced into survey design is key.

There are several types of survey bias. Here are three critical areas to focus on to ensure the survey yields accurate, relevant responses free from influence or leading phrases.

  • Question wording: how language is used to write questions
  • Response options: the options available to answer questions asked
  • Survey format: the sequence of questions and length of survey

While these variables can still contain potential sources of bias and error (it's highly unlikely to remove bias from survey design completely), attending to them mitigates the risk of unintentionally influencing how a respondent answers.

Don't lead the witness.

  • Avoid leading questions that suggest a particular answer or contain assumptions 
  • Remove loaded questions that contain emotionally charged words or assumptions
  • Minimize complexity so that questions are not difficult to understand

Here's an example of how a question can lead a respondent: Would you agree that we produce the best kind of survey question?

This question leads the respondent to answer in a particular way with the phrasing "would you agree" and the use of the adjective "best".

A more neutral approach to phrasing the question is: How would you rate the construction of this question on a scale of 1 to 5, with 1 being "worst" and 5 being "best"?
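
To make the contrast concrete, here's a rough sketch of how the leading and neutral versions might look in a simple survey definition. This is a minimal Python illustration with hypothetical field names, not any particular survey tool's format.

```python
# Minimal sketch of a leading vs. neutral question, assuming a hypothetical
# survey-definition format (field names are illustrative, not a real API).

leading_question = {
    "id": "leading_example",
    "type": "yes_no",
    # "Would you agree" and "best" nudge the respondent toward agreement.
    "prompt": "Would you agree that we produce the best kind of survey question?",
}

neutral_question = {
    "id": "question_construction_rating",
    "type": "rating",
    "prompt": "How would you rate the construction of this question?",
    # A balanced 1-to-5 scale with anchors at both ends and no leading language.
    "scale": {"min": 1, "max": 5, "labels": {1: "Worst", 5: "Best"}},
}
```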

Give relevant and distinct response options.

Decision fatigue impacts the quality of survey responses. Ensuring response options are succinct, clear, and realistic can lead to more detailed insights and more completed surveys.

  • Keep rating scales balanced with a limited range.
    • 1 to 5 scales are better than 1 to 10. Why? The latter can leave respondents uncertain about how best to respond across so wide a range. With a 1 to 5 scale, rating something as "low," "middle," or "best" is much clearer than with a 1 to 10 scale.
  • When multiple choice options are available, make sure they are relevant and realistic. When possible, provide an "other" option to allow respondents to cover an option that is not readily available.
    • When asking about ice cream flavors, we would start with the most popular options (Vanilla, Chocolate, Strawberry) with the opportunity to gain nuance in a follow-up question. If a respondent selects "Chocolate," ask a follow-up like "What type of chocolate (Regular, Mint, Rocky Road, Chocolate Chip)?" rather than including all of these options in the first question, which helps avoid decision fatigue (see the sketch after this list).
  • It's best to ensure response options are distinct enough from each other so a respondent doesn't feel confused by similar options.
    • When asking about a favorite ice cream flavor, offering both chocolate and fudge could confuse respondents, whereas vanilla and chocolate are clearly distinct.
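
To show how the flavor example could work in practice, here's a minimal sketch of a branching follow-up written in Python against a hypothetical survey flow; the question IDs, option lists, and next_question helper are illustrative, not a specific platform's API.

```python
# Minimal sketch of a branching follow-up question, assuming a hypothetical
# survey flow. Question IDs and option lists are illustrative only.

FLAVOR_QUESTION = {
    "id": "favorite_flavor",
    "prompt": "Which ice cream flavor do you prefer?",
    "options": ["Vanilla", "Chocolate", "Strawberry", "Other"],
}

FOLLOW_UPS = {
    "Chocolate": {
        "id": "chocolate_type",
        "prompt": "What type of chocolate?",
        "options": ["Regular", "Mint", "Rocky Road", "Chocolate Chip", "Other"],
    },
}

def next_question(previous_answer):
    """Return a follow-up question only when the first answer calls for one.

    Keeping the first question short and pushing detail into follow-ups
    limits how many options a respondent has to weigh at once.
    """
    return FOLLOW_UPS.get(previous_answer)

# A respondent who picked "Chocolate" sees the subtype question;
# everyone else skips straight ahead.
print(next_question("Chocolate"))
print(next_question("Vanilla"))  # None: no follow-up needed
```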

For best results, keep it short and clear.

When designing a survey, even the order of questions can influence how a person responds. Keeping a clear and neutral position means ensuring the survey as a whole isn't leading respondents toward answers that don't reflect their experience or perspective.

  • Ensure the order of questions does not influence responses.
    • To continue the ice cream metaphor: if we begin a survey by asking people to rate their opinion of Chocolate, and then later ask for their favorite flavor, we might accidentally bias them toward choosing Chocolate when they actually prefer Vanilla (one way to guard against this, randomizing the order of comparable questions, is sketched after this list).
  • Ensure the survey is as succinct as possible so as not to promote respondent fatigue, rushed or incomplete answers, or drop-offs.
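
One standard way to guard against order effects is to randomize the order of comparable questions for each respondent. Here's a minimal Python sketch with hypothetical question IDs.

```python
import random

# Minimal sketch: shuffle comparable rating questions per respondent so no
# single flavor is always rated first. Question IDs are hypothetical.
FLAVOR_RATING_QUESTIONS = [
    "rate_chocolate",
    "rate_vanilla",
    "rate_strawberry",
]

def ordered_questions_for(respondent_id):
    """Return a per-respondent ordering that stays stable if they resume later."""
    rng = random.Random(respondent_id)  # seed with the respondent's ID
    shuffled = FLAVOR_RATING_QUESTIONS[:]
    rng.shuffle(shuffled)
    return shuffled

print(ordered_questions_for("respondent-123"))
```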

Fail-safe behavior science

While you can’t control survey respondents’ behavior, you can design a survey with desired participant engagement as a guide. Keeping these bias contributors top-of-mind will help.

Non-response bias

  • Problem: When members of the selected sample are unable or unwilling to complete a survey resulting in a non-representative sample population. This bias blunder was famously cited as a prime contributor to the results of the Literary Digest Poll of 1936 regarding the Roosevelt v. Alf Landon election. 
  • Solution: Some causes of non-response bias are unavoidable; however, designing a concise, easy-to-understand survey will increase the likelihood of completion. Additionally, ensure survey distribution methods are comprehensive and considerate of all population subsets.

Social desirability bias

  • Problem: When a respondent desires to conform to perceived social norms and/or appear favorably to the interviewer or other participants while being observed (Hawthorne effect).
  • Solution: One of the largest contributing factors to social desirability bias is the survey method. In-person interviews or phone surveys can heavily contribute to this type of bias because of the pressure of interacting with another person. Designing an online survey, while keeping question wording neutral and simple, helps avoid significant social desirability bias.

Interviewer bias 

  • Problem: When the interviewer affects the participant's response (for example, a man may answer questions about abortion differently when interviewed by a woman than by a man).
  • Solution: Designing a survey that eliminates the need for in-person interaction may alleviate respondents' concern about how the interviewer perceives their responses. Utilizing online survey tools, such as chatbots for surveys (shameless plug), is a great option.

--

It’s not uncommon for modern polling to ask respondents to give “yes” or “no” answers to complex topics. This makes it hard to capture nuanced beliefs and sentiments, which provide much more robust data about an individual’s preferences and motivations. 

CommonAlly has mastered the art of revealing “the why behind the why” through use of highly engaging survey tools such as VideoAsk, Likert Scale, and chatbot surveys. Connect with our team to learn more.

Julie Sandler wrote this. She also does other important things as our Managing Director, Growth & Marketing.

