Survey Instruments - Designing, Creating, and Analyzing Surveys


Introduction

This article provides tips and resources for designing a survey, creating that survey in tools available to Michigan Medicine, and analyzing the response data.

Instructions

Designing a Survey

Step 1: Gather requirements

Start by asking yourself these questions...

What do you want to improve? The primary reason to conduct a survey is to help something improve, and these surveys tend to be recurring. The answer(s) to this question will determine the subject(s) of your survey questions. 
What group decisions do you want to make? Another reason you might conduct a survey is to facilitate decision-making for a group, and these surveys tend to be one-time. The answer(s) to this question will determine the subject(s) of your survey questions.
Is this a one-time survey or recurring? This will influence how you set up the survey. If it's one-time, consider reimagining it as recurring, since a single sample can't tell you whether anything improved. If it's recurring, consider creating a survey with no end date that allows people to retake it over and over. This makes analyzing the response data over time much simpler, since all responses live in one self-contained dataset and there's no need to merge the results of multiple surveys to see trends.
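
If you do opt for one continuous, recurring survey, the trend analysis it enables can be quite simple. Here's an illustrative sketch, not tied to any particular survey tool, assuming each response is stored as a hypothetical (ISO date, 1-5 score) pair:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical continuous survey results: (ISO date, 1-5 satisfaction score).
responses = [
    ("2024-01-05", 3), ("2024-01-20", 4),
    ("2024-02-02", 4), ("2024-02-18", 5),
]

# Group scores by month, keying on the "YYYY-MM" prefix of each date.
by_month = defaultdict(list)
for date, score in responses:
    by_month[date[:7]].append(score)

# Average each month's scores to see the trend over time.
trend = {month: mean(scores) for month, scores in sorted(by_month.items())}
# e.g. {"2024-01": 3.5, "2024-02": 4.5}
```

Because all responses live in one dataset, no merging across surveys is needed before grouping.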

Who is your audience?

Understanding your audience helps you design your questions and helps you know who to send invitations to.

What demographics do you want to break the data down by?

This can be informed by your audience and might include team, department, age, race/ethnicity, gender identity, sexuality, length of employment, disability identity, etc. You'll need to write questions for each demographic you want to understand.

Will you take a random sample of your respondent population and ask them all to participate? Or will you invite everyone in your population to participate?

If you want to perform statistically relevant research, make statistically valid claims about the data (like what percent of your whole population agrees or disagrees with something), or compare your data against industry benchmarks, then you want a random sample with full participation.

If you instead want to simply look for problems, investigate them as they arise, and make gradual improvements, you should invite everyone to participate.
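
If you go the random-sample route, the draw itself is straightforward. Here's a minimal sketch in Python, assuming a hypothetical roster of invitee addresses (the names and sizes are made up for illustration):

```python
import random

# Hypothetical roster of respondent email addresses.
population = [f"staff{i}@example.org" for i in range(1, 501)]

# Seeded generator so the same sample can be reproduced later if needed.
rng = random.Random(42)

# Simple random sample of 60 invitees, drawn without replacement.
invitees = rng.sample(population, k=60)
```

Sampling without replacement guarantees no one is invited twice from a single draw.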

Do you want to track who responds or do you want respondents to have anonymity? 

If a response is anonymous, it's more likely that the respondent will feel free to fully express their thoughts and feelings. If it's identifiable, you'll be able to follow up with the respondent. A middle ground can be to make the survey anonymous by default, but allow the respondent to disclose their identity if they choose to.

Through what channels would you like to advertise the survey and send invitations?

This may depend on your audience and what you think will be most likely to reach and engage them. There are many possible channels to consider, like email, Microsoft Teams, Slack, a newsletter, printed posters/flyers, computer screensavers, and door-to-door outreach. See a comparison of different distribution methods / channels.

Do you need people to help promote and support the survey? If so, who are they?

If you need to circulate printed materials or go door-to-door, thinking through that now will help you prepare for it.

Once you have the data, what do you plan to do with it and on what timeline?

Collected data is useless if nothing will be done with it, so think about the process you'll use to take action once you have data.

Who will analyze the data and who will review the results of that analysis?

Identifying those responsible for the analysis now will help you prepare for it.
Who will decide what actions to take based on the analysis, and who will be accountable for taking action?

Identifying those responsible for follow-up now will help you prepare for it.

How will you report the results and actions taken to the participants?

Demonstrating to participants that their input was valuable and led to action will prove the utility of your survey and help maintain or improve response rates in the future.


Step 2: Draft questions

Based on your answers above, write down the questions you want to consider including in the survey. As you do, pay attention to bias: phrasing or formatting that leads people to favor a certain answer. See examples of biased questions and how to fix them.

Step 3: Draft answer options/format

For each question that you wrote down in the previous step, decide what format you'll use for the answers. For example...

  • Multiple choice:
    • Binary answers (e.g. yes and no) are best when you want the quickest answer possible to a question that isn't complex. Binary scales force a black-and-white choice onto emotions that may not be so clear, though, which can introduce bias and remove nuance. That lack of nuance can lead you to miss discrete changes in trends that could be important.
    • Rating scales (e.g. Likert: very unhappy, unhappy, neutral, happy, very happy) are best when you want to measure someone's attitudes, opinions, or emotions with more nuance than a binary answer can provide. They can also lengthen the time it takes to complete the survey, however, and introduce bias of their own.
      • Use an odd number of scale points: Most researchers prefer an odd number of answers so that a clear midpoint can be established, and so that respondents can choose a neutral stance if they want to (forcing them to do otherwise would introduce bias).
      • Use 7 scale points if survey length/fatigue is not a concern, otherwise use 5 points.
    • Always include "n/a" as a final answer in multiple choice answer sets. Some survey tools, like Qualtrics, don't allow respondents to un-answer multiple choice questions after choosing an answer. That locks in an answer that the respondent might prefer to back out of (maybe they answered accidentally), and that introduces bias. Providing an escape clause with "n/a" solves this problem.
  • Free text. Asking an open-ended question like "Why?" provides feedback that's more actionable than multiple-choice answers, but it can also lengthen the time it takes to complete the survey.
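
When you later analyze Likert-style answers, a common approach is to map the labels to numbers and exclude "n/a" from any averages. Here's a minimal sketch, assuming the 5-point scale above (the response data is illustrative):

```python
# Map 5-point Likert labels to numeric scores; "n/a" is deliberately absent
# so it drops out of any averaging.
LIKERT = {"very unhappy": 1, "unhappy": 2, "neutral": 3, "happy": 4, "very happy": 5}

# Hypothetical answers to one rating-scale question.
responses = ["happy", "very happy", "neutral", "n/a", "happy"]

# Keep only answers on the scale, then average them.
scores = [LIKERT[r] for r in responses if r in LIKERT]
average = sum(scores) / len(scores)  # (4 + 5 + 3 + 4) / 4 = 4.0
```

Excluding "n/a" (rather than scoring it as 0 or 3) keeps the escape option from skewing the average.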
Step 4: Consider length

Once you have a first draft of your survey, take it yourself and time how long it takes to complete. A survey that's too long might not be completed, or might be completed in haste, both of which introduce bias into your results. See a study into survey length, which concludes that the ideal duration is less than 8 minutes.
Step 5: Complete the presentation

Consider adding...

  • A preamble that explains the survey's focus, why you're conducting it, and what will be done with the data. Helping respondents understand your goals can increase their interest and participation.
  • A closing statement that thanks the respondent for their participation and tells them when and how to expect to see results. 
Step 6: Get feedback

Once you've honed your survey for length, ask others to test it out and provide feedback.

Creating a Survey

The list below provides how-to documentation and support contact information for survey tools commonly used across Michigan Medicine.

  • Qualtrics: see Creating a project and Qualtrics - Training and Support; supported by HITS Service Operations Support
  • Microsoft Forms: see Create a form with Microsoft Forms; supported by HITS Productivity & Collaboration
  • Google Forms: see the ITS documentation repository; supported by the ITS Service Center
  • Google Surveys: see Create a survey; supported by the ITS Service Center


Analyzing Survey Responses

For discussions of how to analyze response data overall, see these articles from HubSpot, MonkeyLearn, Qualtrics, and Qualaroo.

For a discussion of analyzing free-form / unstructured data, see these articles from Displayr, Hotjar, and Survey Practice.
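
As a starting point for structured (multiple-choice) answers, the most basic analysis is a response distribution per question. Here's a minimal sketch using only the Python standard library, with hypothetical answers to a single question:

```python
from collections import Counter

# Hypothetical answers to one multiple-choice question.
answers = ["yes", "no", "yes", "n/a", "yes", "no"]

# Tally each answer option, then convert counts to percentages.
counts = Counter(answers)
total = sum(counts.values())
distribution = {answer: round(100 * count / total, 1) for answer, count in counts.items()}
# e.g. {"yes": 50.0, "no": 33.3, "n/a": 16.7}
```

Running the same tally per demographic group (team, department, etc.) is how the breakdowns planned in Step 1 become comparable charts.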