A well-designed survey starts with well-defined goals. When the goals of the survey can be expressed in a few clear, concise sentences, the design of the survey becomes considerably easier. One of the best ways to sharpen a survey is to determine how you intend to use the information; this question should be asked of every question on the survey. One of the temptations in survey writing is to ask questions that are simply “interesting to know” but that will have no benefit because they don’t coincide with the goals of the study.
Asking the Right Questions
A related pitfall is asking too many questions. Unless completing the survey is compulsory, the more questions a survey has, the lower your response rate will be. Read each question and ask, “How are we going to use this information?”
It is important to set accurate expectations for the survey respondent. The subject of the survey should be clearly and briefly explained in an introduction. The length of the survey and/or the amount of time it should take should also be stated, for example: “This survey consists of 12 questions and should take approximately five minutes to complete.” For online surveys, a progress bar is helpful. Respondents should also be offered an incentive for completing the survey if appropriate.
The first few questions in the survey should be short, easy, and non-threatening. However, avoid asking questions so boring that the respondent quickly abandons the survey. The most important items should be placed in the first half of the survey, since partially completed surveys are often returned. The second half of the survey should consist of the more probing and open-ended questions. Demographic questions should typically come at the end of the survey.
At the same time, the survey should flow naturally, with items grouped into coherent categories. Questions of a more personal nature should also be placed toward the end of the survey; people are more likely to answer them once they are already in the mode of answering questions.
Asking Questions the Right Way
Years ago, Coke asked the question “Which of these two colas tastes better to you?” People preferred the taste of #2, so the company changed its formula and New Coke was introduced. America hated it. We were outraged. New Coke wasn’t a carefully crafted marketing strategy to remind us how much we loved old Coke; it was a genuine screw-up, fueled by millions in research, and it happened largely because the wrong question was asked. A better question would have been “If we changed the taste of Coke to be a little sweeter, like Pepsi, would you buy it?” or “What would your reaction be if we changed the taste of Coke to be a little sweeter, like Pepsi?”
Another common error in question writing is the double-barreled question: two questions asking for one answer. An example of a double-barreled question would be “Were you satisfied with the quality of our food and service?” If the respondent answers “no,” was the response about the quality of the food, the service, or both?
Questions should accommodate all possible answers. Suppose a question about cell phone ownership asks, “Which of the following cell phone providers are you currently with?” A) Verizon, B) AT&T, C) Sprint, D) T-Mobile. While most respondents may fall into these four categories, there are several other providers (Cricket, Boost Mobile, TracFone, etc.). Every possible option should be included, and if every possible option isn’t known, an “Other” category should be listed.
Another mistake in survey design is to make an unwarranted assumption. An example of this type of error would be, “Are you satisfied with your current auto insurance? (Yes or No)”. What if someone doesn’t have auto insurance? Questions should apply to everyone. This can be done by adding a category (Yes, No, Don’t have auto insurance) or by adding a pre-qualifying question, “Do you currently have auto insurance?” (Yes, No), followed by the “Are you satisfied?” question.
Some questions simply tend to elicit inaccurate information, and there isn’t always a good way to ask them. For example, the question “What percent of your monthly budget do you spend on groceries? _____” is unlikely to produce an accurate response: most people won’t know the answer, and few will take the time to look it up. These types of questions need to be analyzed with this in mind.
Don’t assume that survey respondents know everything you do. A survey about political issues might ask, “Are you in favor of Proposition 7?” when the respondent doesn’t know what Proposition 7 is about. Likewise, don’t use abbreviations others might not be familiar with: “What was your AGI last year?” is an okay question for an accountant, but not for the general public.
Leading questions are another problem. Sometimes they are placed in a survey accidentally, and sometimes intentionally. An example of a leading question would be “Don’t you think that government spending is out of control?” or “Most people think that government spending is out of control, do you agree?”
When dealing with a personal or difficult subject, where people may be hesitant to reveal information, lower the impact by embedding the item in a multiple-choice list. For example, instead of asking “Do you have roaches in your home?” ask, “Have you ever found any of the following in your home: ants, spiders, mice, roaches?”
The wording of a question can affect responses. Show people a picture of a car crash and ask them, "How fast were the cars going when they smashed into each other?" and people will name a higher speed than if asked, "How fast were the cars going when they made contact?"
Finally, when using a Likert scale, the physical placement of the “undecided” or “neutral” category can change response patterns. If it is placed off to the side, survey respondents are more likely to select it than if it is placed in the middle.
One final test before sending out a survey is to have someone other than yourself take it. They can often spot things about the survey, its flow, and the way questions are worded that you would otherwise miss.