What do people want? What do community members like or dislike? As planners, these are the types of questions we generally ask as part of any planning process. One obvious way to find out is by conducting a survey. I previously wrote an article about the American Community Survey (ACS), which is an example of a comprehensive survey used to collect data for planning and funding allocation purposes. While creating a survey may sound simple, it really is not. It is much more than just coming up with some questions and then having some people answer them. Over the years, I have learned that it takes careful planning and consideration to put together a good questionnaire and implement an effective process of collecting information from our constituents. While I cannot say that I know how to develop the perfect survey, I am at least more confident now that I can avoid my past mistakes. As a guest lecturer, I had several opportunities to share my experience in developing and conducting surveys with graduate students at the University of Southern California (see “Teaching Planning and Policy”). With this article, I would like to put down in writing the key lessons I have learned about doing surveys.
Is a Survey Necessary?
Survey research is a good way to gather information from respondents in order to understand and/or predict some aspect of the behavior of a population of interest. Planners often use it to determine what community members want, like, dislike, or prefer. However, there are times when a survey may not be necessary or feasible. For example, a survey may not be needed if the answers to your questions can be obtained from information that has already been collected, so it is important to find out whether similar studies or surveys have been done in the past. Another reason for not conducting a survey is that obtaining the information would require too much time or money; we have to make that decision with our time, budget, and other constraints in mind.
How to Write Good Questions?
It is easy to come up with questions. However, it takes effort and pretesting to write good questions that allow us to get the data we are seeking. Good questions are clear and answerable for respondents. When crafting questions, we must have the target respondents in mind. We have to anticipate their receptivity to different types of question formats and their willingness to answer them. We may, for example, want to ask open-ended questions to obtain more qualitative information, but such questions typically work better in face-to-face interviews. Also, speaking from my own experience, it is time-consuming and challenging to tabulate responses to open-ended questions because they can vary so much. Some open-ended questions may even be converted into close-ended questions: we can provide a thorough and mutually exclusive list of options for respondents to consider or rate rather than asking them to volunteer responses. For example, “what are the top three issues concerning parks and recreation?” may be asked as an open-ended question with a blank space for respondents to write their answer. However, it can easily be changed to a close-ended question if we provide a list of potential issues (such as lack of parks, safety, maintenance etc.) that respondents can select from.
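One practical payoff of the close-ended format is that tabulation becomes a simple count over a fixed option list, whereas open-ended answers must first be read and coded by hand. A minimal sketch in Python, using made-up responses to the hypothetical parks question above:

```python
from collections import Counter

# Hypothetical responses to the close-ended version of the question:
# each respondent selects up to three issues from a fixed option list.
responses = [
    ["lack of parks", "safety"],
    ["safety", "maintenance"],
    ["lack of parks", "safety", "maintenance"],
]

# Tallying close-ended answers is a straightforward count; open-ended
# text would require manual coding before any counting could happen.
tally = Counter(issue for selections in responses for issue in selections)
for issue, count in tally.most_common():
    print(f"{issue}: {count}")
```

The fixed option list also guarantees the categories are consistent across respondents, which is exactly what makes the responses comparable.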
Another issue to consider in question preparation is how many questions can feasibly be asked. Asking too many hurts the response rate: respondents may tire of answering and simply stop, a phenomenon known as “survey fatigue”. Thus, while we may want to ask more questions, we need to keep the cost of a lengthier survey in mind. My suggestion is to limit a survey to two pages in length if possible. This, of course, does not mean that we cram as many questions as we can on two pages; the text has to be of a reasonable size! We must also be thoughtful and careful about the response options we offer. For example, if a questionnaire only asks standard questions about traditional park amenities and recreational programs, it could limit the imagination of respondents and would not collect input on less conventional or new creative ideas for meeting recreational needs. This is an important lesson I learned as part of the Florence-Firestone Community Parks and Recreation Plan preparation process. (For more information, please see Chapter 4 of my dissertation.)
Pretesting a survey instrument with a sample of respondents is very important. Essentially, it helps us to figure out whether our questions are clear and answerable. While we as the preparer of the survey may think that it is fine, we do not really know until other people start filling it out. I regret that I did not pretest the survey questions I had previously administered; had it been done, we would have reworded or even eliminated some confusing questions. Based on my experience and review of survey research literature, here is a list of common mistakes in writing questions that we should avoid:
- Some of the terms used are not familiar to some respondents. (Avoid planning jargon!)
- The response options are not exhaustive and mutually exclusive.
- The question asks about more than one thing.
- The question contains double negatives.
- The question asks respondents to rank too many items or perform some other difficult task.
- The question includes an unnecessary neutral (i.e. “don’t know” or “no opinion”) option.
- The wording of a question seems to steer respondents towards a particular answer choice.
How to Conduct the Survey?
The Florence-Firestone parks and recreation survey I referenced earlier was not a scientific survey with a representative sample, as the questionnaires were not completed by randomly selected households but rather mostly by park and library patrons. This is because the surveys were only made available at County parks and libraries in the community. To address this limitation, we changed the way we distributed surveys as part of our current Community Parks and Recreation Plans effort. Instead of just leaving surveys for people to pick up and complete at parks and libraries, we have more proactively distributed them at different locations and events to ensure different segments of the community are represented. Also, instead of just handing out the surveys, staff were available to help respondents complete them and clarify any questions they might have. This is our attempt to do “stratified sampling”, in which the chosen sample is forced to contain units from each of the segments or strata (youth, seniors, men, women, etc.) of the population.
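The idea behind stratified sampling can be sketched in a few lines of Python. This is an illustrative toy, not our field procedure: the population, group labels, and per-stratum quota are all made up, and real sampling frames are rarely this tidy.

```python
import random

def stratified_sample(population, strata_key, per_stratum, seed=0):
    """Draw a fixed number of units at random from each stratum."""
    rng = random.Random(seed)  # fixed seed for a reproducible example
    # Partition the population into strata using the supplied key function.
    strata = {}
    for unit in population:
        strata.setdefault(strata_key(unit), []).append(unit)
    # Sample within each stratum so every segment is represented.
    sample = []
    for units in strata.values():
        sample.extend(rng.sample(units, min(per_stratum, len(units))))
    return sample

# Hypothetical population of 100 residents, 25 in each segment.
population = [{"id": i, "group": g}
              for i, g in enumerate(["youth", "seniors", "men", "women"] * 25)]
sample = stratified_sample(population, lambda u: u["group"], per_stratum=5)
```

Forcing a quota from each stratum is what guarantees that small segments (e.g. seniors) are not drowned out, which a simple random draw cannot promise for a small sample.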
There are, of course, other ways of collecting survey data: mail, e-mail, web, and telephone. The Census Bureau uses all of these methods to make it easier for people to respond to the ACS (see this page). However, local governments typically do not have the resources to do this and must often select just one method to distribute surveys. So which one is the best way to go? This is a difficult question to answer because each method has its advantages and disadvantages.
- Mail: The ability to locate respondents is high because a complete list of addresses is usually obtainable. This method also yields less response bias when the questions are sensitive. However, some disadvantages are lower response rates, response bias toward more educated respondents, and higher non-response rates for individual questions.
- E-mail: E-mail surveys only work if you can obtain e-mail addresses of targeted respondents. We can potentially collect data faster and more efficiently via e-mail, but the following factors have all contributed to a declining interest in e-mail surveys: confidentiality concerns; filters and firewalls that prevent unsolicited e-mail; and increases in viruses sent via e-mail.
- Web: Web surveys such as those created using SurveyMonkey are quite common these days. There is some concern with giving out information over the internet, but the data is actually more secure than that provided via e-mail. People increasingly understand this, as more are willing to complete forms and make purchases online. The problem with web surveys is that they are limited to populations that use the internet or have web access.
- Telephone: Phone surveys were once credited with high response rates. However, this is less true today, as fewer people are willing to answer questions on the phone. (Personally, I would not be willing to stay on the line to answer questions from a stranger.) Also, it is expensive to conduct a phone survey and more time-consuming to write and test questions.
To decide which method to use, we must at a minimum take into consideration the following: budget and time constraints; and the characteristics of the population (such as educational attainment and access to the internet). For our current Community Parks and Recreation Plans project, we decided to set up tables and distribute the surveys in person at various locations and events because we thought it would be more effective for us to go to where the people were, rather than using the more passive and/or more technology-reliant methods described above.
How to Present Survey Findings?
We need to put as much effort and thought into how we present the survey findings as into how we design the survey itself and write the questions. Unfortunately, this does not always happen, as I have seen many reports with endless tables and charts that overwhelm and/or bore, rather than intrigue and inform, the reader. As we decide which data to report and how to report them, we must anticipate what the audience is most interested in seeing, how detailed or sophisticated an analysis the audience expects and needs, and how long a report the audience prefers. In terms of displays, I typically use a combination of tables and charts that highlight the key findings. If you are conducting a web survey, SurveyMonkey can actually generate a variety of interactive graphs for you as explained here. Personally, I am a big fan of infographics that creatively and effectively display information collected from surveys. An infographic I came across recently and really like is one prepared by Metro to present the results of an annual customer satisfaction survey. Another good one is an infographic done to highlight the results of a parks plan survey conducted at the 2012 National Recreation and Park Association (NRPA) annual conference. Both are visually attractive and easy to read.
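Even without a charting library or SurveyMonkey's built-in graphs, a quick display of key findings can be produced directly from the tallies. A throwaway Python sketch, with entirely made-up counts, that sorts the options so the most-cited issue leads the chart:

```python
# Hypothetical tallies from a close-ended survey question.
tallies = {"lack of parks": 42, "safety": 65, "maintenance": 28}

# A plain-text bar chart: one row per option, bars scaled to a fixed
# width, rows sorted so the most-cited issue appears first.
width = 30
peak = max(tallies.values())
lines = []
for issue, count in sorted(tallies.items(), key=lambda kv: -kv[1]):
    bar = "#" * round(width * count / peak)
    lines.append(f"{issue:<15} {bar} {count}")
print("\n".join(lines))
```

Sorting by frequency is the small design choice that matters here: the reader sees the top concern immediately instead of scanning a table for the largest number.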
As planners, we need to know what our communities want. Conducting a survey is a great way to find out. As I shared above, it is not easy to implement an effective survey effort. We must be very thoughtful and careful in how we write the questions, format the survey instrument, distribute the surveys, and present the findings and results. While it does not cover every step of a survey process, I hope this article has provided some helpful guidance and tips for planners who are thinking about or are in the process of developing a survey.
Images of Florence-Firestone parks and recreation survey by author
Image of Metro infographic from http://media.metro.net/projects_studies/research/images/infographics/metro_infographic_02.pdf (Fair Use Doctrine)