NLPES Question of the Month
What’s been your experience with online surveys? (Have you conducted surveys online? If so, what lessons have you learned? If not, why haven’t you done so, and would you consider doing online surveys? Do you have any observations about the components of a well-designed online survey?)
Don Bezruki, Wisconsin
In Wisconsin we have conducted two types of online surveys: email surveys, in which the questions are contained in the email and the recipient checks the responses and hits the reply button; and what I call web-based surveys, in which the recipient of an email invitation clicks a link to our website, enters a password, and completes the survey.
The email surveys had the higher response rates and were very easy to develop. The downside is that you have to enter the responses into a database yourself for analysis. The web-based surveys were much harder to design (largely because of the natural desire to add bells and whistles), and they had lower response rates. The advantage was that no data entry was required. As with any survey, I believe the most important things are to keep it short and to field test it.
Ashley Colvin, Virginia
From our experience, online surveys have proven to be a robust and flexible research tool, although technical glitches can occur during the administration. Over the last four years, we have used some form of electronic survey, starting with Excel-based surveys that we emailed to respondents. Last year we conducted our first web-based surveys, and our analysts are now using software which allows them to design and build surveys themselves.
The primary advantage of online surveys is that we can now conduct a much larger number of surveys, as well as reach a larger and more diverse audience. Moreover, there is no data entry. We are presently conducting one survey with a respondent pool of 33,000 firefighters and emergency medical technicians that would have been impossible in any other format. By placing the survey online, we can quickly make it available to very large groups of people, including people outside of state government, and easily analyze the responses.
However, throughout our experience with electronic surveys, we have encountered technical difficulties, and being prepared for these can make the process easier. Typically, these problems have occurred on the respondent's end, and may result from their use of older computers or versions of Windows. With the current survey, we have found that about half of the problems can be resolved by instructing the respondent to decrease the security settings in Internet Explorer (the survey uses "cookies", which their browser must accept). There are also some people who appear to stop taking the survey if they encounter technical glitches, as evidenced by the number of partially-completed surveys. Despite these technical problems, we have been happy with online surveys, and expect to continue using them.
Angus Magiver, Montana
We began using online survey tools to collect information/opinions in 2003. So far, we have used online surveys on two occasions and we are still in the process of evaluating their benefits and formulating guidelines for use. The first use of an online survey was in an audit of professional and occupational licensing boards where we used a survey to obtain input from professional licensees (private citizens).
The second use of online surveys was during an audit of caseworker workload in the Department of Public Health and Human Services (respondents were primarily agency personnel). The following summarizes information about the two surveys and what we learned from their use.
Email contact procedures: Both surveys used email to make initial contact with subjects. Within agencies, email availability was not a problem (all on the same network). Contacting private citizens is more difficult: for our licensing boards audit, the department database had email addresses for only around 10 percent of all licensees. Expect successful delivery rates for non-state email addresses to be lower due to changes in address or Internet service provider (we successfully delivered 69 percent of emails to licensees).
For both surveys, emails explained the purpose of the audit and survey questions and provided respondents with a URL link to the survey web page. Emails also included a password for accessing the survey page (see below). Respondents were given assurances that their responses would be treated in confidence.
Survey web page: Survey web pages were constructed by legislative branch IT staff. The URL link took respondents to an access page where they were required to enter a password (supplied in the email). Password security is necessary to prevent unauthorized access to the survey by people not included in the sample. IT staff monitored respondents' IP addresses to identify multiple submissions by a single individual (this process does not provide absolute assurance; where greater assurance is necessary, respondents would need to be supplied with unique usernames before accessing the survey).
For the survey contacting private citizens, we attempted to limit the number of questions and keep the format simple. Research showed many unsolicited online surveys work on the principle that respondents will get bored/distracted fairly quickly and prefer to be able to complete the process in less than 5 minutes. Surveys for agency personnel can assume a greater level of commitment on the part of respondents and can be longer/more detailed.
Data analysis: Responses were downloaded to an Access database maintained by IT staff. The main benefit of using online surveys is the avoidance of tedious and time-consuming data entry. Following the closing dates for the surveys, data were available more or less immediately for summarization and inclusion in work papers. If you are relying on the expertise of IT professionals to design and build the survey tool, communication is an important issue. We experienced some problems with data analysis on one of the surveys due to poor database design. Spending time explaining your analytical needs to IT staff should minimize problems with analysis of the data. This is especially true where you want to analyze data for multiple demographic groups within your sample, or where you have a large number of questions and/or inter-related questions.
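The demographic breakdowns described above amount to simple cross-tabulation once the responses are downloaded. The sketch below is purely illustrative (the field names and answer values are made up, not the actual Montana database structure), but it shows the kind of summary worth specifying to IT staff in advance:

```python
from collections import Counter, defaultdict

# Hypothetical survey records; in practice these would be
# downloaded from the survey database rather than hard-coded.
responses = [
    {"division": "Child Support", "q1": "Too high"},
    {"division": "Child Support", "q1": "About right"},
    {"division": "Field Services", "q1": "Too high"},
    {"division": "Field Services", "q1": "Too high"},
]

# Cross-tabulate answers to question 1 by agency division.
crosstab = defaultdict(Counter)
for r in responses:
    crosstab[r["division"]][r["q1"]] += 1

for division, counts in sorted(crosstab.items()):
    print(division, dict(counts))
```

If the database is designed with one clean record per respondent, including the demographic fields, breakdowns like this are trivial; if not, reconstructing them after the fact is where the analysis problems tend to appear.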
Response rates: For the survey of professional licensees, we received responses from approximately 21 percent of the individuals we contacted. Because the survey responses were intended for use primarily as background information, we decided this response rate was acceptable. Obviously, where survey responses are being used as primary audit evidence, a statistically valid and representative sample would need to be used.
For the agency personnel survey, we received responses from approximately 55 percent of the individuals we contacted. We established statistical validity based on responses from different agency divisions and according to staff roles/positions.
Overall observations: In the case of the survey of professional licensees, we would probably not have been able to obtain their input without using the online survey. For such a large and diverse group, using a traditional paper survey would have entailed a considerable effort and use of resources. Online surveys have proven to be a useful tool, allowing us to reach out to wider communities and complete audits with greater efficiency. We will certainly be making more use of online surveys in the future.
Julie Leung, Texas Office of the State Auditor
I have recently conducted an online survey. It's good to keep it short and sweet, requiring no more than five minutes to answer. If it's longer than that, you will end up with many incomplete surveys. Be sure IT gives you all the code tables if you will be the person tabulating the results.
The other thing is to iron out the system's bugs. Occasionally users hit something and can't get back into the survey.
Rick Riggs, Kansas
We use SurveyMonkey (surveymonkey.com). It's fast and easy to construct surveys, it's cheap ($19.95 per month), and it's flexible (lots of options for survey distribution and analysis).
In addition, surveys with fewer than 10 questions and 100 respondents, and that don't employ skip logic or other special features, don't cost anything. I tend to use those for quick in-house surveys about things like possible training topics, or what features staff would like in their next computer.
Kevin Dooley, U.S. General Accounting Office
We've done about 270 web surveys over the last couple of years, so we've had a pretty good experience. We have an entire support team that coaches authors on building their surveys and handles the database and web administration work needed to deploy and run the survey sites.
We've also developed our own web survey software because we couldn't find a commercial package that met all of our requirements. (We also share it with the public through our web site at http://www.gao.gov/qpl.)
Jody Hauer, Minnesota
For the past three years, Minnesota’s Office of the Legislative Auditor has used online questionnaires as part of its survey research in evaluations. Our motivation in doing so was largely twofold: 1) increase the efficiency and accuracy of survey research and 2) improve the experience for respondents as a way to encourage them to respond.
As anyone who has conducted survey research knows, the process is long and labor intensive. In our case, when respondents returned paper questionnaires, our clerical staff entered the responses into a database and evaluators checked for data-entry errors. With numerous questions and hundreds or thousands of respondents, such checking was time consuming. And although we caught many such errors, inevitably one would slip by and come to light only later, when the results of our data analyses looked strange. Online questionnaires allow us to avoid these inaccuracies. Plus, we avoid the turnaround delays associated with U.S. mail deliveries.
Beyond that, some respondents’ handwriting was less than clear; we’d have to contact them to verify what they had written. With online questionnaires, users key in their answers, and although typos occur, we have not had a problem with illegible responses. In addition, our questionnaires have often required respondents with certain answers to skip a question or two and pick up again with later questions. Online questionnaires can be designed to automatically bring respondents to the next question that is appropriate for them to answer.
Improving respondents’ experience with our surveys was the second reason for using online questionnaires. We wanted high response rates. Any step to ease the burden of completing a questionnaire was viewed as an added incentive to respond. Online questionnaires shorten the time users need to respond and allow them the convenience of responding from their computer while avoiding the need for longhand responses.
Before deciding to use online questionnaires, however, think about your audience. Online options are most beneficial when the respondents are people who are familiar with computers and the Internet; when we tried surveying building-maintenance engineers about preventive maintenance, we learned that few of them spend much time in front of a desktop computer. Offering an online option was not much help to them.
Although online questionnaires may take less time during the data-entry phase, they require more time in the upfront planning stage. We learned about the need to involve our information technology staff early when planning our survey. IT staff structure the questionnaire in HTML format and provide the programming needed to make electronic data submission possible. They also ensure security when questionnaires involve confidential data. We had to add time into our design phase for IT staff to complete their work.
Testing the online questionnaires is critical. We test them both from our office and from remote locations, using computers of varying speeds and different types of Internet browsers to make sure anyone who needs to can access the questionnaire. We also conduct “stress tests” (pasting 20 pages of text into an open-ended response, for instance) to make sure the questionnaire can handle whatever respondents throw at it.
To help users, we design questionnaire access to be as straightforward and simple as possible. We send users a step-by-step instruction sheet. For anyone who needs help when using the online questionnaire, we offer a contact name, phone, and e-mail address.
During the time span that people are responding to the questionnaire, we periodically check to make sure the server has not crashed, the questionnaire remains accessible, and the database continues to receive the electronic responses. Such checks help avoid situations where users receive only a “page not available” error message when trying to access the questionnaire.
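Periodic availability checks like these can be automated. Below is a minimal sketch, assuming a hypothetical survey URL; it only verifies that the questionnaire page can still be fetched, not that the database is receiving responses:

```python
import urllib.request
import urllib.error

def survey_is_reachable(url, timeout=10):
    """Return True if the questionnaire URL can currently be fetched."""
    try:
        # urlopen raises HTTPError (a URLError subclass) on 4xx/5xx,
        # so a successful open means the page came back.
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except (urllib.error.URLError, OSError, ValueError):
        return False

# Example: run this on a schedule while the survey is open.
# survey_is_reachable("https://example.org/survey")  # hypothetical URL
```

A scheduled job running a check like this can alert staff before respondents start hitting "page not available" errors.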
Greg Fugate, Colorado
Colorado has contracted to overhaul our office's web page, timekeeping and project management system, and intranet. As part of the intranet project, our contractor has integrated a survey tool that allows us to create and send surveys online. The system also will capture the responses in a text file and export them to Excel, ACL, SPSS, etc. We have been testing the online survey tool for a while and still have some logistical issues to work out. Hopefully, we will see this up and running soon, because the idea of conducting online surveys has gotten a very positive response from the staff.
I think one of the biggest issues with conducting online surveys is that the survey is sent over e-mail, which has implications for how we identify our population. Not everyone has e-mail. Then there are secondary issues with e-mail spam filters, etc., that may further prevent the survey from getting to its intended recipient. Although online surveys may not be any easier to administer at the front end than mail or telephone surveys, their primary advantage for our office is that they will eliminate the need to budget audit hours for data entry, because the data capture will occur as respondents complete the survey.
Jenny Wilhelm, Florida
We have been conducting online surveys for several years, and our experience has been positive, although there are a few caveats. We use online surveys for more traditional surveys (e.g., to get opinions from a population about a certain topic) as well as for other types of information gathering (e.g., collecting information from other states about a program or activity).
Most of our experience with online surveys has been positive. A major advantage is that they largely eliminate data-entry workload and error issues. Our response rates have been about the same as or better than other survey modes we’ve used. We have had the occasional “dud” online survey, where we misjudged the technical savvy of the population we were surveying or for whatever reason didn’t get the response rate we were hoping for; in some cases the technology itself may have hampered the response rate. On the flip side, we have been able to deal with problems more quickly, because we always get same-day responses from some of the people we’ve contacted to take the survey. If there is a problem, you will hear about it quickly and can fix it quickly. In contrast, with a mail questionnaire you may not know about a problem question until it is far too late to correct it.
As with any other survey mode, the up-front work is crucial. For example: 1) obtaining a correct and complete survey population (sometimes it’s not as easy to get email addresses as we thought); 2) honing the questionnaire to ask only what is necessary; and 3) working out any technical glitches. These steps will keep you from wasting time on data you didn’t need to collect or on a survey effort hindered by technology. We notify potential respondents about a survey through email (although with spam issues, we may have to revisit this practice). It can take as long or longer to verify or correct wrong email addresses as it does for postal addresses or phone numbers. Keeping your survey as short as possible is important; because online surveys eliminate data entry, you have to resist the temptation to ask more questions than necessary. With online surveys, you can save some time and money on administration: compared to a mail survey, you avoid printing and mailing costs and the staff time needed to prepare surveys for mailing. But you can lose that time advantage when you have technical difficulties.
Components of a well-designed online survey
A lot of tips we would give about well-designed online surveys are the same things we would mention about any survey, but there are a few unique to an online environment.
• Long surveys seem even longer when they are online surveys. Scrolling through screen after endless screen can cause a respondent to get frustrated and not complete the survey.
• If your online survey is in HTML, there are format limitations you need to be aware of. For example, tables are used to line up information in columns. If your question has a scale (e.g., not likely, somewhat likely, very likely) that appears only at the top of a table, it will not remain visible to respondents as they scroll down to complete the question.
• We let the technology work for us with questions that ask the respondent to skip to a question based on a response. We include a link in the survey that skips to the question where they are directed.
• We sometimes use light-colored shading to distinguish questions; it helps the eye move from one question to the next.
• We do not use any fancy graphics that would increase the survey download time. However, we use our office seal to make the survey look more official.
• We use a basic font such as Arial or Times New Roman so as to not distract from the survey purpose. Just because you have nifty-looking fonts doesn’t mean you should use them.
• We include contact information on the survey page, including an email address that is linked so respondents can click on it and email the analyst responsible for the survey if they have trouble or questions.
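One common workaround for the scale-visibility limitation noted above is to repeat the scale header every few rows of the table. The snippet below is an illustrative sketch only (hypothetical question text and field names, not any office's actual survey code) showing the idea:

```python
SCALE = ["Not likely", "Somewhat likely", "Very likely"]
QUESTIONS = [f"Question {i}" for i in range(1, 13)]  # hypothetical items
REPEAT_EVERY = 5  # re-show the scale header every 5 question rows

def header_row():
    """Build a table row repeating the scale labels."""
    cells = "".join(f"<th>{label}</th>" for label in SCALE)
    return f"<tr><th></th>{cells}</tr>"

rows = []
for i, question in enumerate(QUESTIONS):
    if i % REPEAT_EVERY == 0:  # keep the scale visible while scrolling
        rows.append(header_row())
    cells = "".join(
        f'<td><input type="radio" name="q{i}" value="{v}"></td>'
        for v in range(len(SCALE))
    )
    rows.append(f"<tr><td>{question}</td>{cells}</tr>")

html = "<table>\n" + "\n".join(rows) + "\n</table>"
print(html)
```

With 12 questions and a header every 5 rows, respondents always have the scale labels within a few rows of the question they are answering.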
I would encourage people to read an article regarding online surveys that appeared in the Winter 2002 NLPES newsletter. The author did a good job of spelling out several issues to consider when conducting online surveys.
Michael Battle, Louisiana
We attempted to do an online survey for an audit of the University of Louisiana System (2002). At that time, there were companies that offered online survey services, but from what I remember they were costly and somewhat complicated. Much may have changed since then, and individual companies may now offer more.
Another idea we played around with at the time was to create a website where survey respondents could post their answers. This website would also have the capability of exporting data to a spreadsheet for calculation. There were people out there doing this type of thing. However, we consulted with our IT people, and they said that security, time, and money were issues; we were not able to do an online survey in-house. Again, this technology may have come a long way, and such services may no longer be so difficult to obtain.
We did do a survey of local school districts by email. However, you may run into problems. For example, not all districts have email, and some who have accounts do not know how to use email. Overall, this was easier than mailing surveys, but it is not hassle free.
Ethel Detch, Tennessee
We have not done any on-line surveys to date, although we may consider them in the near future. I think part of our reluctance stems from the lack of computers and computer expertise among the populations that we typically survey, such as school officials, many of whom are in very rural areas, and local government officials, such as court clerks. All of that has improved immensely in recent years, though, so we may try it soon. We have done some in-house surveys on-line within our department.
Jill Jensen, Connecticut
We have not done any online surveys to date. However, we are interested in trying them in the future once we research the technology and set aside time for some training.
We are very interested in hearing about others' experiences (e.g., how are response rates, how do they compare with mail surveys, are there any privacy or quality control issues, is inputting and analysis of results simplified?). We'd also welcome suggestions for training, manuals, and software.