Online Questionnaires and Surveys

The world is changing, and so are your research plans and options. We’re answering questions posted on MethodSpace and at related online events. Find the whole Q & A series here, and post your own questions. The questions discussed in this post were asked at the webinar When the Field is Online. The recording of this webinar, hosted by NVivo, is available for viewing.

One question was posted about questionnaires as potential tools for collecting qualitative data online. I have expanded my response to include resources about surveys as well as questionnaires, whether used in qualitative, quantitative, or mixed methods studies. These selected open-access resources include sample chapters from SAGE texts and journal articles.

SAGE Books about Questionnaires and Surveys

Designing Research Questionnaires for Business and Management Students (2015) by Yuksel Ekinci


Sample chapter 1

Designing Quality Survey Questions (2018) by Sheila B. Robinson and Kimberly Firth Leonard

Sample chapters 1 and 4

Conducting Online Surveys, Second Edition (2011) by Valerie M. Sue and Lois A. Ritter

Sample chapters 1 and 2

Doing Surveys Online (2015) by Vera Toepoel 

Sample chapter 1

In addition to these books, see two short Little Quick Fix books about questionnaires by Helen Kara. No sample chapters are available.

Open Access Articles

Note: These articles are available open access only through the links provided here.

Ball, H. L. (2019). Conducting Online Surveys. Journal of Human Lactation, 35(3), 413–417. https://doi.org/10.1177/0890334419848734

https://journals.sagepub.com/stoken/default+domain/10.1177%2F0890334419848734/full

Abstract. There is an established methodology for conducting survey research that aims to ensure rigorous research and robust outputs. With the advent of easy-to-use online survey platforms, however, the quality of survey studies has declined. This article summarizes the pros and cons of online surveys and emphasizes the key principles of survey research, for example questionnaire validation and sample selection. Numerous texts are available to guide researchers in conducting robust survey research online, however this is neither a quick nor easy undertaking. While online survey websites and software are useful for assisting in questionnaire design and delivery, they can also introduce sources of bias. Researchers considering conducting online surveys are encouraged to read carefully about how the principles of survey research can be applied to online formats in order to reduce bias and enhance rigor.

Behr, D., Kaczmirek, L., Bandilla, W., & Braun, M. (2012). Asking Probing Questions in Web Surveys: Which Factors have an Impact on the Quality of Responses? Social Science Computer Review, 30(4), 487–498. https://doi.org/10.1177/0894439311435305

Abstract. Cognitive interviewing is a well-established method for evaluating and improving a questionnaire prior to fielding. However, its present implementation brings with it some challenges, notably in terms of small sample sizes or the possibility of interviewer effects. In this study, the authors test web surveys through nonprobability online panels as a supplemental means to implement cognitive interviewing techniques. The overall goal is to tackle the above-mentioned challenges. The focus in this article is on methodological features that pave the way for an eventual successful implementation of category-selection probing in web surveys. The study reports on the results of 1,023 respondents from Germany. In order to identify implementation features that lead to a high number of meaningful answers, the authors explore the effects of (1) different panels, (2) different probing variants, and (3) different numbers of preceding probes on answer quality. The overall results suggest that category-selection probing can indeed be implemented in web surveys. Using data from two panels—a community panel where members can actively get involved, for example, by creating their own polls, and a “conventional” panel where answering surveys is the members' only activity—the authors find that high community involvement does not increase the likelihood to answer probes or produce longer statements. Testing three probing variants that differ in wording and provided context, the authors find that presenting the context of the probe (i.e., the probed item and the respondent’s answer) produces a higher number of meaningful answers. Finally, the likelihood to answer a probe decreases with the number of preceding probes. However, the word count of those who eventually answer the probes slightly increases with an increasing number of probes.

Brosnan, K., Kemperman, A., & Dolnicar, S. (2019). Maximizing participation from online survey panel members. International Journal of Market Research. https://doi.org/10.1177/1470785319880704

https://journals.sagepub.com/stoken/default+domain/10.1177%2F1470785319880704/full

Abstract. Low survey participation from online panel members is a key challenge for market and social researchers. We identify 10 key drivers of panel members’ online survey participation from a qualitative study and then determine empirically using a stated choice experiment the relative importance of each of those drivers at aggregate and segment levels. We contribute to knowledge on survey participation by (a) eliciting key drivers of survey participation by online panel members, (b) determining the relative importance of each driver, and (c) accounting for heterogeneity across panel members in the importance assigned to drivers. Findings offer immediate practical guidance to market and social researchers on how to increase participation in surveys using online panels.

Buchanan, E. A., & Hvizdak, E. E. (2009). Online Survey Tools: Ethical and Methodological Concerns of Human Research Ethics Committees. Journal of Empirical Research on Human Research Ethics, 4(2), 37–48. https://doi.org/10.1525/jer.2009.4.2.37

https://journals.sagepub.com/stoken/default+domain/10.1525%2FJER.2009.4.2.37/full

Abstract. A survey of 750 university human Research Ethics Boards (HRECs) in the United States revealed that Internet research protocols involving online or Web surveys are the type most often reviewed (94% of respondents), indicating the growing prevalence of this methodology for academic research. Respondents indicated that the electronic and online nature of these survey data challenges traditional research ethics principles such as consent, risk, privacy, anonymity, confidentiality, and autonomy, and adds new methodological complexities surrounding data storage, security, sampling, and survey design. Interesting discrepancies surfaced among respondents regarding strengths and weaknesses within extant guidelines, which are highlighted throughout the paper. The paper concludes with considerations and suggestions towards consistent protocol review of online surveys to ensure appropriate human subjects protections in the face of emergent electronic tools and methodologies.

Decorte, T., Malm, A., Sznitman, S. R., Hakkarainen, P., Barratt, M. J., Potter, G. R., & Asmussen Frank, V. (2019). The challenges and benefits of analyzing feedback comments in surveys: Lessons from a cross-national online survey of small-scale cannabis growers. Methodological Innovations. https://doi.org/10.1177/2059799119825606

Abstract. It is common practice in survey questionnaires to include a general open and non-directive feedback question at the end, but the analysis of this type of data is rarely discussed in the methodological literature. While these open-ended comments can be useful, most researchers fail to report on this issue. The aim of this article is to illustrate and reflect upon the benefits and challenges of analyzing responses to open-ended feedback questions. The article describes the experiences of coding and analyzing data generated through a feedback question at the end of an international online survey with small-scale cannabis cultivators carried out by the Global Cannabis Cultivation Research Consortium. After describing the design and dataset of the web survey, the analytical approach and coding frame are presented. The analytical strategies chosen in this study illustrate the diversity and complexity of feedback comments which pose methodological challenges to researchers wishing to use them for data analyses. In this article, three types of feedback comments (political/policy comments, general comments of positive and negative appreciation, and methodological comments) are used to illustrate the difficulties and advantages of analyzing this type of data. The advantages of analyzing feedback comments are well known, but they seem to be rarely exploited. General feedback questions at the end of surveys are typically non-directive. If researchers want to use these data for research and analyses, they need a clear strategy. They ought to give enough thought to why they are including this type of question, and develop an analytical strategy at the design stage of the study.

Mei, B., & Brown, G. T. L. (2018). Conducting Online Surveys in China. Social Science Computer Review, 36(6), 721–734. https://doi.org/10.1177/0894439317729340

https://journals.sagepub.com/stoken/default+domain/10.1177%2F0894439317729340/full

Abstract. Using online surveys is becoming increasingly extensive and widespread. Social science research in China is no exception. However, due to contextual factors (e.g., technological constraints, social and cultural norms, and language barriers), prior successful methods may not apply. This article reports an alternative way of conducting online surveys in China, by combining local commercial online survey service providers with indigenous Web 2.0 applications. The case study demonstrates the feasibility of this approach and provides practical advice (e.g., adding incentives) on how to effectively conduct online surveys in China.

Wells, T., Bailey, J. T., & Link, M. W. (2014). Comparison of Smartphone and Online Computer Survey Administration. Social Science Computer Review, 32(2), 238–255. https://doi.org/10.1177/0894439313505829

https://journals.sagepub.com/stoken/default+domain/10.1177%2F0894439313505829/full

Abstract. The dramatic rise of smartphones has profound implications for survey research. Namely, can smartphones become a viable and comparable device for self-administered surveys? The current study is based on approximately 1,500 online U.S. panelists who were smartphone users and who were randomly assigned to the mobile app or online computer mode of a survey. Within the survey, we embedded several experiments that had been previously tested in other modes (mail, PC web, mobile web). First, we test whether responses in the mobile app survey are sensitive to particular experimental manipulations as they are in other modes. Second, we test whether responses collected in the mobile app survey are similar to those collected in the online computer survey. Our mobile survey experiments show that mobile survey responses are sensitive to the presentation of frequency scales and the size of open-ended text boxes, as are responses in other survey modes. Examining responses across modes, we find very limited evidence for mode effects between mobile app and PC web survey administrations. This may open the possibility for multimode (mobile and online computer) surveys, assuming that certain survey design recommendations for mobile surveys are used consistently in both modes.

Wieters, K. M. (2016). Advantages of Online Methods in Planning Research: Capturing Walking Habits in Different Built Environments. SAGE Open. https://doi.org/10.1177/2158244016658082

Abstract. This article examines the effectiveness of using online survey methods in planning research. This study measured travel behavior and physical activity of office workers over a month-long period. An online travel diary, pedometer, and online survey were used to assess walking levels and transportation habits for office workers. A subset of the sample used a paper travel diary, which was used for comparison. Analysis of missing data was performed to explore implications of using online survey methods. Using online travel diaries and surveys to assess objective and subjective data can help to reduce recall bias and missing data, and allows greater flexibility in survey administration.
