Collect Data with Online Surveys

By Janet Salmons, Ph.D.

Dr. Salmons is the author of Doing Qualitative Research Online and Gather Your Data Online. Use the code COMMUNIT24 for 25% off research books purchased from Sage through December 31, 2024.


Introduction to web surveys

When survey tools first became available online, it seemed that data collection would get much easier: no paper to mail, automated compilation and analysis tools, and the ability to reach respondents on computers as well as mobile devices all pointed to clear advantages. Naturally, as researchers began to conduct survey research online, new opportunities and challenges became apparent.

What is a survey? Allen (2017) offered this definition of the research method:

A survey is a set of questions or statements to which participants give responses. A survey provides one of the best methods to obtain a large amount of data from participants. In this manner, survey data can provide a quantitative, qualitative, and/or numeric description of trends, attitudes, or opinions of a population by studying a sample of that population. 

While research terms are often used differently across disciplinary and methodological contexts, I use the term survey to describe a way to collect data for quantitative research using a tested instrument. I use the term questionnaire to describe a way to collect data for qualitative studies based on questions devised by the researcher. By this definition, questionnaires can include more open-ended questions that invite narrative responses, while surveys rely more on check-box and Likert-scale items. The definition from Groves et al. (2004) aligns with this way of thinking:

A survey is a systematic method for gathering information from (a sample of) entities for the purpose of constructing quantitative descriptors of the attributes of the larger population of which the entities are members (p. 2).

This excerpt from an entry in SAGE Research Methods Foundations by Stern, LeClere, and Fordyce (2019) discusses surveys conducted online:

Since the early 2000s, surveys completed via the Internet have become an important part of the data collection lexicon. It is the case that web surveys are now used for large multimillion-dollar survey efforts with tens of thousands of survey units as well as for small-scale evaluation and appraisal work. While Internet use is ubiquitous, the best practices for designing, implementing, and administering web surveys still remain in flux because of the evolving nature of technology use as well as rapid changes in the technology itself. The majority of respondents no longer simply use desktop computers tied to a fixed position to navigate through a survey hosted on the Internet. Surveys are now completed on smartphones, tablets, laptops using a mouse, touch screen, voice, or linked device. The design process thus also requires designing the new as well as extant technologies.

As a result of this rapid evolution, there is uncertainty surrounding the very definition of what constitutes a web survey; that is, with the multitude of ways in which researchers can deliver surveys that are technically “over the web,” the web survey has been transformed from a mode of data collection to a “multidimensional” concept (Couper, 2011, p. 892). For the purposes of this discussion, a web survey is defined as a survey whereby respondents can be reached and complete a questionnaire via location-based broadband Internet access, Internet-enabled mobile devices, social media, or text messaging.

Groves, R. M., Fowler, F. J., Couper, M. P., Lepkowski, J. M., Singer, E., & Tourangeau, R. (2004). Survey methodology. Hoboken, NJ: Wiley.

Stern, M. J., LeClere, F., & Fordyce, E. (2019). Web Surveying Design and Implementation. In P. Atkinson, S. Delamont, A. Cernat, J.W. Sakshaug, & R.A. Williams (Eds.), SAGE Research Methods Foundations. https://www.doi.org/10.4135/9781526421036827920


Open-access articles about survey methods

In this collection of open-access articles, researchers discuss how they conducted survey research, addressing topics that include:

  • Designing the survey

  • Improving the layout

  • Understanding limitations

  • Maximizing attention and completion

  • Avoiding fraud

  • Recruiting participants

  • Including participants who speak multiple languages

  • Tracking use of multiple devices

  • Using video

Andrade, C. (2020). The Limitations of Online Surveys. Indian Journal of Psychological Medicine, 42(6), 575-576. https://doi.org/10.1177/0253717620957496

Abstract. Online surveys are growing in popularity, perhaps because they are an easy, convenient, and inexpensive means of data collection. Online surveys commonly suffer from two serious methodological limitations: the population to which they are distributed cannot be described, and respondents with biases may select themselves into the sample. Research is of value only when the findings from a sample can be generalized to a meaningful population. When the population addressed by the survey cannot be described, and when the sample is contaminated by respondents with biases, findings from online surveys cannot be generalized and may therefore mislead.

Brosnan, K., Kemperman, A., & Dolnicar, S. (2021). Maximizing participation from online survey panel members. International Journal of Market Research, 63(4), 416-435. https://doi.org/10.1177/1470785319880704

Abstract. Low survey participation from online panel members is a key challenge for market and social researchers. We identify 10 key drivers of panel members’ online survey participation from a qualitative study and then determine empirically using a stated choice experiment the relative importance of each of those drivers at aggregate and segment levels. We contribute to knowledge on survey participation by (a) eliciting key drivers of survey participation by online panel members, (b) determining the relative importance of each driver, and (c) accounting for heterogeneity across panel members in the importance assigned to drivers. Findings offer immediate practical guidance to market and social researchers on how to increase participation in surveys using online panels.

Decorte, T., Malm, A., Sznitman, S. R., Hakkarainen, P., Barratt, M. J., Potter, G. R., Werse, B., Kamphausen, G., Lenton, S., & Asmussen Frank, V. (2019). The challenges and benefits of analyzing feedback comments in surveys: Lessons from a cross-national online survey of small-scale cannabis growers. Methodological Innovations. https://doi.org/10.1177/2059799119825606

Abstract. It is common practice in survey questionnaires to include a general open and non-directive feedback question at the end, but the analysis of this type of data is rarely discussed in the methodological literature. While these open-ended comments can be useful, most researchers fail to report on this issue. The aim of this article is to illustrate and reflect upon the benefits and challenges of analyzing responses to open-ended feedback questions. The article describes the experiences of coding and analyzing data generated through a feedback question at the end of an international online survey with small-scale cannabis cultivators carried out by the Global Cannabis Cultivation Research Consortium. After describing the design and dataset of the web survey, the analytical approach and coding frame are presented. The analytical strategies chosen in this study illustrate the diversity and complexity of feedback comments which pose methodological challenges to researchers wishing to use them for data analyses. In this article, three types of feedback comments (political/policy comments, general comments of positive and negative appreciation, and methodological comments) are used to illustrate the difficulties and advantages of analyzing this type of data. The advantages of analyzing feedback comments are well known, but they seem to be rarely exploited. General feedback questions at the end of surveys are typically non-directive. If researchers want to use these data for research and analyses, they need a clear strategy. They ought to give enough thought to why they are including this type of question, and develop an analytical strategy at the design stage of the study.

He, Q., & Li, Y. (2023). Civic Engagement Intention and the Data-Driven Fan Community: Investigating the Motivation Behind Chinese Fans’ Online Data-Making Behavior From a Collective Action Perspective. Social Media + Society, 9(1). https://doi.org/10.1177/20563051221150409

Abstract. Enabled by social media, the data frenzy in the data-driven fandom culture in China has attracted widespread attention. Unlike most forms of data labor and fan activities, Chinese fans’ online data-making behavior (zuoshuju) appears tedious, time- and money-consuming, and overwhelmingly irrational. Therefore, this study aimed to examine the sociopsychological motivation of fans’ online data-making behavior from a collective action perspective. Based on survey data from 588 respondents with fandom experiences online in China, this study (1) distinguished two types of online data-making (operational and monetary); (2) suggested that celebrity worship and civic engagement intention were antecedents of online data-making; and (3) found that fan communities facilitated by social media bridged the effects of sociopsychological factors and data-making behavior. This research introduced the collective action perspective and constructed a quantitative path model to test the underlying mechanism and impetus of fans’ data-making practices in China, adding quantitative support to the knowledge of the hybrid pattern of collective actions embedded in the datafication world. It contributes to the understanding of Chinese youth culture and civic engagement through social media.

Kalleitner, F., Mühlböck, M., & Kittel, B. (2022). What’s the Benefit of a Video? The Effect of Nonmaterial Incentives on Response Rate and Bias in Web Surveys. Social Science Computer Review, 40(3), 700-716. https://doi.org/10.1177/0894439320918318

Abstract. Traditional survey research faces declining response rates due to changing cultural habits and technological developments. Researchers have developed novel approaches to increase respondents’ likelihood of participating in web surveys. However, we lack information about whether these methods indeed increase response rates and, if so, whether they bias the resulting data. This article focuses on the use of nonmaterial incentives in the form of a video that provides the invitees with information tailored to their life situation. Analysis of our experimental data shows that instead of increasing respondents’ probability of starting the survey, the video treatments actually decrease it. We provide evidence that the lower salience of the intrinsic benefits of survey participation in the invitation email significantly contributes to this reduction. Additionally, the effect of the nonmaterial incentive differs across subgroups, affecting nonresponse biases in line with employment status, gender, and migration background. We therefore conclude that using additional information in the form of a video as a nonmaterial survey incentive is only suitable under specific conditions.

Mancosu, M., Ladini, R., & Vezzoni, C. (2019). ‘Short is Better’. Evaluating the Attentiveness of Online Respondents Through Screener Questions in a Real Survey Environment. Bulletin of Sociological Methodology/Bulletin de Méthodologie Sociologique, 141(1), 30-45. https://doi.org/10.1177/0759106318812788

Abstract. In online surveys, the control of respondents is almost absent: for this reason, the use of screener questions or “screeners” has been suggested to evaluate respondent attention. Screeners ask respondents to follow a certain number of instructions described in a text that contains a varying amount of misleading information. Previous work focused on ad-hoc experimental designs composed of a few questions, generally administered to small samples. Using an experiment inserted into an Italian National Election Study survey (N=3,000), we show that short screeners – namely, questions with a reduced amount of misleading information – should be preferred to longer screeners in evaluating the attentiveness of respondents. We also show there is no effect of screener questions in activating respondent attention.

Toepoel, V., Mathon, K., Tussenbroek, P., & Lugtig, P. (2021). Probing in online mixed-device surveys: Is a research messenger layout more effective than a traditional online layout, especially on mobile devices? Bulletin of Sociological Methodology/Bulletin de Méthodologie Sociologique, 151(1), 74–95. https://doi.org/10.1177/07591063211019953

Abstract. This article compares the effectiveness of a research messenger layout to a traditional online layout with regards to probing. Responses to different types of probes (explanation, elaboration and category selection probes) were examined in terms of length and quality, measured by number of characters, number of themes, and an indicator for response quality. The research messenger layout, regardless of device being used, had a negative effect on both response length, number of themes and response quality. Further, we found that in both the traditional and research messenger layout, using a mobile device negatively affects the number of characters and themes used in probed responses. We conclude that probing is most effective when a traditional survey is completed on a computer. The research messenger layout was not able to generate responses of similar quality compared to the traditional layout, regardless of device being used.

Tong, Q., Cui, J., & Ren, B. (2023). Space Connected, Emotion Shared: Investigating Users of Digital Chinese Cultural Heritage. Emerging Media, 0(0). https://doi.org/10.1177/27523543231216774

Abstract. Empowered by the Internet and digital technology, Chinese cultural heritage can intuitively convey its historical stories and share its esthetic art with online users via “social + intelligence + sharing.” Using a survey (N = 783) and semi-structured interview (N = 20), adapting and extending the theoretical framework of the third space and interactive ritual chain, this paper analyzes online users’ perceptions, emotions, and actions toward digital Chinese cultural heritage. The results indicate that the technology-enabled communication of digital cultural heritage can construct a kind of third cultural space (immersion & interculturality) where the interactive ritual chain can continuously drive and superimpose online users’ emotional energy. Such emotional energy is a kind of common human emotion that can transcend nationalities. During this process, traditional Chinese culture, popular culture, and global culture are connected.

Van Quaquebeke, N., Salem, M., van Dijke, M., & Wenzel, R. (2022). Conducting organizational survey and experimental research online: From convenient to ambitious in study designs, recruiting, and data quality. Organizational Psychology Review, 12(3), 268-305. https://doi.org/10.1177/20413866221097571

Abstract. Conducting organizational research via online surveys and experiments offers a host of advantages over traditional forms of data collection when it comes to sampling for more advanced study designs, while also ensuring data quality. To draw attention to these advantages and encourage researchers to fully leverage them, the present paper is structured into two parts. First, along a structure of commonly used research designs, we showcase select organizational psychology (OP) and organizational behavior (OB) research and explain how the Internet makes it feasible to conduct research not only with larger and more representative samples, but also with more complex research designs than circumstances usually allow in offline settings. Subsequently, because online data collections often also come with some data quality concerns, in the second section, we synthesize the methodological literature to outline three improvement areas and several accompanying strategies for bolstering data quality.

Young, A., Espinoza, F., Dodds, C., Rogers, K., & Giacoppo, R. (2021). Adapting an Online Survey Platform to Permit Translanguaging. Field Methods, 33(4), 388-404. https://doi.org/10.1177/1525822X21993966

Abstract. This article concerns online data capture using survey methods when the target population(s) comprise not just of several different language-using groups, but additionally populations who may be multilingual and whose total language repertoires are commonly employed in meaning-making practices—commonly referred to as translanguaging. It addresses whether current online data capture survey methods adequately respond to such population characteristics and demonstrates a worked example of how we adapted one electronic data capture software platform (REDCap) to present participants with not just multilingual but translanguaging engagement routes that also encompassed multimodal linguistic access in auditory, orthographic, and visual media. The study population comprised deaf young people. We share the technical (coding) adaptations made and discuss the relevance of our work for other linguistic populations.


Learn more about survey research with these Sage books and author interviews.

Use the code COMMUNIT24 for 25% off through December 31, 2024.

Laura Wilson and Emma Dickinson discuss Respondent Centred Surveys.

Jan Eichhorn explains some key steps helpful to any researcher considering the use of surveys in quantitative or mixed methods research.


Sage Research Methods Community posts about survey research

