Design Studies with Online Surveys

by Janet Salmons, PhD, Research Community Manager for SAGE Methodspace

Research design is the focus for the first quarter of 2023. Find the unfolding series here. Want to know more about survey research in particular? See these recent posts.


Open-Access Articles from SAGE Journals

Online surveys, whether completed in a browser or in an app, have largely replaced those completed with pen and paper. The articles below offer advice and recommendations for researchers on designing studies that use electronic surveys.

Brosnan, K., Kemperman, A., & Dolnicar, S. (2021). Maximizing participation from online survey panel members. International Journal of Market Research, 63(4), 416–435. https://doi.org/10.1177/1470785319880704

Abstract. Low survey participation from online panel members is a key challenge for market and social researchers. We identify 10 key drivers of panel members’ online survey participation from a qualitative study and then determine empirically using a stated choice experiment the relative importance of each of those drivers at aggregate and segment levels. We contribute to knowledge on survey participation by (a) eliciting key drivers of survey participation by online panel members, (b) determining the relative importance of each driver, and (c) accounting for heterogeneity across panel members in the importance assigned to drivers. Findings offer immediate practical guidance to market and social researchers on how to increase participation in surveys using online panels.
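
As a rough illustration of the analytic step this abstract describes, the sketch below shows one simple way to turn choice-experiment coefficients into relative-importance shares. It is not the authors' code: the data file, the driver names, and the plain-logit shortcut (standing in for a full stated choice model) are all assumptions.

```python
import pandas as pd
import statsmodels.api as sm

# Each row is one alternative shown in a choice task; "chosen" is 1 if
# the panel member picked that alternative. Driver columns are dummy-coded
# attribute levels (all names here are hypothetical).
df = pd.read_csv("choice_tasks.csv")
drivers = ["incentive", "topic_interest", "short_survey", "mobile_friendly"]

# A plain logit over chosen/not-chosen alternatives stands in for the
# stated choice model; coefficients index each driver's pull on choice.
model = sm.Logit(df["chosen"], sm.add_constant(df[drivers])).fit()

# Normalize absolute coefficients into rough relative-importance shares.
importance = model.params[drivers].abs()
print((importance / importance.sum()).sort_values(ascending=False))
```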

Liu, M. (2020). Soliciting email addresses to re-contact online survey respondents: Results from web experiments. Methodological Innovations, 13(2). https://doi.org/10.1177/2059799120937237

Abstract. There are many occasions where contact information needs to be collected from survey participants in order to make future contact and conduct follow-up surveys. This article reports findings from two experiments on collecting respondent email addresses and sending second-survey invitations. In the email collection experiment, more respondents provided their email addresses when only one follow-up survey was mentioned, compared to when the emphasis was on the research purpose of the follow-up survey. However, follow-up survey participation rates were similar among respondents who provided their email addresses, regardless of the wording of the request. The invitation email subject line experiment shows that a generic request for opinions reduced follow-up survey participation compared to subject lines emphasizing the survey sponsor and specialty opinions.
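
At its core, the wording comparison described above is a comparison of two proportions. A minimal sketch, with made-up counts, of how such a comparison might be run:

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts: how many respondents shared an email address under
# each request wording, out of how many saw each version.
provided = [312, 268]
shown = [500, 500]

stat, pvalue = proportions_ztest(provided, shown)
print(f"z = {stat:.2f}, p = {pvalue:.4f}")
```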

Marpsat, M., & Razafindratsima, N. (2010). Survey Methods for Hard-to-Reach Populations: Introduction to the Special Issue. Methodological Innovations Online, 5(2), 3–16. https://doi.org/10.4256/mio.2010.0014

Abstract. Surveys of hard-to-reach populations (rare populations with no known sampling frames) have for some years been the object of methodological reflection. Various methods aiming at the production of an ‘extrapolable’ sample of these populations have been proposed: time-space sampling (TSS) or time-location sampling (TLS), respondent-driven sampling (RDS), and the ‘capture-recapture’ method. After defining what a hard-to-reach population is, this article provides an outline of these various approaches before going on to briefly consider the papers contained in this special issue.

Mayer, A. (2021). Reducing respondents’ perceptions of bias in survey research. Methodological Innovations, 14(3). https://doi.org/10.1177/20597991211055952

Abstract. Response rates for surveys have declined steadily over the last few decades. During this period, trust in science has also waned and conspiratorial theorizing around scientific issues has seemingly become more prevalent. In our prior work, we found that a significant portion of respondents will claim that a given survey is “biased.” In this follow-up research, we qualify these perceptions of bias and point to potential causes and ameliorative mechanisms.

Muñoz van den Eynde, A., & Lobera, J. (2022). Analysis of the tendency to select the “neither nor” option in agreement/disagreement scales on a low-salience topic: The contribution of individual differences. Methodological Innovations, 15(3), 289–302. https://doi.org/10.1177/20597991221127407

Abstract. This paper examines the frequency of midpoint responses in agree/disagree scales in a survey measuring attitudes toward science and scientific policy, a low-salience and difficult topic. It also examines the contribution of individual differences in explaining the tendency to select this option. It is assumed that the use of the midpoint “Neither agree nor disagree” (NA/ND) in Agree/Disagree (A/D) scales is to some extent an indication of satisficing. It is also assumed that there are individual differences in respondents’ tendency to select the NA/ND response. Using a generalized linear mixed model, we include Krosnick’s regulators of satisficing, socio-demographics, and individual differences as predictors. We find that the contribution of the regulators of satisficing identified by Krosnick is small. In turn, factors associated with individual differences explain a large amount of variance in the number of NA/ND responses. We conclude that the presence of this option in a survey on a low-salience topic increases satisficing as a strategy for respondents to deal with the cognitive burden of both the A/D scale and the difficulty of the topic. Furthermore, the number of NA/ND responses may be understood as an indicator of individual differences in respondents’ propensity to satisfice when answering a survey on a low-salience or difficult topic.
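
For readers who want to see what a model of this kind looks like in code, here is a hedged sketch. The authors fit a generalized linear *mixed* model; this simplified version fits a plain Poisson GLM of midpoint counts instead, and the data file and every variable name are hypothetical.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# One row per respondent; n_midpoint counts their "neither agree nor
# disagree" answers across the scale items. Predictors are hypothetical
# stand-ins for satisficing regulators and individual differences.
df = pd.read_csv("survey_respondents.csv")

model = smf.glm(
    "n_midpoint ~ topic_interest + education + age + need_for_cognition",
    data=df,
    family=sm.families.Poisson(),
).fit()
print(model.summary())
```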

Ochoa, C., & Revilla, M. (2018). To what extent are members of an online panel willing to share different data types? A conjoint experiment. Methodological Innovations, 11(2). https://doi.org/10.1177/2059799118796017

Abstract. Recently, the idea of ‘data fusion’, that is, of combining different types of data, became quite popular because of the advances of new technologies. In particular, several studies started investigating the possibility of combining survey data with other data types in order to get a more complete or accurate picture of reality and/or to reduce survey burden. One key element, then, is the willingness of people to share different types of data beyond survey answers. In this article, we investigate to what extent members of an opt-in online panel in Spain are willing to share different types of information that have generally not been studied before in the literature: records of their surrounding sound (audiotracking), information from their email inbox (in different ways: sharing the email credentials, using an email plug-in, or redirecting emails, partially or totally), sensorial reactions measured by a wearable device (neuroscience), and public information about them available online. We use a choice-based conjoint analysis in order to study the level of willingness depending on the incentives offered in exchange, and we present the level of willingness by gender and age groups. Overall, we find huge differences in the level of willingness across data types. Increasing the incentives, by contrast, does not improve willingness much, even though there is a positive trend. Some differences are observed across gender and age groups, but most of them are not statistically significant.
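
One small, concrete piece of a choice-based conjoint setup is building the grid of attribute profiles from which choice tasks are drawn. The sketch below does only that; the attribute levels are illustrative and not taken from the article.

```python
from itertools import product

# Illustrative attribute levels; the article's actual data types and
# incentive levels differ.
data_types = ["audio tracking", "email access", "wearable sensor", "online public data"]
incentives = ["no extra incentive", "small incentive", "large incentive"]

# Full factorial of profiles; choice tasks would pair profiles from this grid.
for i, (dtype, incentive) in enumerate(product(data_types, incentives), 1):
    print(f"Profile {i:2d}: share {dtype} in exchange for {incentive}")
```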

Revilla, M., Paura, E., & Ochoa, C. (2021). Use of a research app in an online opt-in panel: The Netquest case. Methodological Innovations, 14(1). https://doi.org/10.1177/2059799120985373

Abstract. The increasing use of mobile devices for online surveys has been accompanied by the development of research apps. These research apps have the potential to make the process easier for respondents (e.g., being able to complete surveys when Internet access is not available gives participants more freedom over when and where they participate) and for fieldwork companies (e.g., the possibility of using push notifications could lead to higher participation rates). However, previous research suggests that panelists may also be reluctant to install an app. In this study, we answer research questions related to the knowledge and use of the Netquest app. We found that a majority of panelists did not know about the app, and although sending invitations significantly increased installation, the overall number of respondents installing the app remained low. Furthermore, the profile of those who installed the app differs from those who did not. Panelists’ participation after installing the app appears stable. The main reason for installing the app is comfort, while the main reason for not installing it relates to space/battery usage. Most of those who did not install the app indicated they would be willing to do so.

Sakshaug, J. W., Vicari, B., & Couper, M. P. (2019). Paper, E-mail, or Both? Effects of Contact Mode on Participation in a Web Survey of Establishments. Social Science Computer Review, 37(6), 750–765. https://doi.org/10.1177/0894439318805160

Abstract. Identifying strategies that maximize participation rates in population-based web surveys is of critical interest to survey researchers. While much of this interest has focused on surveys of persons and households, there is a growing interest in surveys of establishments. However, there is a lack of experimental evidence on strategies for optimizing participation rates in web surveys of establishments. To address this research gap, we conducted a contact mode experiment in which establishments selected to participate in a web survey were randomized to receive the survey invitation with login details and subsequent reminder using a fully crossed sequence of paper and e-mail contacts. We find that a paper invitation followed by a paper reminder achieves the highest response rate and smallest aggregate nonresponse bias across all possible paper/e-mail contact sequences, but a close runner-up was the e-mail invitation and paper reminder sequence, which achieved a similarly high response rate and low aggregate nonresponse bias at about half the per-respondent cost. Following up undeliverable e-mail invitations with supplementary paper contacts yielded further reductions in nonresponse bias and costs. Finally, for establishments without an available e-mail address, we show that enclosing an e-mail address request form with a prenotification letter is not effective from a response rate, nonresponse bias, and cost perspective.

Stoycheff, E. (2016). Please participate in Part 2: Maximizing response rates in longitudinal MTurk designs. Methodological Innovations, 9. https://doi.org/10.1177/2059799116672879

Abstract. The ease and affordability of Amazon’s Mechanical Turk make it ripe for longitudinal, or panel, study designs in social science research. But the discipline has not yet investigated how incentives in this “online marketplace for work” may influence unit non-response over time. This study tests classic economic theory against social exchange theory and finds that despite Mechanical Turk’s transactional nature, expectations of reciprocity and social contracts are important determinants of participating in a study’s Part 2. Implications for future research are discussed.

Toepoel, V., Mathon, K., Tussenbroek, P., & Lugtig, P. (2021). Probing in online mixed-device surveys: Is a research messenger layout more effective than a traditional online layout, especially on mobile devices? Bulletin of Sociological Methodology/Bulletin de Méthodologie Sociologique, 151(1), 74–95. https://doi.org/10.1177/07591063211019953

Abstract. This article compares the effectiveness of a research messenger layout to a traditional online layout with regard to probing. Responses to different types of probes (explanation, elaboration, and category selection probes) were examined in terms of length and quality, measured by number of characters, number of themes, and an indicator of response quality. The research messenger layout, regardless of the device being used, had a negative effect on response length, number of themes, and response quality. Further, we found that in both the traditional and research messenger layouts, using a mobile device negatively affects the number of characters and themes in probed responses. We conclude that probing is most effective when a traditional survey is completed on a computer. The research messenger layout was not able to generate responses of quality similar to the traditional layout, regardless of the device being used.
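
The length-and-quality comparison described above boils down to grouping responses by layout and device. A minimal sketch, with a hypothetical data file and column names:

```python
import pandas as pd

# One row per probed open-ended answer; layout is "messenger" or
# "traditional", device is "mobile" or "computer" (all names hypothetical).
df = pd.read_csv("probe_responses.csv")
df["n_chars"] = df["response_text"].str.len()

print(df.groupby(["layout", "device"])["n_chars"].agg(["mean", "median", "count"]))
```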

Van Quaquebeke, N., Salem, M., van Dijke, M., & Wenzel, R. (2022). Conducting organizational survey and experimental research online: From convenient to ambitious in study designs, recruiting, and data quality. Organizational Psychology Review. https://doi.org/10.1177/20413866221097571

Abstract. Conducting organizational research via online surveys and experiments offers a host of advantages over traditional forms of data collection when it comes to sampling for more advanced study designs, while also ensuring data quality. To draw attention to these advantages and encourage researchers to fully leverage them, the present paper is structured into two parts. First, along a structure of commonly used research designs, we showcase select organizational psychology (OP) and organizational behavior (OB) research and explain how the Internet makes it feasible to conduct research not only with larger and more representative samples, but also with more complex research designs than circumstances usually allow in offline settings. Subsequently, because online data collections often also come with some data quality concerns, in the second section, we synthesize the methodological literature to outline three improvement areas and several accompanying strategies for bolstering data quality.

Wieters, K. M. (2016). Advantages of Online Methods in Planning Research: Capturing Walking Habits in Different Built Environments. SAGE Open, 6(3). https://doi.org/10.1177/2158244016658082

Abstract. This article examines the effectiveness of using online survey methods in planning research. This study measured the travel behavior and physical activity of office workers over a one-month period. An online travel diary, a pedometer, and an online survey were used to assess walking levels and transportation habits for office workers. A subset of the sample used a paper travel diary, which was used for comparison. An analysis of missing data was performed to explore the implications of using online survey methods. Using online travel diaries and surveys to assess objective and subjective data can help reduce recall bias and missing data, and allows greater flexibility in survey administration.
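
The missing-data comparison mentioned above can be illustrated with a short sketch comparing the share of blank diary fields by mode. The file and column names are hypothetical:

```python
import pandas as pd

# One row per diary entry; diary_mode is "online" or "paper"
# (file and column names are hypothetical).
df = pd.read_csv("travel_diaries.csv")
fields = ["trip_start", "trip_end", "travel_mode", "steps"]

# Share of blank fields per entry, averaged within each diary mode.
df["pct_missing"] = df[fields].isna().mean(axis=1)
print(df.groupby("diary_mode")["pct_missing"].mean())
```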

Young, A., Espinoza, F., Dodds, C., Rogers, K., & Giacoppo, R. (2021). Adapting an Online Survey Platform to Permit Translanguaging. Field Methods, 33(4), 388–404. https://doi.org/10.1177/1525822X21993966

Abstract. This article concerns online data capture using survey methods when the target population(s) comprise not just several different language-using groups, but also populations who may be multilingual and whose total language repertoires are commonly employed in meaning-making practices, commonly referred to as translanguaging. It addresses whether current online data capture survey methods adequately respond to such population characteristics and demonstrates a worked example of how we adapted one electronic data capture software platform (REDCap) to present participants with not just multilingual but translanguaging engagement routes that also encompassed multimodal linguistic access in auditory, orthographic, and visual media. The study population comprised deaf young people. We share the technical (coding) adaptations made and discuss the relevance of our work for other linguistic populations.


More Methodspace Posts about Research Design

Previous: Designing Literature Reviews as a Research Project

Next: Publish Your Doctoral Research