Conference Agenda

Overview and details of the sessions of this conference.

 
Session Overview
Session
A 4: Device Effects
Time:
Friday, 11/Sep/2020:
10:00 - 11:20

Session Chair: Bella Struminskaya, Utrecht University, The Netherlands

Presentations

Layout and Device Effects on Breakoff Rates in Smartphone Surveys: A Systematic Review and a Meta-Analysis

Mirjan Schulz1, Bernd Weiß1, Aigul Mavletova2, Mick P. Couper3

1GESIS Leibniz Institute for the Social Sciences, Germany; 2Higher School of Economics (HSE) Moscow, Russia; 3Michigan Population Studies Center (PSC), United States of America

Relevance & Research Question: Online survey participants increasingly complete questionnaires on their smartphones. However, a common finding in survey research is that respondents using mobile devices break off more often than participants using a computer. Previous research has identified numerous aspects that potentially affect breakoff rates. These aspects can be divided into two groups: layout features and survey-related conditions. Layout features include, e.g., screen-optimized designs, the necessity to scroll, and matrix questions. Survey-related conditions include the invitation mode, reminders, requiring a particular device, etc. So far, the literature shows heterogeneous effects of these factors on breakoff rates. This brings us to our research question: How effective are different measures of optimizing surveys for smartphones at reducing breakoff rates among smartphone respondents?

Methods & Data: To answer this question, we collected research results on smartphone optimization and device effects from more than 50 papers and a variety of conference presentations published between 2007 and August 2019. By conducting a systematic review and a meta-analysis, we tested which of these predictors lower breakoff rates in mobile web surveys. We hypothesize that mobile-optimized surveys are more user-friendly, which in turn increases survey enjoyment and lowers survey burden; a lower survey burden, in turn, leads to lower breakoff rates. We aim to examine which measures are helpful for optimizing surveys for mobile devices.
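The abstract does not specify the pooling procedure; as a minimal, purely illustrative sketch of how per-study layout or device effects on breakoff could be combined, the following computes a DerSimonian-Laird random-effects estimate from log odds ratios. The studies, effect sizes, and standard errors shown are hypothetical and not taken from the review.

```python
import numpy as np

def random_effects_pool(log_or, se):
    """DerSimonian-Laird random-effects pooling of log odds ratios."""
    log_or, se = np.asarray(log_or, float), np.asarray(se, float)
    w = 1.0 / se**2                               # inverse-variance (fixed-effect) weights
    fixed = np.sum(w * log_or) / np.sum(w)        # fixed-effect estimate
    q = np.sum(w * (log_or - fixed) ** 2)         # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(log_or) - 1)) / c)  # between-study variance
    w_re = 1.0 / (se**2 + tau2)                   # random-effects weights
    pooled = np.sum(w_re * log_or) / np.sum(w_re)
    return pooled, np.sqrt(1.0 / np.sum(w_re)), tau2

# Hypothetical per-study log odds ratios of breakoff
# (smartphone-optimized vs. non-optimized layout) and their standard errors.
log_or = [-0.35, -0.10, -0.52, 0.05]
se = [0.15, 0.20, 0.25, 0.18]
est, est_se, tau2 = random_effects_pool(log_or, se)
print(f"pooled log OR = {est:.3f} (SE {est_se:.3f}), tau^2 = {tau2:.3f}")
```

A pooled log odds ratio below zero would indicate that optimized layouts are associated with fewer breakoffs across studies.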

Results & Added Value: Based on our findings, we will present best practices from the current state of research to sustainably reduce breakoff rates in mobile web surveys. We build upon the earlier meta-analysis by Mavletova and Couper (2015), add new empirical evidence, and expand their analytical framework. Our preliminary results show that a smartphone-optimized layout decreases breakoff rates. The final results will be available at the beginning of 2020.



Samply: A user-friendly web and smartphone application for conducting experience sampling studies

Yury Shevchenko1, Tim Kuhlmann1,2, Ulf-Dietrich Reips1

1University of Konstanz, Germany; 2University of Siegen, Germany

Relevance & Research Question:

Running an experience sampling study via smartphones is a complex undertaking. Scheduling and sending mobile notifications to participants is a tricky task because it requires the use of native mobile applications. In addition, the existing software solutions often restrict the number of possible question types. To solve these problems, we have developed a free web application that runs in any browser and can be installed on mobile phones. Using the application, researchers can create their studies, schedule notifications, and monitor users' reactions. The content of notifications is fully customizable and may include links to studies created with external survey services.

Methods & Data:

We have conducted several empirical studies to test the application and its features, such as creating different types of notification schedules and logging participants’ interactions with notifications. Initial pilot testing was carried out in student projects that conducted surveys on different topics (e.g., happiness, stress, sleep quality, dreaming) with schedules ranging from several days to one week. The second study was our own experience sampling survey with a university sample, completed over one week with notifications sent seven times a day at two-hour intervals. We also plan a third study with online samples, the results of which will be presented at the conference.
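To make the sampling scheme of the second study concrete, here is a minimal, hypothetical sketch of how such a notification schedule could be generated. It does not use Samply's actual API; the start date and time are assumed purely for illustration.

```python
from datetime import datetime, timedelta

def build_schedule(start: datetime, days: int = 7,
                   per_day: int = 7, interval_hours: int = 2):
    """Return notification times: `per_day` notifications per day,
    spaced `interval_hours` apart, for `days` consecutive days."""
    times = []
    for day in range(days):
        first = start + timedelta(days=day)
        times.extend(first + timedelta(hours=interval_hours * i)
                     for i in range(per_day))
    return times

# One week of notifications, seven per day at two-hour intervals, starting at 09:00.
schedule = build_schedule(datetime(2020, 9, 14, 9, 0))
print(len(schedule), "notifications; first day:",
      [t.strftime("%H:%M") for t in schedule[:7]])
```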

Results:

In the first pilot study (8 projects, n = 63), we analyzed the participants' response rate based on the logged interactions with notifications. In addition, the design and functionality of the web application were improved following a usability survey among application users. In the second study (n = 23), we analyzed how the type of the participant’s device (i.e., mobile phone) is related to the response rate. Additionally, we investigated the relationship between the interaction with notifications and the response rate in the experience sampling survey. In the third study, we plan to repeat the analysis for the sample recruited online.

Added Value:

Our application provides a direct and easy way to run experience sampling studies. It is open source and available at https://samply.js.org.



The effect of layout and device on measurement invariance in web surveys

Ines Schaurer1, Katharina Meitinger2, David Bretschi1

1GESIS Leibniz Institute for the Social Sciences, Germany; 2Utrecht University, The Netherlands

Relevance & Research Question:

As the majority of online surveys nowadays are mixed-device studies completed on desktop computers (PCs) and smartphones, the layout needs to be adapted to both device types. Many well-established constructs are usually presented in a matrix format. However, matrices are not recommended for use in smartphone surveys. Matrix questions are therefore a challenge for all mixed-device studies. So far, the majority of studies that investigate the effects of layout and device on data quality have focused on indicators such as nonresponse and satisficing strategies. In our experimental study, we focus on the combined effect of devices and layouts on measurement invariance.

Methods & Data:

In an experimental study, we assessed the comparability of different constructs across device and layout combinations. We varied the two factors device (desktop vs. mobile device) and layout (optimized for desktop vs. optimized for smartphones vs. built-in adaptive layout), resulting in six groups of layout-device combinations. We included five well-established constructs with different numbers of scale points that are usually presented in a matrix format.

In October 2018, respondents from an online access panel in Germany were randomly invited to one of the six experimental groups. We applied quota sampling regarding age, sex, and education. Overall, 3,096 respondents finished the survey.

The experimental design allows us to examine whether the different layout settings have an impact on the perceived range of response scales and on the presentation of multiple questions as one conceptual unit. We evaluate whether layout and device have an impact on mean levels and whether the latent constructs are comparable across groups by means of structural equation modelling.
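The abstract does not spell out the model; the invariance levels it refers to correspond to the standard multigroup confirmatory factor analysis hierarchy, sketched below for item i of a construct in group g (one of the six layout-device combinations). This is the textbook formulation, not necessarily the authors' exact specification.

```latex
\begin{align*}
  x_{ig} &= \tau_{ig} + \lambda_{ig}\,\xi_g + \varepsilon_{ig}
      && \text{measurement model, groups } g = 1,\dots,6 \\
  \text{configural:} &\;\; \text{same pattern of loadings in every group} \\
  \text{metric:}     &\;\; \lambda_{i1} = \dots = \lambda_{i6} \\
  \text{scalar:}     &\;\; \lambda_{i1} = \dots = \lambda_{i6},\quad \tau_{i1} = \dots = \tau_{i6}
\end{align*}
```

Scalar invariance, the level reported in the Results below, is the condition under which latent means can be compared across the six groups.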

Results: We find that layout and device do not impact the mean levels of the constructs, and we find a high level of comparability across experimental groups (scalar invariance).

Added Value:

This study provides evidence on the effect of layout choices on measurement invariance, depending on the device used. Furthermore, it offers information about the comparability of results in mixed-device studies and practical guidance for designing such studies.



Measuring respondents’ same-device multitasking through paradata

Tobias Baier, Marek Fuchs

TU Darmstadt, Germany

Relevance & Research Question: As a self-administered survey mode, Web surveys allow respondents to temporarily leave the survey page and switch to another window or browser tab. This form of sequential multitasking has the potential to disrupt the response process and may reduce data quality due to respondents' distraction (Krosnick 1991). Browser data indicating that respondents have left the survey page allow researchers to measure respondents’ multitasking non-reactively. To investigate whether page-switching respondents produce lower data quality, one has to consider how to identify and delimit this group based on the time they do not spend on the survey page. Given that very short page-switching events might occur due to slips or unintentional behavior, they might not be harmful to the response process. Accordingly, the aim of this paper is to discuss an adequate time threshold for classifying respondents as multitaskers.

Methods & Data: For the analyses reported in this paper, we used two Web surveys among members of a non-probability online panel (n=1,653; n=1,148) and a Web survey among university applicants (n=1,125), all conducted in 2018. To measure multitasking, the JavaScript tool SurveyFocus (Höhne & Schlosser 2018) was implemented. The prevalence of page-switching is computed using different time thresholds (< 2 sec, < 5 sec, < 10 sec). Item nonresponse, the degree of differentiation in matrix questions, and the number of characters in answers to open-ended questions serve as measures of data quality.
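The thresholds above imply a simple classification rule: a respondent counts as a multitasker only if at least one absence from the survey page meets or exceeds the chosen threshold, so that very short switches are treated as slips. The sketch below illustrates this rule on hypothetical paradata; the data structure and numbers are invented and do not reflect SurveyFocus's actual output format.

```python
from dataclasses import dataclass

@dataclass
class Respondent:
    rid: str
    away_durations: list  # seconds away from the survey page, one entry per page-switching event

def is_multitasker(r: Respondent, threshold: float) -> bool:
    """Classify a respondent as a multitasker if at least one
    page-switching event lasted `threshold` seconds or longer."""
    return any(d >= threshold for d in r.away_durations)

respondents = [
    Respondent("r1", [1.2, 0.8]),   # only very short switches (likely slips)
    Respondent("r2", [45.0]),       # one long absence
    Respondent("r3", []),           # never left the page
]

# Share of multitaskers under each time threshold.
for threshold in (2, 5, 10):
    share = sum(is_multitasker(r, threshold) for r in respondents) / len(respondents)
    print(f"threshold >= {threshold:>2}s: {share:.0%} classified as multitaskers")
```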

Results: Preliminary analyses indicate that 15 to 33 percent of respondents multitask at least once during the survey. Previous results on all page switchers also indicate that these respondents do not produce lower data quality. However, so far we have not differentiated between respondents with short and long absences. The analyses presented in this paper will show whether these results change when different time thresholds are applied. Furthermore, we will investigate whether page-switching respondents differ in their characteristics, the device used, and completion time depending on the time they spend absent.

Added Value: Paradata on page-switching provide an opportunity to measure respondents’ multitasking unobtrusively. This paper addresses the challenge of identifying multitasking respondents based on these data in order to investigate the relationship between multitasking and data quality.



 