Conference Agenda

Overview and details of the sessions of this conference. Please select a date or room to show only the sessions held on that day or at that location. Please select a single session for a detailed view (with abstracts and downloads, if available).

 
Session Overview
Session
A 3: Mobile Web Surveys
Time:
Thursday, 19/Mar/2015:
12:00 - 13:00

Session Chair: Bella Struminskaya, GESIS - Leibniz Institute for the Social Sciences
Location: Room 248
Fachhochschule Köln / Cologne University of Applied Sciences
Claudiusstr. 1, 50678 Cologne

Presentations

Exploring Why Mobile Web Surveys Take Longer

Mick P. Couper1,2, Gregg Peterson1

1University of Michigan, United States of America; 2University of Maryland, United States of America

Relevance and Research Question:

Surveys completed on mobile Web devices (smartphones) have been found to take longer than surveys completed on a PC. Several explanations have been offered for this finding: 1) slower connection speeds over cellular or Wi-Fi networks, 2) the difficulty of reading questions and selecting responses on a small device, and 3) the increased mobility of mobile web users, who have more distractions while answering Web surveys. Our analyses attempt to disentangle the sources of these differences.

Methods and Data:

We use data from three iterations of a campus Web survey, administered to samples of students, faculty, and staff in 2012, 2013, and 2014. Of the 4,725 respondents who started the 2012 student survey, 10.8% used a smartphone. We have both server- and client-level times for all items (over 360,000 item-level observations) that permit us to disentangle between-page (transmission) times from within-page (response) times. We have detailed information on the types of questions asked (e.g., grid questions versus single-item questions) and client-side paradata on other behaviors (such as scrolling or backing up). Using these data, we plan to build multi-level models (items nested within respondents) to explore correlates of response time by device.
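
A minimal sketch of such a multi-level model in Python (statsmodels), assuming item-level records with a device flag; the data below are synthetic and all variable names are hypothetical, not the authors' actual paradata:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic item-level data: n_items observations nested within each respondent.
    rng = np.random.default_rng(0)
    n_resp, n_items = 200, 30
    df = pd.DataFrame({
        "respondent": np.repeat(np.arange(n_resp), n_items),
        "mobile": np.repeat(rng.integers(0, 2, n_resp), n_items),  # device flag
        "grid": np.tile(rng.integers(0, 2, n_items), n_resp),      # grid vs. single item
    })
    # Log within-page time with a respondent-level random intercept
    # (all effect sizes here are invented for the simulation).
    df["log_time"] = (1.5 + 0.3 * df["mobile"] + 0.2 * df["grid"]
                      + np.repeat(rng.normal(0, 0.2, n_resp), n_items)
                      + rng.normal(0, 0.4, len(df)))

    # Items nested within respondents: random intercept per respondent.
    model = smf.mixedlm("log_time ~ mobile + grid + mobile:grid",
                        df, groups=df["respondent"])
    print(model.fit().summary())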

Results:

Our initial analyses of the 2012 student survey found that significantly more mobile than PC users broke off (28% versus 13%) and that significantly more of those who completed the survey did so in multiple sessions (28% versus 10%). Mobile respondents also took significantly longer to complete the survey (median of 951 seconds) than PC respondents (median of 839 seconds). The majority of the difference is due to within-page times.

Added Value:

Understanding why response times differ between PC and mobile users, and attempting to minimize those differences, is an important step toward reducing the higher nonresponse and breakoff rates observed for respondents completing Web surveys on small mobile devices.

Couper-Exploring Why Mobile Web Surveys Take Longer-113.pdf

Device choice in web surveys: The effect of differential incentives

Aigul Mavletova1, Mick P. Couper2

1NRU Higher School of Economics; 2Institute for Social Research, University of Michigan

Relevance & Research Question: Previous studies have not found effective ways of either encouraging or discouraging respondents to use smartphones to complete web surveys. We suggest that conditional differential incentives (different incentives depending on the device the respondent uses to complete the web survey) can increase overall participation rates and the proportion of respondents who use a particular device to complete the survey.

Methods & Data: We conducted an experiment using a volunteer online access panel in Russia. In total, 2,086 completed interviews were collected, with an average participation rate of 38%. Smartphone owners who were regular mobile Internet users were invited to complete the web survey. We varied the invitation mode (SMS vs. e-mail) and the encouragement to use a particular device to complete the survey (see the sketch after the list of conditions):

1. Typical incentives (50 roubles), no encouragement for device

2. Typical incentives (50 roubles), encouragement to use a mobile phone

3. Typical incentives (50 roubles), encouragement to use a PC

4. Higher incentives for mobile phone, typical for PC: 75 roubles vs. 50 roubles

5. Higher incentives for mobile phone, typical for PC: 100 roubles vs. 50 roubles

6. Higher incentives for PC, typical for mobile phone: 75 roubles vs. 50 roubles
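
The conditional logic of the six conditions can be illustrated with a short Python sketch (not the authors' implementation; the naive user-agent check and all names are hypothetical):

    # Incentive in roubles per completed interview, by experimental condition.
    # Conditions 2 and 3 differ from condition 1 only in the encouragement
    # message, not in the incentive amounts.
    INCENTIVES = {
        1: {"mobile": 50, "pc": 50},
        2: {"mobile": 50, "pc": 50},
        3: {"mobile": 50, "pc": 50},
        4: {"mobile": 75, "pc": 50},
        5: {"mobile": 100, "pc": 50},
        6: {"mobile": 50, "pc": 75},
    }

    def detect_device(user_agent: str) -> str:
        # Crude user-agent sniffing, for illustration only.
        ua = user_agent.lower()
        return "mobile" if any(t in ua for t in ("iphone", "android", "mobile")) else "pc"

    def incentive(condition: int, user_agent: str) -> int:
        return INCENTIVES[condition][detect_device(user_agent)]

    print(incentive(5, "Mozilla/5.0 (iPhone; CPU iPhone OS 8_0 like Mac OS X)"))  # 100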

Results: The participation rate in the control condition was 35.3%. SMS invitations increased the proportion of mobile web respondents, while e-mail invitations increased the proportion of PC web respondents. As expected, differential incentives increased overall participation rates (by 8-10 percentage points) when the higher incentive was offered for completing the survey via mobile phone. Contrary to expectations, offering higher incentives to PC web respondents did not produce higher participation rates than in the control condition. Offering higher incentives to mobile web respondents also affected sample composition: significantly higher participation rates were found among females and among those with higher education.

Added Value: Conditional differential incentives can increase participation rates in web surveys.

Mavletova-Device choice in web surveys-123.pdf

Responsive Questionnaire Design for Higher Data Quality in Mobile Surveys

Frederik Funke, Carmen Borger

LINK Institut, Germany

Relevance & Research Question: In 2014, about 22% of German Internet users accessed the Internet with mobile devices on a daily basis (see van Eimeren & Frees, 2014). However, the small screens of smartphones challenge questionnaire designers. Grid questions in particular seem to increase response burden on mobile devices: if all items are to be presented on a single screen, small fonts and small response options are required; if not all response options are visible on load, additional scrolling is needed, which might bias ratings (see Couper, Tourangeau, & Conrad, 2004). To determine the impact of questionnaire customization on data quality, we tested a responsive design that adapts the questionnaire to the device used for participation.
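
A minimal sketch of the underlying adaptation logic, assuming the client reports its viewport width (the breakpoint and all names are hypothetical, not the design actually tested here):

    def paginate_grid(viewport_width_px: int, items: list) -> list:
        """Wide screens get the full grid on one page; narrow screens get
        one single-item question per page instead of a shrunken grid."""
        BREAKPOINT_PX = 768  # hypothetical mobile/desktop cut-off
        if viewport_width_px >= BREAKPOINT_PX:
            return [items]                 # one page with the whole grid
        return [[item] for item in items]  # one item per page

    items = [f"item_{i}" for i in range(1, 6)]
    print(paginate_grid(1280, items))  # [['item_1', ..., 'item_5']]
    print(paginate_grid(360, items))   # [['item_1'], ['item_2'], ...]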

Methods & Data: In a split-ballot design, N = 3,929 respondents were recruited from five online panels. They were directed either to a classic questionnaire design (i.e., the same layout for all devices, not optimized for touch screens, and grids on a single page) or to a responsive questionnaire design (i.e., a layout depending on the available screen size, optimized for touch screens, and grid questions split into single questions). Two grids and two quality-check items (e.g., “click the second response option”) were included. Analyses are based on the N = 428 (10.9%) respondents who used a smartphone to access the survey.

Results: We found less straightlining in the responsive design (8.6% vs. 16.8% in the classic design), an indicator of a beneficial effect on data quality. Completion with the responsive design took somewhat longer (M = 18.9 vs. 17.0 minutes). This, however, seems to be due to deeper question processing, as the quality-check items were answered correctly more frequently (90% vs. 78%). Furthermore, respondents gave better evaluations of the survey. No considerable differences were observed, though, in item nonresponse, the number of characters in open-ended questions, or the distribution of ratings (e.g., means and top-2 boxes).
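
Straightlining of the kind reported above can be measured, for example, by flagging respondents who give the identical answer to every item of a grid; a minimal sketch with made-up ratings:

    import pandas as pd

    # Rows = respondents, columns = items of one grid (5-point ratings, invented).
    grid = pd.DataFrame({
        "item1": [3, 5, 2, 4],
        "item2": [3, 4, 3, 4],
        "item3": [3, 5, 2, 4],
    })
    straightlined = grid.nunique(axis=1).eq(1)  # same answer on every item
    print(f"Straightlining rate: {straightlined.mean():.1%}")  # 50.0%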

Added Value: The results provide further evidence that questionnaire design should adapt to the type of device respondents use to participate. A responsive design meets mobile participants’ needs and expectations and helps improve data quality.


 