Conference Agenda

Overview and details of the sessions of this conference. Please select a date or location to show only the sessions held on that day or at that location. Please select a single session for a detailed view (with abstracts and downloads, if available).

 
Session Overview
Session
C11: Respondent Motivation
Time:
Friday, 17/Mar/2017:
15:15 - 16:15

Session Chair: Jean Philippe Décieux, Université du Luxembourg, Luxembourg
Session Chair: Philipp Sischka, University of Luxembourg, Germany
Location: A 025

Presentations

Exploring the Influence of Respondents' IT Literacy on Nonresponse in an Online Survey

Jessica M. E. Herzing1, Annelies G. Blom1,2

1Collaborative Research Center “Political Economy of Reforms”, University of Mannheim; 2School of Social Sciences, University of Mannheim

Relevance & Research Question:

Researchers have expressed concern about the generalizability of estimates based on online surveys to the general population. While much of this discussion revolves around the suitability of nonprobability sampling methods, the lack of coverage of persons without computers and/or internet access has also received attention. Probability-based online panels account for this potential source of error only if they specifically cover the offline population, for example by equipping offliners with devices and an internet connection. However, even when covered, offliners tend to be underrepresented in the final sample because of nonresponse.

Research in this area has thus far considered the underrepresentation of sample units in online surveys to be a binary phenomenon: sample units were either offline or online. In this paper, we extend this binary characteristic into the multi-dimensional characteristic of IT literacy. We use IT literacy to predict nonresponse in the German Internet Panel at the first online interview and across waves.

Methods & Data:

For this purpose, we run a latent class analysis (LCA) to identify different classes of IT literacy. To assess differences in response probabilities with regard to IT literacy and panel wave, we run logistic regressions.
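
As a rough illustration of this two-step approach (not the authors' actual code), the following Python sketch fits a latent class model for binary IT-skill indicators via EM and then regresses a response indicator on the resulting class memberships. The simulated data, the number of classes, and all variable names are assumptions for illustration only.

```python
import numpy as np
import statsmodels.api as sm

def fit_lca(X, n_classes, n_iter=200, seed=0):
    """Fit a latent class model for binary items via EM.

    X: (n, j) array of 0/1 item responses (e.g., IT-skill indicators).
    """
    rng = np.random.default_rng(seed)
    n, j = X.shape
    pi = np.full(n_classes, 1.0 / n_classes)              # class shares
    theta = rng.uniform(0.25, 0.75, size=(n_classes, j))  # P(item = 1 | class)
    for _ in range(n_iter):
        # E-step: posterior class membership probabilities per respondent
        log_lik = X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T
        log_post = np.log(pi) + log_lik
        log_post -= log_post.max(axis=1, keepdims=True)
        post = np.exp(log_post)
        post /= post.sum(axis=1, keepdims=True)
        # M-step: update class shares and item-endorsement probabilities
        pi = post.mean(axis=0)
        theta = ((post.T @ X) / post.sum(axis=0)[:, None]).clip(1e-6, 1 - 1e-6)
    return pi, theta, post

rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(500, 8)).astype(float)  # simulated IT items
response = rng.integers(0, 2, size=500)              # simulated response flag

_, _, post = fit_lca(X, n_classes=3)
classes = post.argmax(axis=1)        # modal class assignment per respondent
dummies = np.eye(3)[classes][:, 1:]  # class 0 as the reference category
fit = sm.Logit(response, sm.add_constant(dummies)).fit(disp=0)
print(fit.params)  # shifts in response propensity by IT-literacy class
```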

Results:

We find that respondents who belong to different classes of IT literacy have systematically different socio-demographic characteristics and show different voting behavior. In addition, we find that response propensities vary by classes of IT literacy, both at the first online interview and regarding retention over time.

Added Value:

This paper is the first to consider a multi-dimensional classification of IT literacy and its value for predicting nonresponse in online surveys.


Implications of disposition codes for monitoring breakoffs in web surveys

Gregor Čehovin, Vasja Vehovar

University of Ljubljana, Slovenia

Relevance & Research Question: Respondents quitting surveys prematurely (breakoffs) require special attention in web surveys because breakoffs occur more often there than in interviewer-administered questionnaires. In addition, the collection of paradata in web surveys enables a more precise measurement of breakoffs. In our study, we compare 1) introduction breakoffs (occurring at the start of a questionnaire) with 2) questionnaire breakoffs (occurring at some later point in the questionnaire) and define them as separate types. We provide a conceptual framework that relates both breakoff types to the AAPOR Final Disposition Codes for Internet Surveys and propose monitoring breakoffs in web surveys in greater detail. We discuss the practical applications of this approach in a metastudy of 7,676 web surveys.
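
As a minimal sketch of how the two breakoff types could be operationalized (the function name and page convention are assumptions, not 1KA's API, and the mapping to specific AAPOR codes is left out here):

```python
def classify_breakoff(last_page: int, completed: bool) -> str:
    """Classify a respondent record by where the questionnaire was abandoned.

    Assumed convention: page 0 is the survey introduction; any later page
    belongs to the questionnaire proper.
    """
    if completed:
        return "complete"
    if last_page == 0:
        return "introduction breakoff"   # quit before answering anything
    return "questionnaire breakoff"      # quit at some later point

print(classify_breakoff(last_page=0, completed=False))  # introduction breakoff
print(classify_breakoff(last_page=4, completed=False))  # questionnaire breakoff
```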

Methods & Data: Our sample is based on approximately 1,250,000 responses to 7,676 web surveys, which were conducted from 2009 to 2014 using the 1KA open-source survey software. To analyse the impact of survey characteristics and email invitations on introduction breakoffs vs. questionnaire breakoffs (the dependent variables), we used linear regression models with the number of pages and items in the questionnaire, as well as an indicator of whether the survey was disseminated via the survey software’s email invitation system, as predictors.
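
A hedged sketch of such survey-level regression models in Python (statsmodels), on simulated data whose effects are seeded with the coefficients reported below; all variable names and distributions are assumptions, not the authors' data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000  # simulated surveys; the actual metastudy covers 7,676 web surveys

df = pd.DataFrame({
    "n_items": rng.integers(5, 80, n),
    "n_pages": rng.integers(1, 25, n),
    "email_invitation": rng.integers(0, 2, n),  # 1 = software's invitation system used
})
# Simulated breakoff rates (in percentage points), seeded with the
# effect sizes reported in the Results section of this abstract
df["quest_breakoff"] = 5 + 0.07 * df.n_items + rng.normal(0, 3, n)
df["intro_breakoff"] = 40 - 16.6 * df.email_invitation + rng.normal(0, 8, n)

m_intro = smf.ols("intro_breakoff ~ n_items + n_pages + email_invitation", df).fit()
m_quest = smf.ols("quest_breakoff ~ n_items + n_pages + email_invitation", df).fit()
print(m_intro.params, m_quest.params, sep="\n")
```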

Results: Our empirical study shows that questionnaire length affects only questionnaire breakoffs, and that email invitations affect only introduction breakoffs. On average, we can expect the questionnaire breakoff rate to increase by 0.07 of a percentage point for each additional item in the questionnaire, or by 0.17 of a percentage point for each additional page. The introduction breakoff rate is expected to decrease by 16.6 percentage points on average if the survey software’s email invitations are used. The sample’s mean total breakoff rate is 43%, with introduction breakoffs strongly dominating (about three-quarters of all breakoffs).
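
To make the reported coefficients concrete, a back-of-envelope reading for a hypothetical survey size (the 40-item, 8-page figures are assumptions):

```python
# Expected questionnaire-breakoff increase for a hypothetical 40-item,
# 8-page survey, using each of the two reported per-unit estimates
items, pages = 40, 8
print(0.07 * items)  # 2.8 percentage points via the per-item coefficient
print(0.17 * pages)  # 1.36 percentage points via the per-page coefficient
```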

Added Value: Defining introduction and questionnaire breakoffs separately allows for a more accurate analysis of their causes, because fundamentally different factors contribute to each type. This holds practical importance for survey methodology, especially for breakoff prevention methods and for accurate reporting on various missing data and data quality aspects.


Čehovin-Implications of disposition codes for monitoring breakoffs-130.pptx

Continuity of Web-Survey Completion and Response Behavior

Jan Karem Höhne, Stephan Schlosser

University of Göttingen, Germany

Relevance & Research Question:

Web surveys are increasingly used for data collection in social science research since they offer several substantial benefits: they are cost-effective, save time and, most importantly, enable researchers to capture a variety of paradata (e.g., response times). The web mode, however, might also foster respondents' distraction during survey completion through "multi-tasking" (e.g., checking incoming emails, switching to other websites, or starting programs). Until now, empirical evidence has been lacking on the specific ways in which distraction during survey participation affects respondents' response behavior.

Methods & Data:

In this study, we therefore investigate whether there are systematic differences between respondents who process the survey continuously and those who do not. For this purpose, we use a new paradata tool called "SurveyFocus" (SF), which enables survey researchers to record the activity of the web-survey page. This cross-sectional study (n = 1,751) is based on an onomastic sampling approach and contains single as well as grid questions.
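
A hedged sketch of how such focus paradata might be processed downstream: correcting gross times for out-of-focus spells and flagging discontinuous respondents. The data layout, function names, and threshold are assumptions, not SF's actual format.

```python
def corrected_time(gross_time: float, absences: list[tuple[float, float]]) -> float:
    """Subtract out-of-focus spells ("time-outs") from the gross page time.

    absences: (left_at, returned_at) timestamps in seconds, as a
    SurveyFocus-style tool might record them when respondents switch
    to email, other websites, or other programs.
    """
    away = sum(ret - left for left, ret in absences)
    return gross_time - away

def is_discontinuous(absences: list[tuple[float, float]], min_away: float = 2.0) -> bool:
    """Flag a respondent as discontinuous if any absence lasts min_away s or more."""
    return any(ret - left >= min_away for left, ret in absences)

absences = [(12.4, 47.9), (80.0, 85.5)]  # two spells away from the survey page
print(corrected_time(120.0, absences))   # about 79.0 seconds of net processing time
print(is_discontinuous(absences))        # True
```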

Results:

Our statistical analyses reveal substantial differences between continuously and discontinuously processing respondents. Respondents who leave the web survey for a certain time produce significantly longer processing times (even after correcting for the "time-out"). They additionally produce lower response quality in terms of item nonresponse and errors of central tendency. Furthermore, there are considerable differences between single and grid questions.

Added Value:

All in all, our empirical findings suggest that the continuity of web-survey processing matters. Survey researchers and practitioners should therefore take it into consideration when analyzing and interpreting web-survey data.


Höhne-Continuity of Web-Survey Completion and Response Behavior-110.pdf


 