Overview and details of the sessions of this conference.
Session Chair: Jessica Daikeler, GESIS - Leibniz-Institute for the Social Sciences, Germany
Location: Room Z28, TH Köln – University of Applied Sciences
Web-push experiment in a mixed-mode probability-based panel survey
David Bretschi, Ines Schaurer
GESIS – Leibniz-Institute for the Social Sciences, Germany
Relevance & Research Question: In recent years, web-push strategies have been developed in several cross-sectional mixed-mode surveys in order to increase response rates and reduce data collection costs. However, pushing respondents into the more cost-effective web option has rarely been examined in the context of panel surveys. This study evaluates how different web-push strategies affect the willingness of mail-mode respondents in a mixed-mode panel to switch to the web.
Methods & Data: We conducted a randomized web-push experiment in the October/November 2018 wave of the GESIS Panel, a probability-based mixed-mode panel in Germany (n=5,738). We used an incompletely crossed experimental design with two factors: 1) the time of presenting the web option and 2) prepaid vs. promised incentives. We randomly assigned 1,897 mail-mode panelists to one of three conditions:
1) the web option was offered concurrently with the paper questionnaire including a promised 10 € incentive for completing the survey on the web,
2) the web option was presented sequentially two weeks before sending the paper questionnaire and respondents were also promised an incentive of 10 €,
3) same sequential approach as group 2, but with a prepaid 10 € incentive instead of a promised incentive.
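The three-condition assignment above can be sketched as follows. This is a minimal illustration of near-balanced random assignment under assumed condition labels, not the GESIS Panel's actual fieldwork procedure:

```python
import random

# Hypothetical condition labels; the study's actual variable names are not given.
CONDITIONS = [
    "concurrent_promised",   # 1) web option with paper questionnaire, promised 10 EUR
    "sequential_promised",   # 2) web option two weeks earlier, promised 10 EUR
    "sequential_prepaid",    # 3) web option two weeks earlier, prepaid 10 EUR
]

def assign_conditions(panelist_ids, seed=None):
    """Randomly assign each mail-mode panelist to one of the three
    conditions, keeping group sizes as equal as possible."""
    ids = list(panelist_ids)
    # Repeat the condition list to cover all ids, then shuffle
    pool = (CONDITIONS * (len(ids) // len(CONDITIONS) + 1))[: len(ids)]
    random.Random(seed).shuffle(pool)
    return dict(zip(ids, pool))

assignment = assign_conditions(range(1897), seed=2018)
```

Shuffling a pre-balanced pool (rather than drawing each condition independently) keeps the three groups within one panelist of each other in size.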
We examine how the conditions differ in the web response rate of mail-mode respondents, the proportion of respondents who agreed to switch to the web mode for future waves, and other respondent-related variables.
Results: Contrary to our expectation, preliminary results show that prepaid incentives do not affect the web response rate compared to promised incentives. However, sequential presentation of the web option tends to increase the web response rate compared with offering the web mode concurrently. Final results of our study will be available in January 2019.
Added Value: This study will provide new evidence on the effect of web-push methods in mixed-mode panel surveys. Our findings may contribute to a better understanding of mode choice and mode switching among participants in probability-based longitudinal studies.
Push-to-web recruitment of a probability-based online panel: Experimental evidence
Ulrich Krieger1, Annelies Blom1,2, Carina Cornesse1, Barbara Felderer1, Marina Fikel1
1SFB 884, University of Mannheim; 2Department of Political Science, University of Mannheim
Relevance & Research Question:
Past research has shown that pushing respondents to the web is a successful way to increase response rates, reduce data collection costs, and produce representative outcomes. However, studies in that literature are usually limited to cross-sectional surveys of small and homogeneous target populations. Our study moves beyond this limited scope to a broad and, so far, unique application: We investigate the relative success of pushing respondents to the web compared to alternative survey design strategies across the recruitment stages of a probability-based online panel. To do this, we implemented a large-scale experiment in the 2018 recruitment of the German Internet Panel (GIP).
Methods & Data:
In this experiment, we sampled 9,800 individuals from population registers and randomly assigned each individual to an experimental group: online-only, online-first, offline-first, or concurrent-first. Individuals in the online-only group received a postal mail invitation to participate in the web version of the GIP recruitment survey; nonrespondents in this group received repeated invitations to the web version. Individuals assigned to the online-first group received the same invitation letter as the online-only group, asking them to participate in the web version of the GIP recruitment survey. However, nonrespondents were followed up with a reminder letter containing a paper-and-pencil version of the GIP recruitment survey. Individuals in the offline-first group received the paper-and-pencil questionnaire with the initial invitation letter and were followed up with invitations to the web version. Individuals in the concurrent-first group were initially given the choice between participating in the web version of the GIP recruitment survey or the paper-and-pencil version.
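The four contact protocols described above can be summarized schematically. This is an illustrative encoding under assumed mode labels, not the GIP's actual fieldwork system:

```python
# Illustrative encoding of the four experimental groups' contact protocols.
# Mode labels ("web", "paper") are assumptions for illustration.
PROTOCOLS = {
    "online-only":      ("web", "web"),          # web invitation, then web again
    "online-first":     ("web", "paper"),        # web invitation, then paper questionnaire
    "offline-first":    ("paper", "web"),        # paper questionnaire, then web invitation
    "concurrent-first": ("web or paper", None),  # initial choice between both modes
}

def offered_mode(group, contact):
    """Return the survey mode offered at a given contact (0 = initial, 1 = follow-up)."""
    return PROTOCOLS[group][contact]

# Nonrespondents in the online-first group are followed up with a paper questionnaire:
assert offered_mode("online-first", 1) == "paper"
```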
Results: Early results show a recruitment rate of about 23% for the online-first group and lower rates for the other groups. Using paper questionnaires yields higher initial response rates, but bringing those respondents to the web is challenging.
Added Value: Our research shows the feasibility of postal recruitment to a web panel. We compare different recruitment strategies and their effect on sample composition.
Timing your web survey: Effects of variations in time of contact, respondent’s completion behaviour and data quality outcomes in a course evaluation setting
Ellen Laupper, Lars Balzer
Swiss Federal Institute for Vocational Education and Training SFIVET, Switzerland
Relevance & Research Question: Timing effects in survey research have gained new topicality, as questionnaire administration via the internet allows contacting all survey participants at exactly the same time. Moreover, with the availability of paradata, e.g. starting time and total duration of questionnaire completion, a deeper analysis and understanding of respondents' completion behaviour is possible. This is especially interesting given that little is known about how timing factors influence respondents' completion behaviour, such as non-response, recall effects or response delay, in web surveys (e.g. Estelami, 2015; Karapanos, Zimmerman, Forlizzi, & Martens, 2010; Lewis & Hess, 2017). The proposed study aims to shed more light on the interplay between survey timing, coverage and recall effects and their consequences for data quality.
Methods & Data: A between-subject web survey experiment was implemented to examine how different times of contact (8 p.m. on the day of course completion (d0), 8 p.m. three days after course completion (d3), and 8 p.m. one week after course completion (d7)) relate to differences in data quality (e.g. response rate, drop-out, item non-response, interview duration, overall course satisfaction, straight-lining and other satisficing behaviour). Around 1,000 participants attending one of nine 1-day refresher courses for examiners in vocational education and training (VET) were randomly assigned to one of the three experimental groups.
Results: A mediation model was tested in Mplus with time of contact as a multicategorical predictor, response delay as the mediator, and seven data quality indicators as outcomes. It appears that, although sending the course evaluation invitation email directly after the course results in a significantly higher response rate, data quality effects seem to be mainly related to response delay.
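The a×b logic of such a mediation model with a multicategorical predictor can be illustrated on simulated data. This sketch uses simple least squares on invented effect sizes; it is not the authors' Mplus specification, and all numbers are assumptions for illustration:

```python
import random
from statistics import mean

rng = random.Random(0)
n = 300

# Simulated data with invented effect sizes (assumptions, not study estimates):
# group:   time of contact (0 = d0, 1 = d3, 2 = d7)
# delay:   response delay (mediator); later contact -> longer delay
# quality: one data quality indicator; longer delay -> lower quality
group = [rng.randrange(3) for _ in range(n)]
delay = [5.0 * g + rng.gauss(0, 2) for g in group]
quality = [-0.3 * d + rng.gauss(0, 1) for d in delay]

# a-paths: effect of each contact-time group (vs. the d0 reference) on the mediator
base = mean(d for d, g in zip(delay, group) if g == 0)
a_d3 = mean(d for d, g in zip(delay, group) if g == 1) - base
a_d7 = mean(d for d, g in zip(delay, group) if g == 2) - base

# b-path: least-squares slope of the outcome on the mediator
# (a full mediation model would also control for the group dummies)
md, mq = mean(delay), mean(quality)
b = sum((d - md) * (q - mq) for d, q in zip(delay, quality)) / \
    sum((d - md) ** 2 for d in delay)

# Indirect (mediated) effects of d3 and d7 relative to d0, following the a*b logic
indirect_d3, indirect_d7 = a_d3 * b, a_d7 * b
```

Under these assumed parameters, later contact lengthens response delay (positive a-paths) and longer delay lowers quality (negative b-path), so the indirect effects are negative and larger for the d7 group.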
Added Value: This study makes an important contribution beyond previous research by bringing together separately researched theoretical considerations and findings on survey timing aspects, such as time of contact, response latency and total response time. It thus offers a new view of the possible dynamics of how survey timing affects respondents' survey behaviour and, as a result, survey data quality.