Conference Agenda

Overview and details of the sessions of this conference.

Session Overview
Session
A 5.1: Recruitment and Nonresponse
Time:
Friday, 11/Sep/2020:
1:00 - 2:00

Session Chair: Bella Struminskaya, Utrecht University, The Netherlands

Presentations

A Systematic Review of Conceptual Approaches and Empirical Evidence on Probability and Nonprobability Sample Survey Research

Carina Cornesse1, Annelies G. Blom1, David Dutwin2, Jon A. Krosnick3, Edith D. de Leeuw4, Stéphane Legleye5, Josh Pasek6, Darren Pennay7, Benjamin Philipps7, Joseph W. Sakshaug8,1, Bella Struminskaya4, Alexander Wenz1,9

1University of Mannheim, Germany; 2NORC, University of Chicago, United States of America; 3Stanford University, United States of America; 4Utrecht University, The Netherlands; 5INSEE, France; 6University of Michigan, United States of America; 7Social Research Center, ANU, Australia; 8IAB, Germany; 9University of Essex, United Kingdom

Relevance & Research Question: There is an ongoing debate in the survey research literature about whether and when probability and nonprobability sample surveys produce accurate estimates of a larger population. Statistical theory provides a justification for confidence in probability sampling, whereas inferences based on nonprobability sampling are entirely dependent on models for validity. This presentation systematically reviews the current debate and answers the following research question: Are probability sample surveys really (still) more accurate than nonprobability sample surveys?

Methods & Data: To examine the current empirical evidence on the accuracy of probability and nonprobability sample surveys, we collected results from more than 30 published primary research studies that compared around 100 probability and nonprobability sample surveys to external benchmarks. These studies cover results from more than ten years of research into the accuracy of probability and nonprobability sample surveys from across the world. We synthesize the results from these studies, taking into account potential moderator variables.

Results: Overall, the majority of the studies in our research overview found that probability sample surveys were more accurate than nonprobability sample surveys. None of the studies found the opposite. The remaining studies led to mixed results: for example, probability sample surveys were more accurate than some but not all examined nonprobability sample surveys. In addition, the majority of the studies found that weighting did not sufficiently reduce the bias in nonprobability sample surveys. Furthermore, neither the survey mode nor the participation propensity seems to moderate the difference in accuracy between probability and nonprobability sample surveys.

Added Value: Our research overview contributes to the ongoing discussion on probability and nonprobability sample surveys by synthesizing the existing published empirical evidence on this topic. We show that common claims about the rising quality of nonprobability sample surveys for drawing inferences to the general population have little foundation in empirical evidence. Instead, we show that it is still advisable to rely on probability sample surveys when aiming for accurate results.



Introducing the German Emigration and Remigration Panel Study (GERPS): A New and Unique Register-based Push-to-Web Online Panel Covering Individual Consequences of International Migration

Jean Philippe Decieux1, Marcel Erlinghagen1, Lisa Mansfeld1, Nikola Sander2, Andreas Ette2, Nils Witte2, Jean Guedes Auditor2, Norbert Schneider2

1University of Duisburg-Essen, Germany; 2Federal Institute for Population Research, Germany

Relevance

With the German Emigration and Remigration Panel Study (GERPS), we established a new and unique longitudinal data set to investigate the consequences of international migration from a life course perspective. This task is challenging, as internationally mobile individuals are hard to survey for several reasons (e.g. sampling design and approach, contact strategy, panel maintenance).

Data

GERPS is funded by the German Research Foundation (DFG) and surveys internationally mobile German citizens (recently emigrated abroad or recently re-migrated to Germany) in four consecutive waves within a push-to-web online panel design. Based on a probability sample, GERPS elucidates the individual consequences of cross-border mobility and provides representative longitudinal individual data.

Research question

This paper introduces the aim, scope, and design of this unique push-to-web online panel study, which has the potential for analyzing the individual consequences of international migration along four key dimensions of social inequality: employment and income, well-being and life satisfaction, family and partnership, as well as social integration.

Results

We will mainly reflect on the effectiveness of our innovative study design (register-based sampling, contacting individuals all over the world, and motivating them to follow a stepwise push-to-web panel approach). So far we have successfully conducted two waves (W1: N=12,059; W2: N=7,438), and our third wave is currently in the field. Given the contact information available in the population registers, in W1 we had to recruit our respondents by post, aiming to “push” them to a web survey. During the following waves, however, we have been able to run GERPS as an online-only panel.

Added Value

These results can be very helpful to researchers surveying mobile populations internationally or aiming to implement a push-to-web survey.



Comparing the participation of Millennials and older age cohorts in the CROss-National Online Survey panel and the German Internet Panel

Melanie Revilla1, Jan K. Höhne2,1

1RECSM-Universitat Pompeu Fabra Barcelona, Spain; 2University of Mannheim, Germany

Relevance & Research Question: Millennials (born between 1982 and 2003) witnessed events during their lives that differentiate them from older age cohorts (Generation X, Boomers, and Silents). Thus, one can also expect that Millennials’ web survey participation differs from that of older cohorts. The goal of this study is to compare Millennials to older cohorts on different aspects related to web survey participation: participation rates, break-off rates, smartphone participation rates, survey evaluation, and data quality.

Methods & Data: We use data from two probability-based online panels covering four countries: 1) the CROss-National Online Survey (CRONOS) panel in Estonia, Slovenia, and the UK, and 2) the German Internet Panel (GIP). We use descriptive and regression analyses to compare Millennials and older age cohorts regarding participation rates, break-off rates, rates of surveys completed with a smartphone, survey evaluation (using two indicators: rate of difficult surveys and rate of enjoyed/liked surveys), and data quality (using two indicators: rate of non-substantive responses and rate of selecting the first answer category).

Results: We find a significantly lower participation rate for Millennials than for older cohorts and a higher break-off rate for Millennials than for older cohorts in two countries. Smartphone participation is significantly higher for Millennials than for Generation X and Boomers in three countries. Comparing Millennials and Silents, we find that Millennials’ smartphone participation is significantly higher in two countries. There are almost no differences regarding survey evaluation and data quality across age cohorts in the descriptive analyses. However, we find some age cohort effects in the regression analyses. These results suggest that it is important to develop tailored strategies to encourage Millennials’ participation in online panels.

Added Value: While ample research exists that posits age as a potential explanatory variable for survey participation and break-off, only a small portion of this research focuses on online panels, and even fewer studies consider age cohorts. This study builds on Bosch et al. (2018), testing some of their hypotheses on Millennials and older cohorts, but it also extends their research by testing new hypotheses and addressing some of their methodological limitations.