A03: Recruitment of Respondents and Participants
Using Cash Bonuses for Early Participation to Improve Postal Recruitment of a Probability-Based Online Panel
SFB 884, University of Mannheim
Relevance & Research Question
Past research has shown that cash incentives are effective at increasing response rates and recruitment in panel studies. However, there are gaps in the literature on the use of bonus incentives for early participation to facilitate the recruitment effort.
Our aim is to use conditional cash incentives for early participation to encourage respondents to sign up early in the field period, thereby reducing the subsequent reminder effort and allowing the recruitment goal to be reached with a smaller sample.
We test the effectiveness of bonus incentives in increasing sample sizes and reducing fieldwork costs. In addition, we test for possible negative effects of the incentive treatment, such as sample bias or early panel dropout.
Methods & Data
To test the effects of bonus incentives, we implemented a large-scale experiment in the 2018 recruitment of the probability-based German Internet Panel (GIP). The recruitment was based on a population register and conducted via postal mail.
For the experiment, 4,800 sample cases were randomly assigned to three treatment groups: 1,200 sample cases received a 50€ bonus incentive for early registration, 1,200 a 20€ early registration bonus, and 2,400 were assigned to the control group that did not receive any bonus incentive. All sample members received a postal invitation with login information for the online recruitment survey and a 5€ unconditional cash incentive.
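The randomization described above can be sketched in a few lines; the group sizes and labels follow the design, while the function name, seed, and case identifiers are illustrative assumptions rather than the study's actual implementation.

```python
import random

def assign_treatments(case_ids, seed=2018):
    # Illustrative sketch only: group sizes (1,200 / 1,200 / 2,400) follow
    # the design described above; all names and the seed are hypothetical.
    rng = random.Random(seed)
    ids = list(case_ids)
    rng.shuffle(ids)  # random order, so slicing yields random groups
    return {
        "bonus_50": ids[:1200],       # 50€ early-registration bonus
        "bonus_20": ids[1200:2400],   # 20€ early-registration bonus
        "control":  ids[2400:],       # no bonus incentive
    }

groups = assign_treatments(range(4800))
```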
First analyses show that the response rate is higher for the bonus groups than for the control group but does not differ between the two bonus groups. No effect of the treatment on sample composition in terms of gender, age, or German vs. non-German citizenship can be found.
Our study provides insight into the short- and long-term benefits and risks of bonus incentives for early registration when recruiting respondents.
Text Message Invitations as a new way to conduct population wide online surveys? – Biases and Coverage Issues
GESIS Leibniz Institute for the Social Sciences, Germany
Relevance & Research Question:
Online surveys are the most popular survey mode today. Despite all their advantages, sampling can be a downside of online surveys. Population-wide probability sampling in particular is a great challenge for most web-based surveys. Therefore, we explore the use of a sample of randomly generated numbers to recruit respondents via text message. Our goal is to assess whether this method is feasible for conducting population-wide online surveys and which biases might affect such a survey.
Methods & Data:
We used modified RDD sampling to create a sample of privately used mobile telephone numbers and then used an HLR lookup procedure to remove invalid numbers. Afterward, we contacted the remaining numbers via SMS, which included a short text about the length of the questionnaire, the name of our institution, and a link to the survey. Respondents were randomly assigned to one of three questionnaires that differed in length (5, 10, or 20 minutes). All questionnaires contained a core part covering demographics, selected attitudinal questions, questions on political behavior, and questions regarding data linkage. The data were collected in November 2018.
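The two-stage procedure above (generate candidate mobile numbers, then filter out invalid ones) can be sketched as follows. The prefix list, candidate count, and all function names are hypothetical, and the filter function is merely a stand-in for the external HLR lookup service actually used.

```python
import random

# Hypothetical subset of German mobile prefixes; the study's real sampling
# frame and validation service are not shown here.
MOBILE_PREFIXES = ["0151", "0160", "0170", "0171", "0175"]

def generate_candidates(n, seed=42):
    # Modified RDD sketch: random prefix plus a random 7-digit subscriber part.
    rng = random.Random(seed)
    return [rng.choice(MOBILE_PREFIXES) + f"{rng.randrange(10**7):07d}"
            for _ in range(n)]

def hlr_filter(numbers, is_active):
    # Stand-in for an HLR lookup: keep only numbers the check reports active.
    return [num for num in numbers if is_active(num)]

candidates = generate_candidates(20000)
# Toy activity check for illustration only (keeps numbers ending in an even digit).
sample = hlr_filter(candidates, lambda num: int(num[-1]) % 2 == 0)
```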
In our analysis, we will investigate biases in the sample and compare our results to population benchmarks such as the turnout of the last election. Furthermore, we will compare the results of the data linkage part to data we obtained with an online access panel. Additionally, we will analyze the questionnaire-length experiment.
The survey is currently in the field. We sent a total of 11,820 messages. Thus far, the response rate is relatively low.
Our study will help to evaluate a new sampling method for web surveys. Analyzing the biases created by the sampling method can help identify the purposes for which our procedure is feasible and how data quality might be affected. Furthermore, our experimental design will help us give recommendations on the questionnaire length most suitable for an online survey based on SMS sampling.
Participant Recruitment Methods can Affect Research Outcomes: Personality Biases in Different Types of 'Online Sample'.
University of Westminster, United Kingdom
Relevance & Research Question: Samples for online research are recruited in a number of ways (e.g. online panels, volunteer requests, crowdsourced labour marketplaces such as MTurk). Panel providers and researchers correctly pay attention to data quality and demographic representativeness of participants. However, biases in sample makeup with respect to motivation or personality are seldom considered. Could participants recruited in different ways produce different findings? For example, Openness to Experience is known to affect political voting preference. Might a sample skewed towards high Openness behave differently to one with a bias towards low Openness in research on political preference?
Methods & Data: In Study 1, a pseudo-experimental design compared personality scores of students participating for course credit, with those of individuals completing an online personality questionnaire on a voluntary basis. In Study 2, a correlational design explored whether personality affected political voting preference. In Study 3, personality scores of volunteers were compared with those of members of a commercially sourced, paid, online research panel. The potential effect on voting preference was evaluated.
Results: Results generally indicated that personality profiles were influenced by recruitment method. Volunteers had lower Extraversion, lower Agreeableness, lower Conscientiousness, higher Neuroticism, and higher Openness to Experience than people participating as a class requirement. Study 2 showed that Openness differences of the magnitude seen in Study 1 could affect voting preference. Study 3 demonstrated that findings obtained with volunteer participants were not replicated with paid panel members.
Added Value: This project extends existing work on online data quality, showing that personality biases may arise from recruitment methods. It demonstrates that these differences, while small, could affect research outcomes in meaningful ways.