18-20 March 2015
Cologne University of Applied Sciences, Germany
Overview and details of the sessions of this conference.
A 6: Enhancing Survey Response
How the Timing of Informed Consent on Paradata Use Affects Response Behavior
Johannes Gutenberg University Mainz / Department of Political Science / German Politics and Political Sociology, Germany
Relevance & Research Question: Analyzing response behavior using client-side paradata raises ethical questions. In accordance with the principle of informational self-determination, the guidelines of German market and social research (ADM et al. 2007) require researchers to ask respondents for consent prior to non-reactive data collection. However, empirical studies indicate that any information about paradata use reduces the willingness to participate in the survey (Couper/Singer 2013). This study examines whether further negative effects of informed consent depend on its timing: being informed at the end of the survey might provoke drop-outs and refusals of paradata use, whereas being informed at the start might build up trust but lead to biased response behavior.
Methods & Data: Cognitive interviewing (n = 12) was conducted to investigate whether the respondents' perception of paradata varies with the timing of informed consent. In a second step, a split-ballot online experiment (n = 754) was used to evaluate negative effects. In addition to a control group informed ex post, the respondents were offered the option to reject paradata use – either at the beginning or at the end of the survey.
Results: Cognitive interviewing suggests that paradata are regarded as a borderline case of privacy intrusion. The fear of losing one's informational self-determination results from the inability to know what conclusions the researchers draw from paradata. Data misuse is discussed more often when respondents are informed at the end of the survey. In contrast, the front placement seems to make respondents reconsider their answers and complete especially “simple” questions more quickly. However, these effects might weaken over time. The online experiment investigates whether the end position of informed consent increases drop-outs and refusals of paradata use, whereas the front position might affect the response distribution (e.g. item nonresponse, extreme or midscale ratings) and the response process (e.g. response latencies, answer changes, navigation).
Added value: Combining qualitative and quantitative methodology, this paper adds new results to the research on informed consent to paradata use by analyzing how timing can help minimize drop-outs, refusals of paradata use, and biased response behavior.
A quasi-experiment on effects of prepaid versus promised incentives on participation in a probability-based mixed-mode panel
GESIS - Leibniz Institute for the Social Sciences, Germany; Free University of Bozen-Bolzano, Italy
Relevance & Research Question:
Research on cross-sectional surveys has shown that prepaid, unconditional incentives are more effective than postpaid incentives. However, no such evidence is available for self-administered longitudinal mixed-mode surveys of the general population. With the emergence of probability-based panels in the social sciences, identifying the optimal incentive timing strategy (prepaid vs. postpaid) to increase panel survey participation becomes paramount. In our presentation, we address the research question of whether a prepaid monetary incentive is superior to a postpaid monetary incentive in terms of panel survey participation.
Methods & Data:
We use data from the recruitment phase of the GESIS Panel – a probability-based mixed-mode panel (online, offline using mailed surveys) of the German population. Respondents had been recruited offline (CAPI surveys) and are invited to self-administered surveys every second month. Recruitment took place between May and December 2013. We conducted a quasi-experiment encompassing two groups. The control group (n = 4340) consisted of respondents who answered the first regular self-administered survey in 2013. They were promised an incentive of 5 Euro for survey participation; to redeem it, they had to provide their bank account data. The experimental group (n = 589) consisted of respondents who answered their first self-administered survey in January 2014. They received a five-Euro bill enclosed with the mailed invitation.
Results: First analyses show a substantial increase in survey participation from 77% in the control group to 90% in the experimental group (z = -7.21, p < 0.01). A separate analysis of online and offline respondents shows that the increase is more pronounced among offline respondents (66% control, 88% experimental) than among online respondents (84% control, 93% experimental). Beyond participation rates, prepaid incentives sent out by letter had a positive side effect for panel maintenance, because respondents were highly motivated to keep their address information up to date.
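As a sanity check, the reported test statistic can be reproduced from the group sizes and participation rates with a pooled two-proportion z-test. This is a sketch based on the figures given in the abstract; the exact participant counts are assumptions derived from the rounded percentages.

```python
from math import sqrt

# Figures reported in the abstract (counts implied by rounded percentages)
n_control, p_control = 4340, 0.77  # promised (postpaid) incentive
n_exp, p_exp = 589, 0.90           # prepaid five-Euro bill

# Pooled two-proportion z-test for the difference in participation rates
p_pool = (p_control * n_control + p_exp * n_exp) / (n_control + n_exp)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_control + 1 / n_exp))
z = (p_control - p_exp) / se

print(round(z, 2))  # ≈ -7.21, matching the reported statistic
```

The close match to the reported z = -7.21 suggests the authors used a standard pooled z-test on the two participation proportions.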
Added Value: The findings suggest that prepaid incentives increase panel survey participation rates compared to promised incentives in a probability-based mixed-mode context. Moreover, sending out prepaid incentives tends to increase panelists' motivation to keep their contact addresses up to date and hence facilitates panel maintenance.
On the Impact of the Presentation Form of Vignettes and the Choice of Response Scales on the Answering Behavior in Vignette Studies
University of Cologne, Germany
Relevance & Research Question:
Vignette studies have gained considerable traction in the social sciences over the past decade. Some authors emphasize that vignette studies promise more valid and reliable measurements than the survey items typically used in conventional surveys. Moreover, because vignette studies are based on an experimental design, they make it possible to quantify, for each respondent, the effect of each of the presented factors on his or her ratings. Despite this appeal and these advantages over conventional survey items, method effects in vignette studies have rarely been investigated.
Methods & Data:
In this split-ballot experiment, I examine to what extent the choice of presentation form of the vignettes and the choice of response scales affect the respondents' answering behavior. For the purpose of the study, I designed an internet-based vignette study in which every respondent was presented several vignettes in either a textual or a tabular format. The interviewees were able to enter their answers in either an open or a closed response format. The study is based on a quota sample for Germany and comprises five experimental groups.
Results: The study results are discussed in terms of interview time (cost aspect) and information losses (quality aspect) and allow recommendations to be made for researchers interested in conducting a vignette study. The results suggest using a tabular presentation form with alternating varying information and a closed response format if the goal is to reduce interview time (and thus costs). If the goal is to reduce the number of missing observations, it is recommended to design the vignettes as table vignettes in which the varying information does not alternate, and likewise to offer interviewees closed response formats.
Added Value: Recommendations on how to reduce survey costs and increase data quality in vignette studies.
Conference: GOR 15