A11: Increasing Response in Web Surveys
Day of the week and time of the day for survey dispatch. Two large-scale randomized experiments.
University of Gothenburg, Sweden
Relevance & Research Question: The day of the week and the time of day at which survey invitations are sent are two of many factors that might affect participation rates in web surveys. Using two large-scale randomized experiments, we examine whether dispatch time is something survey practitioners need to take into consideration.
Methods & Data: In the first study, respondents were randomly assigned to one of seven groups, one for each day of the week (n=11,200). In the second study, respondents were randomly assigned to one of six dispatch times during the day (n=47,279). Survey invitations were dispatched during the fall of 2014 to members of the Citizen Panel, a non-commercial web panel run by the Laboratory of Opinion Research at the University of Gothenburg.
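The randomization described above can be sketched as follows (a minimal illustration only: the function name, group labels, and seeding are assumptions, not the authors' actual assignment procedure):

```python
import random

WEEKDAYS = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]

def assign_dispatch_day(panel_ids, seed=42):
    """Randomly assign each panel member to one of seven weekday
    dispatch groups, mirroring the design of the first experiment."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    return {pid: rng.choice(WEEKDAYS) for pid in panel_ids}
```

Invitations for each group would then be dispatched on the assigned weekday; the same scheme extends to the six time-of-day groups in the second study.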
Results: Findings indicate that Fridays and Saturdays, closely followed by Sundays, are initially the worst days to dispatch an online survey: on the first field day, these days show significantly lower response rates than other days. However, after three to four days of data collection, the significant differences between dispatch days disappear, even without any follow-up reminder. Similarly, in the second study, which examined time of day, dispatch time affects participation rates only in the short run, within the first 24 hours. The results suggest that survey practitioners need not consider the day of the week or time of day when dispatching a survey, unless they require quick responses within a very short fieldwork period.
Added Value: Somewhat peculiarly, respondents are more likely to say that they prefer to answer surveys on the day of the week on which they actually received our invitation e-mail in the randomized experiment.
No pay, no gain. The relationship between monetary and non-monetary motivation to participate in web surveys and data quality in an international context.
1Universität Mannheim, Germany; 2Facebook
Relevance & Research Question
Numerous studies have shown a positive association between incentives conditional on survey completion, i.e., an extrinsic motivation, and survey participation. Less attention has been paid to extending this reasoning to explain variation in data quality. Satisficing theory predicts that data quality may suffer when respondent motivation is low. This paper challenges the assumption that non-monetary, intrinsic motivation is preferable to monetary, extrinsic motivation by studying the link between motivation type and data quality in an international context. We address two research questions: “How do monetarily motivated participants differ from non-monetarily motivated participants?” and “What is the relationship between motivation type and data quality?”
Methods & Data
The data come from a web survey conducted by SurveyMonkey among approximately 35,000 respondents from six countries across five continents in early 2016. Participants were recruited into the study from various web panels using different recruiting strategies and incentive structures. Personal and survey-related characteristics that may influence the type of motivation were identified. The link between motivation and data quality, measured by four indicators as well as a composite measure, was tested using logistic regression models controlling for panel type and country. Interaction effects between motivation and other predictors and controls were explored.
Results
Financial motivation is stronger among younger, low-income, and mobile web respondents, with country- and panel-specific differences. In turn, monetary motivation has a positive relationship with data quality. This finding is consistent across countries, panels, and devices. Respondents who are motivated mainly by incentives, rather than by curiosity or the positive feelings associated with expressing an opinion or taking surveys, show a lower propensity to use satisficing strategies such as straightlining or speeding.
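Satisficing indicators such as those mentioned above are commonly operationalized as simple flags. The sketch below shows one common operationalization; the abstract does not specify the authors' exact indicators, and the speeding threshold is an assumption:

```python
def is_straightlining(grid_responses):
    """Flag straightlining: every answer in a grid/battery of
    items is identical (one common operationalization)."""
    return len(set(grid_responses)) == 1

def is_speeding(duration_sec, median_sec, threshold=0.5):
    """Flag speeding: completion time below a fraction of the
    median completion time (threshold value is an assumption)."""
    return duration_sec < threshold * median_sec
```

Flags like these can then be combined into a composite data-quality measure and used as the outcome in logistic regression models, as described in the Methods & Data section.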
Added Value
This study sheds light on the relationship between self-reported respondent motivation and objective measures of data quality in web surveys. The results suggest that offering incentives that trigger respondents’ extrinsic motivation can increase their attention, possibly by nurturing a sense of obligation and responsibility in return for their monetary gains. The cross-cultural aspect of the study provides external validity.
Personalized Feedback in Web Surveys: Does It Affect Respondent Motivation and Data Quality?
Socio-Economic Panel (SOEP) @DIW Berlin, Germany
Relevance & Research Question: Web surveys make it technically possible to provide personalized feedback to respondents based on their previous responses. For instance, after collecting information about the respondent’s body weight and height, the web survey system can calculate and display the respondent’s body mass index. We argue that such personalized feedback may motivate respondents to answer more accurately. While past studies have mainly examined the effects of providing study results on future response rates, survey research thus far lacks theoretical and empirical contributions on the effects of personalized, immediate feedback on response behavior. We address this gap by investigating the potential advantages and disadvantages of providing personalized feedback within an online survey.
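The body-mass-index example above could be implemented roughly as follows (a minimal sketch: the function name, message wording, and WHO-style category cut-offs are assumptions, not the feedback actually shown in the study):

```python
def bmi_feedback(weight_kg: float, height_m: float) -> str:
    """Return a short personalized feedback message based on BMI,
    computed as weight (kg) divided by height (m) squared."""
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        category = "underweight"
    elif bmi < 25:
        category = "normal weight"
    elif bmi < 30:
        category = "overweight"
    else:
        category = "obese"
    return f"Your BMI is {bmi:.1f} ({category})."
```

A survey system would display such a message immediately after the weight and height items, which is the kind of immediate, personalized feedback whose effects on response behavior the study examines.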
Methods & Data: We implemented a randomized trial in the context of the Berlin Aging Study II (BASE-II) in 2014, providing feedback regarding the respondents’ personality tests (Big Five Personality Inventory) to a subgroup of the sample. We assess (1) whether (the advance notice of) the feedback decreases undesired response behavior, such as item nonresponse, response styles, low reliability, socially desirable responding, or corrective answers and (2) whether the feedback affects respondent satisfaction with the survey.
Results: We found only minor effects of the advance notification of feedback on responses to the Big Five Personality Inventory. Thus, contrary to expectations, the results do not point to an increase in data quality through the announcement of upcoming feedback. Fortunately, we also find no evidence of manipulation or adjustment of answers after the feedback was presented to respondents. Finally, we observe a positive effect of feedback on respondent satisfaction with and enjoyment of the survey.
Added Value: Our study provides novel insights into how personalized feedback affects response behavior and data quality within web surveys. Further research may build on our results by incorporating feedback for different questions and topics, varying the style of the feedback, and investigating effects on future survey participation rates.
Conference: GOR 17