A5: Data Quality in Surveys
Response Quantity, Response Quality, and Costs of Building an Online Panel via Social Contacts
Tilburg University, The Netherlands
More than 50% of all survey data in the Netherlands are collected via the Internet. However, these data may not adequately represent the views of the Dutch population. The majority of Dutch people are not willing to join a web panel, and among those who are in a panel, a minority (20%) fills out the majority (80%) of the questionnaires (NOPVO, 2006). Therefore, answers obtained from web panels can differ significantly from those of the general population. It is well known that panels contain too many (heavy) Internet users and too few ethnic minorities. So how can we get people into a panel who would normally not join, and (hopefully) make the results more reliable?
An unconventional approach is used for building this panel: via social networks. Traditionally, a distinction is made between probability panels and volunteer opt-in panels. Although most survey researchers agree that probability panels are needed for representativeness, the majority of web surveys are based on volunteer opt-in panels because of budget constraints. Volunteer opt-in panels, however, are prone to selection bias. This new way of recruitment may increase representativeness compared to volunteer opt-in panels (recruitment is by invitation only; respondent-driven sampling can be used for difficult-to-reach groups) while keeping costs at a minimum. By asking respondents to join the panel via friends and relatives, people who would normally not be willing to join a panel might be persuaded to do so. The starting point for building the panel is the administrative records of Breda University of Applied Sciences in the Netherlands (about 7,000 students with a national spread). I will investigate response quantity, response quality, and costs, and give suggestions about when to use this type of recruitment. Note that the Internet penetration rate in the Netherlands was about 90% in 2010.
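The chain-referral recruitment described in this abstract can be sketched in a few lines. The sketch below is illustrative only: the coupon count, number of waves, and join probability are hypothetical parameters, not values reported by the author; only the frame size (~7,000 student records) comes from the abstract.

```python
import random

random.seed(42)

def snowball(seeds, population, coupons=3, waves=2, join_prob=0.5):
    """Hypothetical chain-referral recruitment: each new panel member
    passes `coupons` invitations to random contacts, and each invited
    contact joins with probability `join_prob`. Parameters are
    illustrative assumptions, not the study's actual design."""
    panel = set(seeds)
    frontier = list(seeds)
    for _ in range(waves):
        next_wave = []
        for _person in frontier:
            contacts = random.sample(population, coupons)
            for c in contacts:
                if c not in panel and random.random() < join_prob:
                    panel.add(c)
                    next_wave.append(c)
        frontier = next_wave
    return panel

# A frame of ~7,000 records, as in the Breda student register.
pop = list(range(7000))
panel = snowball(seeds=pop[:50], population=pop)
```

Each wave grows the panel through respondents' own contacts, which is what may reach people who would not join via a conventional invitation.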
The effect of monetary prepaid incentives on completion rate and data quality in internet surveys – A comparison of 5 different incentive modes
1Bielefeld University, Germany; 2University of Würzburg, Germany
(a) Relevance & Research Question
How can data quality and completion rates in online surveys be improved?
(b) Method & Data
We conducted an online experiment in which 1,750 students were randomly assigned to one of four treatment groups or a control group. Group 1 received a postal prenotification of the survey along with a prepaid voucher. Group 2 received a postal prenotification and a postpaid voucher. Group 3 received a postal prenotification and a prepaid 5 EUR bank note. Group 4 received a postal prenotification only. Group 5, the control group, was invited via e-mail. Dependent measures were completion rate, item nonresponse, straightlining, and willingness to self-report sensitive information.
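The randomization described above can be sketched as a single shuffle dealt into five equal groups. The student IDs are placeholders; the abstract specifies only the sample size (1,750) and the number of groups.

```python
import random

random.seed(1)

# Hypothetical IDs standing in for the 1,750 students in the frame.
students = [f"s{i:04d}" for i in range(1750)]

# Shuffle once, then deal into five equal-sized groups of 350:
# four treatment groups plus the e-mail-only control group.
random.shuffle(students)
groups = {g: students[g - 1::5] for g in range(1, 6)}
```

A single shuffle followed by systematic dealing guarantees equal group sizes, unlike independent per-person draws.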
(c) Results
Logit regression models show that Group 3 (prenotification and bank note) significantly outperformed all other groups with respect to completion: including a bank note roughly doubled the completion rate. There were no other significant differences between the treatment groups with regard to completion. Treatment had no effect on straightlining; the overall amount of straightlining, however, was low. Regression models for count data revealed item nonresponse to be lowest in Group 2 (prenotification and postpaid voucher) and Group 3 (prenotification and bank note). Most groups differed from one another in item nonresponse, whereby all treatment groups were superior to the control group. Finally, treatment had no significant effect on respondents' willingness to disclose sensitive information.
(d) Added Value
This is one of the first experiments to test the effect of a prepaid cash incentive on participation in an online study. We are able to show that including a bank note promotes completion and data quality, and we therefore advocate the use of prepaid cash incentives. As there were no differences in the willingness to self-report sensitive information, the different treatments seem to be neutral with respect to respondents' perceptions of anonymity.
Social desirability and self-reported health risk behaviors in web-based research: three longitudinal studies
1Maastricht University/CAPHRI, The Netherlands; 2University of Würzburg, Germany
Relevance & Research Question: These studies sought to investigate the relation between social desirability and self-reported health risk behaviors (e.g., alcohol use, drug use, smoking) in web-based research.
Methods & Data: Three longitudinal studies (Study 1: N = 5612, 51% women; Study 2: N = 619, 60% women; Study 3: N = 846, 59% women) were conducted among randomly selected members of two online panels (one Dutch, one German), using several social desirability measures (the Marlowe-Crowne Scale, the Balanced Inventory of Desirable Responding, and the Social Desirability Scale-17).
Results: Social desirability was not associated with self-reported current behavior or behavior frequency. Socio-demographics (age, sex, education) did not moderate the effect of social desirability on self-reported measures of health risk behaviors.
Added Value: The studies at hand provide no convincing evidence to cast doubt on the usefulness of the Internet as a medium for collecting self-reports on health risk behaviors.