
General Online Research 2011

March 14-16, 2011, Heinrich-Heine University of Düsseldorf


Conference Agenda

Overview and details of the sessions of this conference. Please select a date or room to show only sessions on that day or at that location. Please select a single session for a detailed view (with abstracts and downloads, if available).

 
Session Overview

A1: Respondents' Answer Behavior

Time: Tuesday, 15/Mar/2011: 10:30am - 11:30am
Session Chair: Martin Kornmeier

Presentations

Respondent Characteristics as Explanations for Uninformative Survey Response: Sources of Nondifferentiation in a Web-Panel

Lex Van Meurs1, Thomas Klausch2, Klaus Schönbach3

1Intomart GfK, The Netherlands; 2University of Utrecht; 3University of Vienna

Relevance & Research Question: Self-administered online surveys put respondents into an essentially anonymous and uncontrolled response situation. This raises concerns about potentially biased or uninformative answers, such as nondifferentiation (using the same score on all items offered), which may harm the measurement accuracy of population statistics. Our presentation explores which respondents are inclined to give such answers.

Methods & Data: For our study, longitudinal observations from a large commercial online survey panel in The Netherlands were available: the Appreciation Panel (fieldwork by Intomart GfK on behalf of NPO, the Dutch Public Broadcasting Organisation). Nondifferentiation behavior was identified in every single survey of the panel over a time frame of six months in 2009 (totaling 502,750 completed online questionnaires). In this way, a history of panel (nondifferentiation) behavior was created for each of over 7,700 active panel members. Subsequently, a cross-sectional online survey was designed to measure possible determinants of response behavior. The survey was conducted post hoc with a stratified probability sample of 1,200 respondents.

Results: Analyses based on data from a large-scale online panel indicate that respondents' perception of the effort a survey requires is not the only explanation for their behavior: more abstract social behavioral norms, individual moral obligations, and the norm of 'honest behavior' are also related to nondifferentiation. However, extrinsic motivation to participate in the panel because of a monetary incentive was found to be unrelated. These results imply that survey researchers have somewhat limited means to reduce the effects of factors causing uninformative behaviors. Using monetary incentives to encourage panel participation does not harm the quality of answers, but it is recommended to limit respondents' perception of effort.

Added Value: Very few examples have been published about nondifferentiation in applied online market research. The method presented offers an example of applied research into which respondents are inclined to give nondifferentiated responses and how nondifferentiation, in combination with other indicators such as response time, can be used to identify low-quality responses in online research.
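To make the screening logic concrete, here is a minimal sketch in Python (pandas) of how a nondifferentiation index might be combined with response time to flag low-quality responses. The data layout, column names, and cutoff values are illustrative assumptions, not taken from the paper.

import pandas as pd

# `items` is a hypothetical respondent-by-item matrix of Likert answers
# (e.g. 1-5); thresholds below are illustrative, not from the study.
def nondifferentiation_index(items: pd.DataFrame) -> pd.Series:
    # Share of items matching the respondent's modal answer;
    # 1.0 means the same score on every item (straightlining).
    modal = items.mode(axis=1)[0]
    return items.eq(modal, axis=0).mean(axis=1)

def flag_low_quality(items: pd.DataFrame, seconds: pd.Series,
                     nd_cutoff: float = 0.9,
                     fast_cutoff: float = 120.0) -> pd.Series:
    # Flag respondents who both straightline and answer implausibly fast.
    nd = nondifferentiation_index(items)
    return (nd >= nd_cutoff) & (seconds <= fast_cutoff)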

Van Meurs-Respondent Characteristics as Explanations-171.pptx

Effects of survey question clarity on data quality

Timo Lenzner

GESIS - Leibniz Institute for the Social Sciences, Germany

Relevance & Research Question: Many studies have found that the wording of a survey question can influence the answers that respondents provide. In particular, it has been shown that vague and ambiguous terms are often interpreted idiosyncratically by respondents and can thus introduce systematic bias into the survey data. In addition to ambiguity, the cognitive effort required to understand survey questions may affect data quality in a similar way. Earlier research identified several problematic text features (such as low-frequency words, left-embedded syntactic structures, and low syntactic redundancy) that reduce question clarity and make survey questions difficult to comprehend (e.g., Lenzner, Kaczmirek, & Lenzner, 2010). This paper extends the earlier findings and examines whether the effort required to comprehend survey questions affects data quality.

Methods & Data: An experiment was carried out in which respondents were asked to complete two Web surveys (N1=825, N2=515) at a two-week interval. Approximately half of the respondents answered questionnaires that included unclear, less comprehensible questions; the other half received control questions that were easier to comprehend. Indicators of data quality were drop-out rates, the number of non-substantive responses (“don’t know” answers), the number of neutral (midpoint) responses, and the over-time consistency of responses across the two surveys. In addition, respondents’ verbal intelligence and motivation were assessed to examine whether question clarity effects were moderated by these two respondent characteristics.
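As an illustration, the indicators named above could be computed per respondent roughly as follows; this is a sketch in Python (pandas), and the response codes and two-wave data layout are assumptions rather than details from the study.

import pandas as pd

DONT_KNOW = -9   # assumed code for a "don't know" answer
MIDPOINT = 3     # assumed midpoint of a 5-point scale

def quality_indicators(wave1: pd.DataFrame,
                       wave2: pd.DataFrame) -> pd.DataFrame:
    # Per-respondent counts of non-substantive and midpoint answers,
    # plus over-time consistency across two waves of identical items.
    return pd.DataFrame({
        "n_dont_know": wave1.eq(DONT_KNOW).sum(axis=1),
        "n_midpoint": wave1.eq(MIDPOINT).sum(axis=1),
        # share of items answered identically in both waves
        "consistency": wave1.eq(wave2).mean(axis=1),
    })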

Results: As expected, respondents receiving unclear questions provided lower-quality responses than respondents answering more comprehensible questions. Moreover, some of these effects were more pronounced among respondents with limited verbal skills and among respondents with low motivation to answer surveys.

Added Value: These findings indicate that survey results can be systematically biased if questions are difficult to understand and exceed the processing effort that respondents are willing or able to invest. Making it easy for respondents to retrieve the meaning of a survey question seems to be an important requirement for obtaining high-quality answers.

Lenzner-Effects of survey question clarity on data quality-111.pdf

Speeders in Online Value Research: Cross-checking results of fast and slow respondents in two separate samples answering the 40-item "Portrait Value Questionnaire"

Tilo Beckers1, Pascal Siegers2, Anabel Kuntz2

1Heinrich-Heine-Universität Düsseldorf / University of Düsseldorf, Germany; 2Universität zu Köln / University of Cologne, Germany

(a) Relevance & Research Question:

Social scientists are often reluctant to rely on data from online access panels or other web surveys because they fear that the general data quality may be seriously flawed and the results thus neither valid nor reliable. A third major concern, representative sampling, is not always important, e.g. in the case of experimental designs or studies used only for analyzing relationships as opposed to comparing distributions and mean values. Our research question is whether speeders in online surveys jeopardize results due to a lack of validity and reliability.

(b) Methods & Data:

In a research project on basic human values, fielded in 2010, we implemented an online questionnaire and gathered data using the ‘Unipark’ online access panel. The 20-minute instrument includes, among other measures, the 40-item Schwartz Portrait Value Questionnaire (PVQ), presented over five consecutive pages. We use confirmatory factor analyses (CFA) and structural equation models (SEM) to analyze our data.
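To illustrate the modeling step, the following minimal sketch fits a two-factor CFA on PVQ-style items with the Python package semopy; the package choice, the item-to-factor assignments, and the placeholder data are assumptions, since the abstract does not name the software used.

import numpy as np
import pandas as pd
from semopy import Model

# Random placeholder data standing in for real PVQ answers (assumption).
rng = np.random.default_rng(0)
cols = ["pvq10", "pvq26", "pvq37", "pvq2", "pvq17", "pvq39"]
df = pd.DataFrame(rng.integers(1, 7, size=(750, len(cols))), columns=cols)

# lavaan-style measurement model: two of the ten PVQ value factors,
# with illustrative item assignments.
DESC = """
hedonism =~ pvq10 + pvq26 + pvq37
power =~ pvq2 + pvq17 + pvq39
"""

model = Model(DESC)
model.fit(df)
print(model.inspect())  # factor loadings and variance estimates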

(c) Results:

When analyzing the data, we encountered hardly any item nonresponse but discovered many comparatively fast respondents when applying different speed thresholds. To cross-check our results, we compared the results of CFA and SEM from two separate surveys (each n > 750) based on independent samples, and we compared models from respondents at different speeds. We found a high degree of homogeneity between the two samples and between the models for slow respondents and speeders, while controlling for straightliners on our Likert scales.
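A minimal sketch of the two screening steps described above, in Python (pandas); the relative speed threshold is an assumption (a common convention is to flag respondents faster than some fraction of the median completion time), as the abstract does not state its exact cutoffs.

import pandas as pd

def flag_speeders(seconds: pd.Series, fraction: float = 0.5) -> pd.Series:
    # True for respondents faster than `fraction` of the median duration.
    return seconds < fraction * seconds.median()

def flag_straightliners(items: pd.DataFrame) -> pd.Series:
    # True when a respondent gives the same Likert score on every item.
    return items.nunique(axis=1).eq(1)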

(d) Added Value:

The results indicate that value research may profit from online access panels, both to pretest complex instruments and to validate results, e.g. by applying structural equation models. Although most online panels do not yet provide samples of representative quality, the tentative results of our fundamental research indicate that speeders do not necessarily jeopardize the quality of the data and thus the validity and reliability of results. This is an important insight for social scientists who usually rely on cost-intensive traditional PAPI and CATI data collection.

Beckers-Speeders in Online Value Research-172.pdf

 