Conference Agenda

Overview and details of the sessions of this conference. Please select a date or room to show only sessions on that day or at that location. Select a single session for a detailed view (with abstracts and downloads, if available).

Session Overview
C 6: GOR Thesis Award 2015 Competition II: Bachelor/Master
Thursday, 19/Mar/2015:
17:00 - 18:00

Session Chair: Meinald T. Thielsch, University of Muenster
Session Chair: Frederik Funke, LINK Institut
Location: Room 154
Fachhochschule Köln/ Cologne University of Applied Sciences
Claudiusstr. 1, 50678 Cologne


Sexist Comments in Online Social Networks. How the Degrees of Interpersonal Familiarity and Social Costs Affect the Targets’ Private and Public Responses

Anja Katrin Munder

Westfälische Wilhelms-Universität Münster, Germany

Due to the widespread popularity and varied features of social media sites such as Facebook and Twitter, simulations of these platforms are promising tools for experimental psychological studies. Well implemented, they can combine high levels of standardisation and validity with the economy of online implementation. Research in social psychology in particular lends itself to simulations of social media platforms, as interaction with other people is pivotal to social media (e.g. chatting, viewing profile pages, reading and reacting to postings).

The present study developed and used a simulation of the online social network Facebook to investigate the cognitive, affective and behavioural reactions of a person who is the target of sexist discrimination. More specifically, the aim was to examine how interpersonal familiarity between the target and the discriminating person affects these reactions.

Previous research has identified several factors that influence the attributional processes, affective responses, and public reactions of people who experience discrimination. However, the potentially important factor of interpersonal familiarity has not yet been examined. To address this question, 220 participants completed a simulation of Facebook in which they were presented with postings (status updates, pictures, links to videos etc.) published either by fictional users indicated as their friends or by unknown users. This was achieved by presenting constructed screenshots in an online survey platform. Cognitive reactions (attributions), affective states (psychological well-being) and behavioural reactions (commenting on a posting, clicking the "like" button etc.) were collected with the question formats provided by the survey platform. The manipulation of interpersonal familiarity was successful: participants rated the familiarity of fictional users indicated as friends significantly higher than that of users indicated as unknown.

Participants who experienced discrimination from a friend were less likely to recognize it as discrimination, but more likely to confront their fictional counterpart about it, than participants who experienced discrimination from a stranger. Furthermore, recognizing the discrimination as such protected participants’ well-being when the discriminating person was a stranger, but not when the discriminating person was a friend.

Previous studies have also found that people tend to refrain from confronting discriminating behaviour because of potential negative social consequences (social costs). The present simulation, however, failed to replicate the inhibiting effect of social costs on public confrontation, likely due to methodological shortcomings.

Practical consequences and implications for further research on interpersonal familiarity in the context of discrimination are discussed. The attempt to use a simulation of the online social network Facebook to manipulate interpersonal familiarity and social costs was partly successful: while interpersonal familiarity was manipulated successfully, social costs were not. Participants indicated that they could immerse themselves reasonably well in the simulation and that their answers were quite similar to their real behaviour on Facebook. Developing simulations of social media platforms for further psychological research, especially in social psychology, therefore seems promising. Important issues to consider are the effective implementation of experimental manipulations, the technical options (e.g. creating a more interactive interface than screenshots), and the equivalence to face-to-face interactions.

Predicting Response Times in Web Surveys

Alexander Wenz

University of Essex, United Kingdom

Relevance & Research Question:

Survey length is an important factor that researchers have to consider when designing questionnaires. Longer interviews are assumed to impose greater cognitive burden on respondents, which may have a negative impact on data quality. In the course of long surveys, respondents may become fatigued from answering questions and may be more likely to use satisficing response strategies to cope with the cognitive demands. Furthermore, longer surveys increase the costs of questionnaire programming and interviewing.

Despite the impact of interview duration on data quality and costs, survey designers are often uncertain about the length of their survey and only apply rules of thumb, if any, to predict survey length.

The research project presented in this article investigates how item properties and respondent characteristics influence item-level response times to web survey questions. It builds on the response time analysis by Yan and Tourangeau (2008) and other studies of response times and interview duration, and examines whether their findings can be replicated with a different dataset. Finally, the development of a tool for response time prediction is discussed as a possible use of the results.

Methods & Data:

The analysis is based on data from the GESIS Online Panel Pilot, a probability-based online panel of German-speaking, Internet-using adults living in Germany. The survey is well suited to studies of response times because it contains a large variety of question types on multiple topics.

Response times to survey items were captured in the respondent’s web browser (client side) by implementing JavaScript code on each survey page. In contrast to response times collected at the web server, client-side response times are more precise measures of the response process because they do not include download times.
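The advantage of client-side over server-side timing can be illustrated with a minimal arithmetic sketch (all timestamps and variable names are hypothetical, not taken from the GESIS implementation):

```python
# Hypothetical timestamps (in seconds) for one survey page.
# Server-side timing spans the full request/response cycle and
# therefore absorbs the page download time; client-side timing
# starts only once the page has rendered in the browser.

page_sent_by_server = 100.0   # server transmits the page
page_rendered = 101.5         # browser finishes rendering (download took 1.5 s)
answer_submitted = 109.5      # respondent submits the answer

server_side_rt = answer_submitted - page_sent_by_server  # inflated by download time
client_side_rt = answer_submitted - page_rendered        # actual response time

print(server_side_rt, client_side_rt)  # 9.5 8.0
```

The 1.5-second gap between the two measures is pure transmission overhead, which is exactly the component client-side measurement removes.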

Multilevel models are applied to take into account that item-level response times are cross-classified by survey questions and respondents. Assuming that the effect of respondent characteristics is constant over items and the effect of item properties is constant across respondents, a set of random-intercept, fixed-slope models is fitted. Starting from an unconditional model without any covariates, predictors on the respondent level and the item level, as well as cross-level interactions, are successively included as fixed effects to account for the observed variation in response times.
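The cross-classified structure can be made concrete with a small simulation in plain Python (a minimal sketch assuming Gaussian respondent and item effects; it illustrates the data structure, not the multilevel estimation actually used in the study):

```python
import random
import statistics

random.seed(1)

# Simulate response times that are cross-classified: every respondent
# answers every item, and each observation is the sum of a respondent
# effect, an item effect, and residual noise.
n_resp, n_items = 200, 30
resp_eff = [random.gauss(0, 1.0) for _ in range(n_resp)]   # respondent-level sd 1.0
item_eff = [random.gauss(0, 0.5) for _ in range(n_items)]  # item-level sd 0.5
rt = [[5.0 + resp_eff[r] + item_eff[i] + random.gauss(0, 0.3)
       for i in range(n_items)] for r in range(n_resp)]

# Method-of-moments check: the variance of respondent means mostly
# reflects respondent-level variance, and the variance of item means
# mostly reflects item-level variance.
resp_means = [statistics.mean(row) for row in rt]
item_means = [statistics.mean(rt[r][i] for r in range(n_resp)) for i in range(n_items)]

var_resp = statistics.variance(resp_means)
var_item = statistics.variance(item_means)
print(round(var_resp, 2), round(var_item, 2))
```

A multilevel model with crossed random intercepts recovers these two variance components jointly with the fixed effects, rather than by the crude averaging used here.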


Results:

The analysis shows that the respondent’s age, education, Internet experience and motivation are significant predictors of response times. Respondents who are younger than 65, have A-levels or vocational A-levels, use the Internet frequently and are less motivated to participate need less time to complete items in web surveys.

Survey participants using smartphones or tablets to complete the survey have longer response times than participants using desktop computers or laptops. Since the questionnaire of the GESIS Online Panel Pilot was not optimised for mobile devices, mobile respondents may have had problems reading questions or selecting the appropriate response options.

Among item properties, the complexity of questions and the format of response options were found to have an impact on response times: Survey items with long question texts, many response options and open-ended questions are associated with longer response times, compared to less complex questions with closed-ended response formats.

A comparison of response times to a survey evaluation item that was asked at the end of each wave showed that respondents become faster over the course of the panel study.

Several results were contrary to prior expectations: the response times of respondents with survey experience, who completed at least one survey in the previous year, do not differ significantly from those of inexperienced participants. The position of an item within the questionnaire does not significantly influence its response time, which implies that respondents do not speed up in the course of the survey. Factual questions, not attitude questions, induce the longest response times of all question types. Moreover, sensitive items are completed faster than items not dealing with sensitive topics, possibly because participants answer sensitive items less thoroughly or tend to skip them. None of the cross-level interaction effects were significant predictors of response times.

Apart from substantive variables, a set of paradata variables describing the process of questionnaire navigation was included in the model to account for variation in response times. Respondents who scroll horizontally or vertically on a survey page need more time to complete the item on that page. Assuming that response times are an indicator of respondent burden, this finding implies that survey pages should be adapted to the respondent’s screen size to avoid scrolling. Leaving the survey window, for example to access another webpage in the browser, or revisiting survey pages to edit a response, also affects response times.

Added Value & Limitations:

The present analysis replicated many findings from previous studies on response times using a probability-based online panel in which response times were collected on the client side. However, some results were not in line with previous research and need to be investigated in future studies.

Beyond replication, the study contributes to existing research by examining response times in the context of a panel study and demonstrating that respondents speed up across survey waves. Furthermore, the analysis indicated that paradata describing the process of questionnaire navigation are significant predictors of response times and should be collected and controlled for in future response time analyses.

The limitations of the present study are the very low response rate of the GESIS Online Panel Pilot and the small number of observations compared to previous studies on response times. Although the sample sizes are sufficiently large to estimate multilevel models, the statistical power of the fitted models may be reduced. Moreover, the nesting of survey questions within waves, in addition to the cross-classified structure of survey items and respondents, has not been considered due to the small group size at the level of survey waves. To address these shortcomings, response time data from online panels with larger sample sizes and more survey waves need to be analysed.

Website Evaluation at Different Phases of Website Use

Leonie Flacke

Westfälische Wilhelms-Universität Münster, Germany

Relevance & Research Question: Every day, the average German Internet user spends nearly three hours on the Internet (van Eimeren & Frees, 2014). Considering the myriad of websites the Internet contains, it becomes obvious that users are forced to make a rigorous selection of which websites to visit. Thus, as a website owner, you would need to instantly convince users to attend to your website rather than to one of your very many competitors. But how would you evoke your users’ interest? What makes them stay on your website, or even revisit and recommend it? Which aspects of your website are imprinted in their memory?

Previous research has identified three core constructs that influence users’ perception and evaluation of websites: content, usability, which can be further split into subjective and objective usability (Hornbæk, 2006), and aesthetics (Cober, Brown, Levy, Cober, & Keeping, 2003; Hartmann, De Angeli, & Sutcliffe, 2008; Thielsch, Blotenberg, & Jaron, 2014). However, little is known about their interplay. Thielsch et al. (2014) developed a path model concentrating on the factors’ effects at four different phases of website use. They found aesthetics to be the critical factor influencing both the first and the overall impression, whereas the content of the website turned out to be decisive for the intention to revisit and to recommend the website.

The aim of the current study was to extend the results of Thielsch et al. (2014) by additionally measuring objective usability (as recommended by, e.g., Hornbæk, 2006; Lee & Koubek, 2010) and by distinguishing between the immediate and the deliberate first impression (Leder, Belke, Oberst, & Augustin, 2004; Thielsch et al., 2014). A second time of measurement was implemented to investigate the long-term effect of the factors content, subjective usability and aesthetics. Despite the importance of long-term investigation (Mittal, Kumar, & Tsiros, 1999; Karapanos, Zimmermann, Forlizzi, & Martens, 2009), research in the field of website evaluation is still at an early stage. Therefore, the question of how the importance of content, subjective usability and aesthetics changes over time was investigated in an exploratory manner.

Methods & Data: As mentioned above, the online study comprised two times of measurement. Participants were recruited via the German online panel ‘PsyWeb’. At the first time of measurement (T1), 306 participants took part, each randomly assigned to one of ten websites. Their age ranged from 16 to 70 years (M = 44.19, SD = 15.09); 163 participants (53.29%) were female, 142 (46.41%) were male, and one participant declined to state their gender. On average, participants had used the Internet for 14.53 years (SD = 4.80). The different phases of website use were operationalized by different presentation durations of the website (immediate first impression: 1 sec, deliberate first impression: 10 sec, overall impression: unlimited, intention to recommend and intention to revisit: website already faded out). Participants rated their impression of the website at the different phases of website use (dependent variables) as well as the independent variables content, subjective usability and aesthetics via different questionnaires, for instance the WWI + G (content; e.g. Thielsch et al., 2014), the WWU (subjective usability; Moshagen, Musch, & Göritz, 2009), the SUS (subjective usability; Brooke, 1996; translation by Rauer, 2011) and the VisAWI (aesthetics; Moshagen & Thielsch, 2010). Objective usability, by contrast, was operationalized by two interactive tasks, for which task accuracy served as the criterion.

The design of the second time of measurement (T2) was similar to T1, except that objective usability could not be measured again. At T2, 224 participants (drop-out rate: 26.80%) continued with the study, on average five days after T1 (M = 118.32 hours). They were not able to view the website again and were therefore asked to judge the website’s content, subjective usability and aesthetics from memory, using the same questionnaires as at T1.

Results: The impact of the factors at the five phases of website use was estimated with structural equation modeling (SEM). Aesthetics was the major factor influencing both the immediate and the deliberate first impression, followed by content. Regarding the overall impression, aesthetics was again the decisive factor, albeit more closely followed by subjective usability and content. By contrast, content had by far the largest effect on the intentions to revisit and to recommend the website. The pattern at T2 was the same: aesthetics remained the most important factor for the overall impression, whereas content was the sole decisive factor for both the intention to recommend and to revisit the website.
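The kind of comparison such a model makes can be illustrated with a heavily simplified sketch: ordinary least squares on simulated ratings rather than the study’s actual SEM or data. All weights below are invented solely to mirror the reported ordering, with aesthetics dominating the overall impression:

```python
import random

random.seed(2)

n = 2000
# Simulated standardized predictor ratings (hypothetical data).
aesthetics = [random.gauss(0, 1) for _ in range(n)]
content = [random.gauss(0, 1) for _ in range(n)]
# Invented generating weights: aesthetics dominates, content follows.
overall = [0.6 * a + 0.3 * c + random.gauss(0, 0.5)
           for a, c in zip(aesthetics, content)]

def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

# Two-predictor OLS via the normal equations (Cramer's rule).
saa, scc = dot(aesthetics, aesthetics), dot(content, content)
sac = dot(aesthetics, content)
say, scy = dot(aesthetics, overall), dot(content, overall)
det = saa * scc - sac * sac
b_aesthetics = (say * scc - sac * scy) / det
b_content = (saa * scy - sac * say) / det
print(round(b_aesthetics, 2), round(b_content, 2))
```

The recovered coefficients approximate the generating weights, so the fitted ordering (aesthetics above content for the overall impression) matches the pattern reported above; the full SEM additionally models latent constructs and all five outcome phases simultaneously.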

Added Value: Website owners should not underestimate the importance of aesthetics, since this is the factor that evokes users’ interest first. Its significant role is underscored by the T2 results, where it was still the decisive factor for the overall impression: users mainly remember the aesthetics when judging their overall impression from memory. Thus, aesthetics is not merely nice to have but a must-have, since it does indeed have a long-term effect. If you want users to recommend your website to others and to revisit it, you should pay special attention to its content. The results of T1 can be related to the Elaboration Likelihood Model (Petty & Cacioppo, 1986): people initially process the website via the peripheral route, mainly taking aesthetics (a superficial factor) into account when forming the immediate and deliberate first impression. As the judgment of the overall impression demands more complex engagement and interaction with the website, the central route gains importance, even if aesthetics remains the decisive factor. For the intention to revisit and to recommend the website, content becomes the significant factor; people now process the website via the central route. The results of T2 confirm the strong effect of content on the two intentional outcomes. Additionally, this study provides further evidence for the need to distinguish between subjective and objective usability: even if a website owner builds a website with perfect objective usability, users may still find it difficult to use.

Contact and Legal Notice · Contact Address:
Conference: GOR 15
Conference Software - ConfTool Pro 2.6.76
© 2001 - 2014 by H. Weinreich, Hamburg, Germany