Conference Agenda

Overview and details of the sessions of this conference.

Session Overview
Session
C2: Misinformation
Time:
Thursday, 09/Sept/2021:
2:00 - 3:00 CEST

Session Chair: Anna Rysina, Kantar GmbH, Germany

Presentations

Emotional framing and the effectiveness of corrective information

Pirmin Stöckle

University of Mannheim, Germany

Relevance & Research Question:

Concerns about various forms of misinformation and their fast dissemination through online media have generated considerable interest in ways to effectively correct false claims. An under-explored mechanism in this research is the role of distinct emotions. How do emotional appeals interact with corrective information? Specifically, I focus on the emotion of disgust, which has been shown to be linked to the moralization of attitudes, which in turn reduces the impact of empirical evidence on attitudes and makes compromise less likely. Substantively, I investigate the issue of genetically modified (GM) food. I hypothesize (i) that emotionally framed misinformation induces disgust and moralizes attitudes towards GM food, (ii) that this effect endures in the face of a neutral correction even if the factual misperception itself is corrected, and (iii) that an emotional counter-frame reduces this enduring effect of the original frame.

Methods & Data:

I implement a pre-registered survey experiment within a panel study based on a probability sample of the general population in Germany (N ≈ 4,000). The experiment follows a between-subjects 3 x 3 factorial design manipulating both misinformation (none, low-emotion frame, high-emotion frame) and corrective information (none, neutral, emotional counter-frame). The informational treatments consist of fabricated but realistic online news reports based on the actual case of a later retracted study claiming to find a connection between GM corn and cancer. As outcomes, I measure factual beliefs about GM food safety, policy opinions, moral conviction, and emotional responses to GM food.
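As a rough illustration of the assignment step only (not taken from the study's materials), the following Python sketch randomly allocates respondents to the nine cells of the 3 x 3 between-subjects design; the factor labels are hypothetical shorthand for the treatments described above.

import random

# Hypothetical labels for the two experimental factors described in the abstract
MISINFORMATION = ["none", "low_emotion_frame", "high_emotion_frame"]
CORRECTION = ["none", "neutral_correction", "emotional_counter_frame"]

def assign_conditions(n_respondents, seed=42):
    """Independently assign each respondent to one of the 9 cells."""
    rng = random.Random(seed)
    return [
        {"misinformation": rng.choice(MISINFORMATION),
         "correction": rng.choice(CORRECTION)}
        for _ in range(n_respondents)
    ]

# With N of roughly 4,000 panel respondents, each cell receives about 444 respondents in expectation
conditions = assign_conditions(4000)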

Results: - not yet available -

Added Value:

In the view of many scientists, genetic engineering offers avenues with large potential benefits, which may be impeded by public resistance that possibly originates from misleading claims easily disseminated through online media. Against this background, this study provides evidence on the effect of emotionally charged disinformation on perceptions of GM food and on ways to effectively correct false claims. From a broader perspective, these results inform further studies and policy interventions on other issues where disinformation draws on strong emotions, ranging from social policy and immigration to health interventions such as vaccinations.



Forwarding Pandemic Online Rumors in Israel and in Wuhan, China

Vered Elishar-Malka1, Shuo Seah2, Dana Weimann-Saks1, Yaron Ariel1, Gabriel Weimann3

1Academic College of Emek Yezreel; 2Huazhong University of Science and Technology, China; 3University of Haifa

Relevance and research question: Starting in the last quarter of 2019, the COVID-19 virus led to an almost unprecedented global pandemic with severe socioeconomic and political implications and challenges. As in many other large-scale emergencies, the media have played several crucial roles, among them as a channel of rumormongering. As social media have penetrated our lives, they have become the central platform for spreading and sharing rumors, including rumors about the COVID-19 pandemic. Based on the Theory of Planned Behavior and the Uses and Gratifications theory, this study explored the factors that affected social media users' willingness to spread pandemic-related rumors in Wuhan, China, and in Israel, via each country's leading social media platform (WeChat and WhatsApp, respectively).

Methods and data: We tested a multivariate model of factors that influence the forwarding of COVID-19 online rumors. Using an online survey conducted simultaneously in both countries between April and May 2020, 415 WeChat and 503 WhatsApp users reported their patterns of exposure to and spread of COVID-19 rumors. As part of the questionnaire, users were also asked to report their motives for doing so.
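A minimal sketch of such a model, assuming hypothetical variable names and a hypothetical data file (the abstract does not document the actual specification):

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey data; column names are placeholders for the constructs in the abstract
df = pd.read_csv("survey_wuhan_israel.csv")

# Regression of willingness to forward rumors on the predictors named in the abstract
model = smf.ols(
    "forward_willingness ~ personal_needs + negative_emotions"
    " + info_gathering + credibility",
    data=df,
).fit()
print(model.summary())

# The same specification could be estimated separately for the WeChat (Wuhan)
# and WhatsApp (Israel) samples to compare which predictors are significant.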

Results: The main result was that in Wuhan, personal needs, negative emotions, and the ability to gather information significantly predicted willingness to forward rumors; rumors' credibility was also a significant predictor in the regression model. In Israel, by contrast, only the first two predictors, personal needs and negative emotions, were significant. The best predictor in Wuhan was personal needs, while the best predictor in Israel was negative emotions.

Added value: This study's findings demonstrate the significant roles that WeChat and WhatsApp, the leading social media in China and Israel, respectively, play in local users' lives during a severe national and global crisis. Despite the major differences between the two societies, several interesting similarities were found: in both cases, individual impetuses, shaped by personal needs and degree of negative feelings, were the leading motives behind spreading rumors over social networks. These findings may also help health authorities in planning the right communication strategies during similar situations.



Acceptance or Escape: A Study on the Embrace of Correction of Misinformation on YouTube

Junmo Song

Yonsei University, Korea, Republic of (South Korea)

Relevance & Research Question:

YouTube is one of the most important channels for producing and consuming political news in South Korea. Because the platform does not play an active gatekeeping role, not only traditional media outlets but also Internet-based new media and individual news producers can freely provide news.

In 2020, rumors of North Korean leader Kim Jong-un's death were reported indiscriminately by both traditional media and individual channels, but a definitive correction was issued at the national level. This study therefore uses this case as a kind of natural experiment to explore responses to correction.

This study aims to analyze how producers and audiences respond differently when false information circulating on the YouTube platform is corrected and when it is not. Ultimately, this study seeks to explore the conditions under which the correction of misinformation accelerates or alleviates political radicalization.

Methods & Data:

Videos and comments were collected from the top 437 Korean channels in the Politics/News/Social category on YouTube, using the YouTube API provided by Google. Channels were then classified into two groups: traditional media and new media, including individual channels. In addition, the political orientation of comments was classified as progressive or conservative through supervised learning, as sketched below.
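For illustration only, the following sketch shows how comments might be retrieved with the YouTube Data API and how their political orientation might be classified with a simple supervised model; the API key, video ID, training examples, and model choice are placeholders, not the study's actual pipeline.

from googleapiclient.discovery import build
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

youtube = build("youtube", "v3", developerKey="YOUR_API_KEY")  # placeholder key

def fetch_comments(video_id, max_results=100):
    """Return the text of top-level comments for one video."""
    response = youtube.commentThreads().list(
        part="snippet", videoId=video_id,
        maxResults=max_results, textFormat="plainText",
    ).execute()
    return [
        item["snippet"]["topLevelComment"]["snippet"]["textDisplay"]
        for item in response.get("items", [])
    ]

# Hypothetical labeled comments for supervised training (progressive vs. conservative)
train_texts = ["example progressive comment", "example conservative comment"]
train_labels = ["progressive", "conservative"]

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
classifier.fit(train_texts, train_labels)

comments = fetch_comments("VIDEO_ID")  # placeholder video ID
predictions = classifier.predict(comments)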

Results:

In a pilot analysis, the number of comments generally decreased after the correction on both media and individual channels. In particular, the number of comments on conservative individual channels decreased drastically.

In addition, after the misinformation was corrected, the difference in political orientation between comments on individual channels and those on media outlets significantly decreased or disappeared.

However, existing conservative users did not change their opinions in response to the correction of the misinformation; rather, they were observed to move on immediately to other issues and consume content there.

Added Value:

YouTube has received relatively little attention in political communication research compared to other platforms such as social networking sites and online communities. This study examines how misinformation and its correction are received in a political context through the case of Korea, where YouTube has a profound influence on politics.


