Neuroimaging study provides insight into misinformation sharing among politically devoted conservatives
New research suggests that the spread of misinformation among politically devoted conservatives is driven by identity-related motives and may be resistant to fact-checks. These individuals tend to prioritize sharing information that aligns with their group identity, regardless of its accuracy. The research, published in the Journal of Experimental Psychology: General, used behavioral tasks and neuroimaging to examine the psychological processes underlying this behavior.
Social media has become a major source of news for many adults, but malicious agents are using such platforms to spread misinformation to larger audiences faster than ever before. Online misinformation can have serious real-world consequences, such as fueling political polarization, threatening democracy, and reducing vaccination intentions. Thus, the researchers wanted to understand the psychological processes behind the sharing of misinformation and explore potential interventions to counteract its spread.
“In the past, I had been working on extremism and ‘will to fight’ among supporters of Salafi-jihadist groups. Even though I found those groups very interesting to study, there was a cross-cultural barrier that made it hard for me to have intuitions about where they came from and what motivated them,” said study author Clara Pretus, an assistant professor at the Universitat Autònoma de Barcelona and principal investigator of the Social Brain Lab.
“Thus, I decided to leverage my knowledge of extremism to study far-right supporters. This is a group that I am more familiar with given that we share the same national and cultural context. Therefore, it’s easier for me to generate hypotheses about what their motives are and predict how they will respond to a given experimental condition. Far-right parties are also on the rise across Europe, which makes the question of what drives people to support these parties more pressing in the current political context.”
Pretus and her colleagues conducted a series of three experiments to investigate how political devotion, specifically in terms of sacred values and identity fusion, influences the spread of misinformation among conservative partisans.
In Experiment 1, the researchers recruited Spanish far-right voters and center-right voters through an online panel. They asked participants to rate the likelihood of sharing social media posts related to conservative sacred values (immigration, nationalism, and family values) and nonsacred values (roads and infrastructure, foreign affairs, and waste management). The posts were designed to look like real tweets and included critiques of the current liberal government. Participants were divided into three groups, each exposed to a different fact-check in an experimental block: the Twitter fact-check, an accuracy-based fact-check, or a media literacy-based fact-check.
The Twitter fact-check was designed to mimic the fact-checking labels that Twitter applies to certain tweets that contain disputed or misleading information. The label read, “This claim about… is disputed.” The accuracy-based fact-check involved a straightforward question about the truthfulness of the social media posts. The media literacy-based fact-check aimed to stimulate participants’ critical thinking about the techniques used to attract attention in social media posts.
The researchers assessed the participants’ value sacredness and identity fusion with their respective political parties. They also measured participants’ analytical thinking styles, scientific curiosity, intellectual humility, and media literacy.
The researchers found that far-right voters were more likely to share misinformation than center-right voters, especially when the misinformation was relevant to their sacred values. Identity fusion with a political party also predicted a higher likelihood of sharing misinformation, regardless of whether the content was related to sacred or nonsacred issues. However, popular interventions like fact-checks and accuracy nudges did not significantly reduce the likelihood of sharing misinformation among either far-right or center-right voters in Spain.
In Experiment 2, the researchers sought to replicate the effects of sacred values and identity fusion on sharing misinformation, this time among Republicans in the United States. They recruited participants who had voted for Republican Donald J. Trump in the previous two presidential elections. As in Experiment 1, the participants rated the likelihood of sharing social media posts related to sacred and nonsacred values. Half of the participants were exposed to the Twitter fact-check in an experimental block, while the other half saw no fact-checks and served as a control group. The researchers also measured value sacredness, identity fusion with the Republican party, and identity fusion with Trump specifically.
The researchers found that Republicans fused with Trump were more likely to share misinformation than other Republicans, particularly when the misinformation was related to sacred values. Identity fusion with the Republican party also predicted a higher likelihood of sharing misinformation, regardless of the values involved.
The Twitter fact-check had a small but significant effect in reducing the likelihood of sharing misinformation among Republicans, but it did not have an effect on Republicans fused with Trump. These participants showed resistance to interventions against misinformation, even when the misinformation was perceived as implausible by other Republicans.
“One of the main accounts of why people share misinformation is that they don’t pay attention to information accuracy,” Pretus told PsyPost. “In this set of experiments, we find that this isn’t the case for conservatives and far-right supporters, who are responsible for sharing most misinformation on the internet. Specifically, we find that conservatives and far-right supporters in Spain, as well as Republicans in the United States, are more likely to share political messages that appeal to core partisan values (e.g., immigration) and are resistant to different fact-checking strategies. This means that misinformation sharing is driven by partisan motives among these populations and it’s therefore hard to counteract using available interventions against misinformation such as accuracy nudges.”
In Experiment 3, the researchers aimed to understand the neural activity and functional connectivity underlying the spread of misinformation related to sacred values among far-right partisans using social media. They recruited 36 far-right partisans who supported the party “Vox” and were 18 years or older. Participants underwent functional magnetic resonance imaging (fMRI) while completing a task where they had to rate the likelihood of sharing social media posts related to sacred and nonsacred values.
During the fMRI task, participants were shown social media posts (simplified versions of those used in previous experiments) and were asked to indicate how likely they would be to share each post. The posts were related to either sacred values (e.g., immigration) or nonsacred values. Some of the posts also included fact-checks, specifically the Twitter fact-check used in previous experiments.
The findings revealed that when participants were exposed to misinformation relevant to sacred values, certain brain regions were more active compared to when they were exposed to misinformation related to nonsacred values. These brain regions included the dorsomedial prefrontal cortex, bilateral inferior frontal cortex, and precuneus. These areas are associated with theory of mind (understanding others’ mental states) and social norm compliance, which may play a role in how people evaluate and respond to in-group messages that appeal to sacred values.
“When it comes to sharing messages on core partisan values, we also find widespread neural activity in brain regions that help people guess others’ mental states and respond in line with group norms,” Pretus told PsyPost. “This suggests that sharing misinformation on core partisan values has an important social signaling function, that allows group members to show others that they belong. Therefore, it’s critical for group members to be socially accurate in their sharing behavior so that they are in tune with the group.”
Moreover, a functional connectivity analysis showed stronger communication between brain networks involved in cognitive control and social cognition when participants engaged with misinformation related to sacred values.
“I was surprised at the widespread brain activation we found among far-right supporters when exposed to political messages on core partisan values compared to other partisan values,” Pretus explained. “When comparing such closely related experimental conditions in a functional neuroimaging study, such as messages on core partisan values vs. other partisan values, the resulting brain activity is often very localized and minor.
“Here, we find very strong and widespread activation across neural networks that help people manage social relationships. This suggests that decisions involving core partisan values fulfill a very important socializing function, which could have an adaptive value in the primitive environments where our brains evolved.”
However, Pretus noted that “one of the main limitations is that we did not have a control group in the neuroimaging study. Therefore, we don’t know if the brain response to sharing messages on partisan core values is unique to far-right supporters or we could maybe also find it among far-left supporters, or even just among any type of partisans dealing with partisan core values.”
The study, "The Role of Political Devotion in Sharing Partisan Misinformation and Resistance to Fact-Checking," was authored by Clara Pretus, Camila Servin-Barthet, Elizabeth A. Harris, William J. Brady, Oscar Vilarroya, and Jay J. Van Bavel.
2023/07/30
© PsyPost