Thursday, November 30, 2023

 

Climate: why disinformation is so persistent


A UNIGE team has tested six psychological interventions to combat climate misinformation. The study shows how difficult it is to counter these messages, which are resistant to scientific information.


Peer-Reviewed Publication

UNIVERSITÉ DE GENÈVE




Melting glaciers, rising sea levels, extreme heat waves: the consequences of climate change are more visible than ever, and the scientific community has confirmed that humans are responsible. Yet studies show that a third of the population still doubts or disputes these facts. The cause is disinformation spread by certain vested interests. To counter this phenomenon, a team from the University of Geneva (UNIGE) developed and tested six psychological interventions on nearly 7,000 participants from twelve countries. The research, published in the journal Nature Human Behaviour, highlights the extremely persuasive nature of disinformation and the need to strengthen our efforts to combat it.


Fighting disinformation about climate change is a major challenge for society. Although the scientific consensus on human responsibility, reaffirmed by the sixth assessment report of the Intergovernmental Panel on Climate Change (IPCC), has been established for decades, a third of the population still doubts or disputes it. This phenomenon can be explained by the disinformation spread by certain companies and lobbies over the last 50 years.


“For instance, these messages can take the form of an unfounded questioning of the scientific consensus or an overestimation of the socio-financial burden of climate policies,” explains Tobia Spampatti, a PhD Student and Teaching and Research Assistant in the Consumer Decision and Sustainable Behavior Lab (CDSB Lab) at the Faculty of Psychology and Educational Sciences and at the Swiss Center for Affective Sciences of the UNIGE.


Many psychological factors

This phenomenon weakens the support of part of the population for climate policies. To combat it, Tobia Spampatti and researchers from the UNIGE developed a theoretical framework describing how (anti)scientific beliefs are formed and updated. This framework, which builds on previous theoretical accounts of the psychology of misinformation (by Philippe Mueller and colleagues and by Ulrich Ecker and colleagues, both in 2022), takes into account the source of a message, its content, its recipients, and the psychological factors that influence how it is processed. It aims to identify the entry points through which disinformation reaches a person’s “psyche”, and can be used to design interventions that either block or encourage the acceptance of information.


“As individuals, we do not process scientific messages as neutral receivers of information, but by weighing them up against our prior beliefs, desired outcomes, emotional ties and socio-cultural and ideological backgrounds. Depending on the configuration of these psychological factors, anti-scientific beliefs can be amplified and become resistant to correction,” explains Tobia Spampatti, first author of the study.


Six preventive strategies put to the test

On this basis, the researchers developed six psychological intervention strategies aimed at preventing climate disinformation from affecting people’s climate-related beliefs and behaviors. These were tested on 6,816 participants in twelve countries. Each strategy targeted a particular theme (scientific consensus, trust in climate scientists, transparent communication, moralizing climate action, accuracy, positive emotions towards climate action). The participants were divided into eight groups: six received one of these strategies, one was exposed to disinformation without any preventive intervention, and one served as a control group.


The “trust in climate scientists” group, for example, received verified information demonstrating the credibility of IPCC scientists. The “transparent communication” group, meanwhile, was presented with information on both the advantages and the disadvantages of climate mitigation actions. Each group was then exposed to twenty pieces of false or biased information, ten on climate science and ten on climate policy. The UNIGE scientists then measured the impact of this disinformation, following the preventive interventions, by asking the participants about their feelings regarding climate mitigation actions.
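
To make the structure of the experiment concrete, here is a minimal, purely illustrative Python sketch of the assignment and exposure procedure. The condition names follow the description above; the outcome measure and the numeric effect sizes are invented placeholders for illustration, not values or code from the study.

```python
import random
from collections import Counter

# Illustrative sketch of the design described above (not the authors' code).
# Eight conditions: six preventive interventions, one disinformation-only
# group, and one control group that sees neither intervention nor disinformation.
CONDITIONS = [
    "scientific_consensus", "trust_in_climate_scientists",
    "transparent_communication", "moralizing_climate_action",
    "accuracy", "positive_emotions",
    "disinformation_only",   # disinformation with no preventive intervention
    "control",               # neither intervention nor disinformation
]
N_PARTICIPANTS = 6816
# Twenty disinformation items: ten on climate science, ten on climate policy.
DISINFO_ITEMS = ([f"science_claim_{i}" for i in range(1, 11)]
                 + [f"policy_claim_{i}" for i in range(1, 11)])


def assign_conditions(n_participants, conditions, seed=0):
    """Randomly assign each participant ID to one of the eight conditions."""
    rng = random.Random(seed)
    return {pid: rng.choice(conditions) for pid in range(n_participants)}


def simulate_session(condition, rng):
    """Return a placeholder outcome score for one participant.

    The score stands in for the belief and behaviour measures used in the
    study; the effect sizes below are arbitrary assumptions for illustration.
    """
    score = rng.gauss(0.0, 1.0)                   # baseline attitude
    if condition not in ("disinformation_only", "control"):
        score += 0.10                             # assumed small protective effect
    if condition != "control":
        score -= 0.05 * len(DISINFO_ITEMS)        # assumed cost of each disinformation item
    return score


if __name__ == "__main__":
    rng = random.Random(1)
    assignment = assign_conditions(N_PARTICIPANTS, CONDITIONS)
    print(Counter(assignment.values()))           # roughly equal group sizes
    outcomes = {c: [] for c in CONDITIONS}
    for pid, condition in assignment.items():
        outcomes[condition].append(simulate_session(condition, rng))
    for condition, scores in outcomes.items():
        print(f"{condition:28s} mean outcome: {sum(scores) / len(scores):+.2f}")
```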


Low preventive effect

“We found that the protective effect of our strategies is small and disappears after the second exposure to disinformation. The climate disinformation used in this study has a negative influence on people’s belief in climate change and on their sustainable behaviour,” says Tobias Brosch, Associate Professor in the CDSB Lab at the Faculty of Psychology and Educational Sciences and at the Swiss Center for Affective Sciences at the UNIGE, and final author of the study. “Disinformation is therefore extremely persuasive, seemingly more so than scientific information. Only the ‘accuracy’ group, who were asked to think in depth about the accuracy of the information they encountered online, showed a slight advantage.”


“Research in this field is still in its infancy. We are therefore going to continue our work and look for more effective forms of intervention. It is becoming increasingly urgent to combat this phenomenon, which is delaying the implementation of certain urgent climate change mitigation measures,” concludes Tobia Spampatti.


Psychological science can help counter spread of misinformation, says APA report


Details systemic and individual strategies

Reports and Proceedings

AMERICAN PSYCHOLOGICAL ASSOCIATION




WASHINGTON – Debunking, “prebunking,” nudging and teaching digital literacy are several of the more effective ways to counter misinformation, according to a new report from the American Psychological Association.

Written by a panel of U.S. and international experts on the psychology of misinformation, the report outlines the processes that make people susceptible to misinformation and offers solutions to combat it.

People are more likely to believe misinformation if it comes from groups they belong to or if they judge the source as credible, according to the report “Using Psychological Science to Understand and Fight Health Misinformation: An APA Consensus Statement.” It defines misinformation as “any information that is demonstrably false or otherwise misleading, regardless of its source or intention.”

The report outlines the key features of misinformation that fool people into believing and spreading it. For instance, it found that people are more likely to believe false statements that appeal to emotions such as fear and outrage. They are also more likely to believe misinformation that paints groups that they view as “others” in a negative light. And people are more likely to believe information the more it is repeated, even when it contradicts their prior knowledge. These findings suggest that it is important to stop misinformation early, the report says.

The report also describes features of social media that help misinformation spread very quickly. “Rapid publication and peer-to-peer sharing allow ordinary users to distribute information quickly to large audiences, so misinformation can be policed only after the fact (if at all),” the report says. “‘Echo chambers’ bind and isolate online communities with similar views, which aids the spread of falsehoods and impedes the spread of factual corrections.”

As a result, “most online misinformation originates from a small minority of ‘superspreaders,’ but social media amplifies their reach and influence.”

There are two levels on which misinformation can be stopped, according to the report: systemic approaches, such as legislation and technology standards, and individual approaches focused on changing individual behaviors. The latter include: 

  • fact-checking, or debunking; 
  • prebunking, or pre-emptive debunking to prevent people from falling for misinformation in the first place; 
  • nudges, such as asking people to consider the accuracy of information before sharing it, or rewarding people for being as accurate as possible (see the sketch after this list); 
  • and formal education or community outreach to raise people’s awareness about healthy online behavior and media use.
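
As a concrete illustration of the “nudge” approach mentioned above, here is a hypothetical Python sketch of an accuracy prompt inserted into a sharing flow. The prompt wording, data structure and rating threshold are assumptions made for illustration; they are not taken from the APA report.

```python
from dataclasses import dataclass


@dataclass
class Post:
    author: str
    text: str


def accuracy_nudge(post: Post, ask=input) -> bool:
    """Ask the user to reflect on a post's accuracy before sharing it.

    Returns True if the post should be shared. The nudge does not block
    sharing; it only makes accuracy salient at the moment of decision.
    """
    answer = ask(f"How accurate is this claim, from 1 (not at all) to 5 (very)?\n> {post.text}\n")
    try:
        rating = int(answer)
    except ValueError:
        rating = 0
    if rating <= 2:
        confirm = ask("You rated this claim as inaccurate. Share it anyway? (y/n) ")
        return confirm.strip().lower() == "y"
    return True


if __name__ == "__main__":
    demo = Post(author="someuser", text="Climate policies will bankrupt every household.")
    print("Post shared." if accuracy_nudge(demo) else "Share cancelled.")
```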

The report acknowledges that there is much more to learn and recommends more research funding and industry cooperation to understand behaviors related to misinformation and create tools to correct it. The panel members who wrote the report spent more than a year reviewing the scientific literature to develop their recommendations. The report was commissioned by the Centers for Disease Control and Prevention and funded as part of a $2 million grant to develop effective solutions to COVID-19 vaccine hesitancy. 

While the panel’s recommendations focus on health misinformation, they can also be used for broader topics such as politics and climate change. For instance, these findings offer direct input to one of the main issues highlighted in APA’s Health Advisory on Social Media by addressing tactics that can be used to combat misinformation.  

The report recommends eight steps for policymakers, scientists, media and the public to help curb the spread of misinformation and the risks it poses to health, well-being and civic life: 

  1. Avoid repeating misinformation without including a correction.
  2. Collaborate with social media companies to understand and reduce the spread of harmful misinformation.
  3. Use misinformation correction strategies with tools already proven to promote healthy behaviors.
  4. Leverage trusted sources to counter misinformation and provide accurate health information.
  5. Debunk misinformation often and repeatedly using evidence-based methods.
  6. Prebunk misinformation to inoculate susceptible audiences by building skills and resilience from an early age.
  7. Demand data access and transparency from social media companies for scientific research on misinformation.
  8. Fund basic and translational research into the psychology of health misinformation, including ways to counter it.

“These psychological science findings help to explain how misinformation enters our thought processes,” the report states. “It is effortful and difficult for our brains to apply existing knowledge when encountering new information; when new claims are false but sufficiently reasonable, we can learn them as facts. Thus, everyone is susceptible to misinformation to some degree: we acquire it even when we know better.”

The American Psychological Association, in Washington, D.C., is the largest scientific and professional organization representing psychology in the United States. APA’s membership includes over 146,000 researchers, educators, clinicians, consultants and students. Through its divisions in 54 subfields of psychology and affiliations with 60 state, territorial and Canadian provincial associations, APA works to advance the creation, communication and application of psychological knowledge to benefit society and improve people’s lives.

