
Tuesday, July 18, 2023

Positive contact with diverse groups can reduce belief in conspiracy theories about them


Peer-Reviewed Publication

UNIVERSITY OF NOTTINGHAM



New research has shown that having positive contact with people from diverse groups can reduce the development of harmful intergroup conspiracy beliefs.

Experts from the University of Nottingham’s School of Psychology, in collaboration with the University of East Anglia, found that among British participants, positive intergroup contact interfered with the development of conspiracy theories about other groups. The findings have been published today in the European Journal of Social Psychology.

Social psychologist Dr Daniel Jolley from the University of Nottingham led the research and explains: “Intergroup conspiracy theories are common and potentially can lead to everything from misinformed voting to extreme expressions of prejudice. Seeking ways to reduce conspiracy theories is of particular importance.”

Three studies were conducted with more than 1,000 people, in which the team explored whether positive intergroup contact interferes with the development of conspiracy theories about other social groups.

The first two studies explored relationships, where British participants were asked about their experience of contact with immigrants (Study 1) or Jewish people (Study 2) and their belief in conspiracy theories in relation to them. In the third study, participants were asked to think about a positive contact experience with a Jewish person and then report their conspiracy beliefs held about this group. Participants also reported their feelings (prejudice) towards the target group in each study.

The research demonstrated that those people who had experienced higher quality positive contact with Jewish people or immigrants or imagined a positive contact experience were less likely to believe conspiracy theories about them. Importantly, these effects remained even when accounting for (negative) feelings towards the target group, demonstrating that the effect is not merely another prejudice reduction effect.

Dr Jolley explains: “The research findings offer a promising potential starting point for developing tools to bring diverse groups of people together who may not usually have contact and try to foster positive conversations to help reduce potentially harmful conspiracy theories from taking hold.

“Whilst the problems are often very complex, and positive contact will not solve all the issues surrounding conspiracy theories towards certain groups, the fact that this work offers a potential tool to reduce intergroup conspiracy theories is a notable breakthrough. Our work offers a framework that, along with future research, might lead to the reduction of conspiracy beliefs in the general population”.

Friday, October 09, 2020

A psychologist explains why people cling to conspiracy theories during uncertain times


John M. Grohol, Psych Central
Conspiracy theories blaming 5G for the coronavirus have exploded online. Cindy Ord/Getty Images

Conspiracy theories frequently surface after traumatic events and during times of uncertainty, such as in the aftermath of mass shootings or during the COVID-19 pandemic.

Far-fetched theories can spread rapidly online and via social media. Recently, a study indicated that nearly one third of Americans believe in a conspiracy theory about the coronavirus, such as one that claims the outbreak is linked to 5G wireless technology.

Dr. John Grohol, a psychologist and the founder of Psych Central, says that conspiracy theorists come up with ideas out of thin air to match whatever 'fact' they think is true, and often use paranoia-based beliefs to convince others.

He says that these people tend to be uncooperative, distrustful, and socially isolated — which is why believing in a conspiracy theory with strangers on the internet can give them a sense of belonging.

Conspiracy theories are as old as time, but it's only in recent years that psychologists have begun to unravel why some people believe in them. According to researcher Goertzel (1994), conspiracy theories are explanations that refer to hidden groups working in secret to achieve sinister objectives.


Whether it's the killing of a US president (Kennedy), a mass shooting by a seemingly normal older white male (Las Vegas), or the Charlie Hebdo murders, conspiracy theories are never far behind. Even climate change has a conspiracy theory attached to it (the US government is to blame, naturally).

What drives people's belief in these "out there" explanations for significant events? Let's find out.

There is a conspiracy theory that there were two shooters at the Las Vegas massacre, the deadliest mass shooting in modern US history. The theory — believed by tens of thousands of people around the world — rests on the "evidence" of two grainy, hard-to-hear videos from eyewitnesses. These videos suggest that somehow a second shooter was able to shoot from the 4th floor of the Mandalay Bay hotel — despite the fact that there were no broken windows on the 4th floor, and police searching the building floor-by-floor heard no such shots.[1]

What is the purpose of the second shooter? To prove that the official narrative is false — the second shooter supposedly points to some "new world order" plot intent on taking over our government and society. Or something like that. Accepting the rationale for a second shooter requires suspending both your grip on reality and simple critical thinking.
The psychology behind conspiracy theories

Researchers have been hard at work examining why a small minority of the population believe in, and even thrive on, conspiracy theories.

Lantian et al. (2017) summarize the characteristics associated with a person who is likely to believe in conspiracy theories:

… personality traits such as openness to experience, distrust, low agreeability, and Machiavellianism are associated with conspiracy belief.

"Low agreeability" refers to a trait of "agreeableness," which psychologists define as how much an individual is dependable, kind, and cooperative. Someone with low agreeability is an individual who is usually not very dependable, kind, or cooperative. Machiavellianism refers to a personality trait where a person is so "focused on their own interests they will manipulate, deceive, and exploit others to achieve their goals."

Lantian et al. (2017) continue:

In terms of cognitive processes, people with stronger conspiracy beliefs are more likely to overestimate the likelihood of co-occurring events, to attribute intentionality where it is unlikely to exist, and to have lower levels of analytic thinking.

None of this should be surprising, because once you start to analyze a situation with demonstrable facts, it usually — and quite thoroughly — will break down the conspiracy theory into its component parts, none of which make sense standing on their own. For example, with zero evidence, conspiracy theorists need to invent a reason for a second shooter in Las Vegas, to match what they see as "facts." But once a person starts inventing a narrative out of thin air, you can see very little critical thinking occurring.

Conspiracy theories make a person feel special

Lantian et al.'s (2017) research examined the relationship between a person's 'need for uniqueness' and belief in conspiracy theories, and found a correlation.

We argue that people high in need for uniqueness should be more likely than others to endorse conspiracy beliefs because conspiracy theories represent the possession of unconventional and potentially scarce information. […] Moreover, conspiracy theories rely on narratives that refer to secret knowledge (Mason, 2002) or information, which, by definition, is not accessible to everyone, otherwise it would not be a secret and it would be a well-known fact.

People who believe in conspiracy theories can feel "special," in a positive sense, because they may feel that they are more informed than others about important social and political events. […]

Our findings can also be connected to recent research demonstrating that individual narcissism, or a grandiose idea of the self, is positively related to belief in conspiracy theories. Interestingly, Cichocka et al. (2016) found that paranoid thought mediates the relationship between individual narcissism and conspiracy beliefs.

The current work suggests, however, that need for uniqueness could be an additional mediator of this relationship. Indeed, previous work has shown that narcissism is positively correlated with need for uniqueness (Emmons, 1984) and here we showed that need for uniqueness is related to conspiracy belief.
People who believe in conspiracy theories are likely more alienated and socially isolated

Moulding et al. (2016) also dug into the characteristics of people who believe in conspiracy theories in two studies.

It has been noted that individuals who endorse conspiracy theories are likely to be higher in powerlessness, social isolation, and 'anomia,' which is broadly defined as a subjective disengagement from social norms.

Such disengagement from the normative social order may result in greater conspiratorial thinking for a number of related reasons. First, individuals who feel alienated may consequently reject conventional explanations of events, as they reject the legitimacy of the source of these explanations. Due to these individuals feeling alienated from their peers, they may also turn to conspiracist groups for a sense of belonging and community, or to marginalized subcultures in which conspiracy theories are potentially more rife.

People who feel powerless may also endorse conspiracy theories as they also help the individual avoid blame for their predicament. In this sense, conspiracy theories give a sense of meaning, security and control over an unpredictable and dangerous world. Finally, and most simply, conspiracy beliefs — which imply a level of Machiavellianism and power enacted by those without fixed morality — are most likely to resonate with people who feel powerless and believe that society lacks norms.

The Internet has amplified the abilities of these like-minded people to come together to share and expand on their conspiracy theories. It took only hours after the Las Vegas massacre for a conspiracy Facebook group to appear with more than 5,000 members.

In their study, Moulding et al. (2016) found that, consistent with their hypotheses, "endorsement of conspiracy theories related moderately-to-strongly with the alienation-related variables — isolation, powerlessness, normlessness, and disengagement from social norms."

Researcher van Prooijen (2016) also found that self-esteem instability, which results in self-uncertainty, is associated with a greater likelihood of believing in conspiracy theories. People who don't feel they belong to any one group — lacking what psychologists refer to as 'belongingness' — are more likely to believe in conspiracy theories.

Conspiracy theories are driven by people, not facts

You can't really argue with people who believe in conspiracy theories, because their beliefs aren't rational. Instead, they are often fear- or paranoia-based beliefs, and when confronted with contrary factual evidence, believers will dismiss both the evidence and the messenger who brings it.[2] That's because conspiracy theories are driven by the people who believe and spread them and their own psychological makeup — not by the factual support or logical reasoning of the theory itself.

Conspiracy theories aren't going away; as long as there are people who have a need to believe in them, they will continue to expand and thrive. The Internet and social media sites such as Facebook have only made such theories even easier to spread. Save your breath arguing with people who believe in them, as no amount of facts will dissuade them from their false beliefs.

Footnotes:
1. The conspiracy theorists apparently don't realize that Mandalay Bay's windows do not open, as is the case in most Vegas hotels. If there was no broken window, there was no way a person could shoot from the 4th floor. The theory also requires that independent police departments, as well as individual officers and first responders, suddenly become part of the whole government conspiracy.
2. "Fake news," they'll say, as though that were a rational, mature, and cohesive argument in reply.


John M. Grohol, PsyD, is the founder and editor-in-chief of PsychCentral.com, a mental health and human behavior technology expert, coauthor of Self-Help That Works, and a published researcher. He sits on the scientific board of the journal Computers in Human Behavior, and is a founding board member of the Society for Participatory Medicine.

This piece was originally published on Business Insider on April 27, 2020, and is republished with permission from PsychCentral.com. All rights reserved. Read the original article on Psych Central. Copyright 2020.



Thursday, December 19, 2019


What conspiracy theories have in common with fiction – and why it makes them compelling stories
Storytelling: the power of a good narrative can be persuasive. Credit: Billion Photos/Shutterstock
In an era dominated by "fake news" and disinformation, conspiracy theories are coming to play an increasingly influential role in modern politics. During the recent impeachment hearings in the US, for example, former National Security Council official Fiona Hill warned that "fictional narratives" pushed by Russia were undermining American security.

But what's the difference exactly between a conspiracy theory and a legitimate news story? Does "fictional" in this sense simply mean fabricated? My ongoing research suggests there is more to it than this—something which can explain why conspiracy theories can gain such a powerful hold over the public imagination.
The narrative that Hill was referring to in her impeachment testimony is what's known as "Crowdstrike", a conspiracy theory named after the US cybersecurity company CrowdStrike. It alleges that it was Ukraine rather than Russia that hacked the Democratic National Committee's email server in 2016, and that Ukraine, along with the Democrats, subsequently went about framing Russia for interfering in the election.
A day after Hill's testimony the US president, Donald Trump, again trotted out precisely these same allegations in an interview with the TV show Fox & Friends. In doing so he made a string of assertions which are provably false. Reports from both the US intelligence community and special counsel Robert Mueller have, after all, concluded that it was Russia who actively interfered in the 2016 election, while there's no evidence of Ukraine having any part in it.
As Hill noted, the whole Crowdstrike theory seems to be a clear "effort to legitimise an alternative narrative that the Ukrainian government is a US adversary, and that Ukraine—not Russia—attacked us in 2016".
Powerful forms of narrative
Conspiracy theories are used in disinformation campaigns in two main ways. On the one hand, the simple act of citing them can be a way of delegitimising views you don't like. For instance, the British journalist Carole Cadwalladr's investigations into various shady tactics used by the Leave campaign in the 2016 EU referendum are regularly dismissed as nothing more than conspiracy theories by her enemies.
But conspiracy theories are also used as counter-narratives to confuse the actual nature of events and, in doing so, push a particular ideological view of the world.
It's worth noting that all explanations operate as a type of narrative. A basic dramatic narrative has three steps to it: (1) a person embarks upon a (2) journey into a hostile environment which (3) ultimately leads to self-knowledge.
This same basic structure applies to explanations: (1) you want to discover some information; (2) you find a way of discovering it; and (3) your world is changed as a result.
But, as recent research I've been doing shows, there are several ways in which conspiracy theories draw directly on elements of storytelling that are found in fiction rather than factual narratives.
As in fictional narratives, all the elements in a conspiracy theory are linked through clear lines of cause and effect. There's a reason for everything and, if that reason isn't immediately forthcoming, it's because it's being purposefully hidden as part of the conspiracy. This differs from real life of course, where events often include large amounts of happenstance, inexplicable phenomena and a general murkiness and confusion.
Same story
Then there's the way that conspiracy theories are all underpinned by the same basic archetype: what the writer Christopher Booker calls the "overcoming the monster" story. In this, a single or a small group of rebels take on the overwhelming forces of a corrupt and malevolent establishment which is threatening the wellbeing of society.
Crowdstrike slots snugly into this formula. Corrupt forces within the political establishment (in this case the Democratic Party) are presented as betraying the will of the people—represented by the election of Trump in 2016. The ongoing impeachment process against the president therefore threatens the welfare of the US as an independent democratic nation. As the political theorist Jan-Werner Muller has noted, this type of narrative is structurally embedded in the logic of all populist movements in the way their leaders regularly argue that the will of the people can only be denied through underhand and corrupt ways.
Conspiracy theories always fixate on a very simple story which acts as a fable for their overarching worldview. They usually take an issue of real significance—such as foreign influence in domestic elections—but, in order to explain it, they latch on to one succinct story which bypasses the complexities and messiness of real-life phenomena and instead satisfies the logic of their overarching ideological narrative.
For Trump's supporters, the Crowdstrike story feels true because it's another example of the establishment's great witch hunt against him. As a story, it also has a coherent logic which the expanse and messiness of the facts lack. So, in both these ways, our familiarity with the way the world is mediated via fiction helps cast doubt on the way the world actually is.

Study shows there's nothing wacky about conspiracy theorists


Credit: Pixabay
Researchers at The Australian National University (ANU) have delved into the world of online conspiracy theories, showing most of the people behind them are actually pretty ordinary.

The study looked at eight years of content, sifting through more than two billion comments posted on Reddit, including everything posted to the subreddit r/conspiracy.
R/conspiracy covers everything from UFOs and 9/11, to political conspiracy theories like 'pizzagate', which took off during the 2016 US election campaign.
But despite the subject matter, lead author of the study Dr. Colin Klein says conspiracy theorists aren't always a bunch of "crackpots wearing tinfoil hats".
"In the past, before the rise of online forums like Reddit, we tended to only hear about the most extreme cases, and those people tended to naturally be wary about talking to someone else about their beliefs," Dr. Klein said.
"These massive online forums paint a very different picture.
"The enormous set of comments we examined show many r/conspiracy users actually have more 'sensible' interests.
"For example conspiracy theories about police abuse of power are common. That's not so crazy.
"These people might believe false things, but with good reason—because similar things have happened in the past."
Dr. Klein and his team also found that while there are subtle differences in the language used by those who end up posting on r/conspiracy, it's not necessarily enough to set them apart from other Reddit users.
"You might find they talk more about power or power structures, but their language is not that different from what ordinarily goes on in a subreddit like r/politics. You can't distinguish them that way.
"It's very easy to look at conspiracy theories and think they're super wacky, and the people who believe in them are crazy, but it's actually much more continuous with a lot of things we do every day.
"Low-level theorising goes on a lot in everyday life; I'm inclined to think the stuff you see online is just a strong outgrowth of that."
According to Dr. Klein, forums like r/conspiracy can also be driven by current events.
"For example, Reddit attracted a whole new set of users following the election of US President Donald Trump.
"He also generates quite a lot of in-fighting amongst users. This is what makes it such a great way to study social dynamics."
The data also reveals how people come to start posting on the r/conspiracy forum. The rise of Internet echo chambers is a factor—but there's much more at play.
"We followed people who started using Reddit and posted for about six months before they ended up on r/conspiracy," Dr. Klein said.
"You find two people who, for example, both started on the popular 'ask me anything' subreddit, and one ends up talking about conspiracies and one doesn't.
"People who go on to post on r/conspiracy also tend to be over-represented in the political forums, but it's not like they're hyper-focused.
"This suggests a more active process where people are seeking out sympathetic communities. This process of finding like-minded people is something we see a lot of on the Internet."

The research has been published in the journal PLOS ONE.

Conspiracy theorists actively seek out their online communities


Don't (just) blame echo chambers. Conspiracy theorists actively seek out their online communities
The term illuminati has been used since the late 15th century, and applied to various groups since then. It's often discussed by conspiracy theorists, and is heavily referenced in pop culture. Credit: Lettuce./Flickr, CC BY-NC-ND
Why do people believe conspiracy theories? Is it because of who they are, what they've encountered, or a combination of both?

The answer is important. Belief in conspiracy theories helps fuel climate change denial, anti-vaccination stances, racism, and distrust of the media and science.
In a paper published today, we shed light on the online world of conspiracy theorists, by studying a large set of user comments.
Our key findings are that people who eventually engage with conspiracy forums differ from those who don't in both where and what they post. The patterns of difference suggest they actively seek out sympathetic communities, rather than passively stumbling into problematic beliefs.
We looked at eight years of comments posted on the popular website Reddit, a platform hosting millions of individual forums called subreddits.
Our aim was to find out the main differences between users who post in r/conspiracy (a subreddit dedicated to conspiracy theories) and other Reddit users.
Using a technique called sentiment analysis, we examined what users said, and where they said it, during the months before their first post in r/conspiracy.
We compared these posts to those of other users who started posting on Reddit at the same time, and in the same subreddits, but without going on to post in r/conspiracy.
We then constructed a network of the subreddits through which r/conspiracy posters traveled. In doing so, we were able to discover how and why they reached their destination.
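The pipeline described above — profiling where users post and scoring what they say — can be sketched in miniature. Everything below is invented for illustration: the lexicon, the comments, and the two users are toy data, and the study itself used large-scale Reddit data and far more sophisticated sentiment tools.

```python
from collections import Counter

# A tiny hand-made sentiment lexicon (a real analysis would use a
# validated lexicon or a trained model, not five words).
LEXICON = {"great": 1, "good": 1, "bad": -1, "corrupt": -1, "secret": -1}

def sentiment(comment: str) -> int:
    """Sum lexicon scores over the words of a comment."""
    return sum(LEXICON.get(w, 0) for w in comment.lower().split())

def subreddit_profile(posts):
    """Count where a user posts: a crude 'pathway' profile."""
    return Counter(sub for sub, _ in posts)

# Hypothetical posting histories: (subreddit, comment text) pairs for a
# future r/conspiracy poster and a matched user who never posts there.
user_a = [("r/politics", "corrupt secret deals everywhere"),
          ("r/politics", "bad faith arguments"),
          ("r/conspiracy", "the secret truth")]
user_b = [("r/aww", "great photo"),
          ("r/politics", "good debate")]

print(subreddit_profile(user_a))  # where user A concentrates their posts
print(sentiment(user_a[0][1]))    # tone score for one of A's comments
```

Comparing the two profiles and score distributions is the matched-pairs idea in the study: same starting subreddits and time window, different destinations.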
Seeking the like-minded
Our research suggests there is evidence for the "self-selection" of conspiracy theorists. This means users appear to be seeking communities of people who share their views.
Users followed clear pathways to eventually reach r/conspiracy.
For example, these users were over-represented in subreddits focused on politics, drugs and internet culture, and engaged with such topics more often than their matched pairs.
We were also surprised by the diversity of pathways taken to get to r/conspiracy. The users were not as concentrated on one side of the political spectrum as people might expect. Nor did we find more anxiety in their posts, compared with other users.
Our previous research also indicated online conspiracy theorists are more diverse and ordinary than most people assume.
Where do the beliefs come from?
To dig deeper, we examined the interactions between where and what r/conspiracy users posted.
In political subreddits, the language used by them and their matched pairs was quite similar. However, in Reddit's very popular general-purpose subreddits, the linguistic differences between the two groups were striking.
So far, psychologists, sociologists, and philosophers have struggled to find anything distinct about conspiracy believers or their environments.
Social media can play a role in spreading conspiracy theories, but it mostly entrenches beliefs among those who already have them. Thus it can be challenging to measure and understand how conspiracy beliefs arise.
Traditional survey and interview approaches don't always give reliable responses. This is because conspiracy theorists often frame their life in narratives of conversation and awakening, which can obscure the more complex origins of their beliefs.
Furthermore, as philosopher David Coady pointed out, some conspiracy theories turn out to be true. Insiders do sometimes uncover evidence of malfeasance and cover-ups, as recent debates over the need for whistleblower protections in Australia reflect.
Echo chambers worsen the problem
Research about online radicalization from philosophy has focused on the passive effects of technologies such as recommendation algorithms and their role in creating online echo chambers.
Our research instead suggests individuals seem to have a more active role in finding like-minded communities, before their interactions in such communities reinforce their beliefs.
These "person-situation interactions" are clearly important and under-theorized.
As the psychologist David C. Funder puts it: "Individuals do not just passively find themselves in the situations of their lives; they often actively seek and choose them. Thus, while a certain kind of bar may tend to generate a situation that creates fights around closing time, only a certain kind of person will choose to go to that kind of bar in the first place."
We suspect a similar process leads users to conspiracy forums.
A complex web of interactions
Our data indicates that conspiracy beliefs, like most beliefs, are not adopted in a vacuum. They are actively mulled over, discussed, and sought out by agents in a social (and increasingly online) world.
And when forums like 8chan and Stormfront are pushed offline, users often look for other ways to communicate.
These complex interactions are growing in number, and technology can amplify their effects.
YouTube radicalization, for example, is likely driven by interactions between algorithms and self-selected communities.
When it comes to conspiracy beliefs, more work needs to be done to understand the interplay between a person's social environment and their information seeking behavior.
And this becomes even more pressing as we learn more about the risks that come with conspiracy theorizing.

More information: Colin Klein et al. Pathways to conspiracy: The social and linguistic precursors of involvement in Reddit's conspiracy theory forum, PLOS ONE (2019). DOI: 10.1371/journal.pone.0225098

Thursday, September 12, 2024

 

An exit for even the deepest rabbit holes: Personalized conversations with chatbot reduce belief in conspiracy theories



Summary author: Walter Beckwith


American Association for the Advancement of Science (AAAS)





Personalized conversations with a trained artificial intelligence (AI) chatbot can reduce belief in conspiracy theories – even in the most obdurate individuals – according to a new study. The findings, which challenge the idea that such beliefs are impervious to change, point to a new tool for combating misinformation. “It has become almost a truism that people ‘down the rabbit hole’ of conspiracy belief are almost impossible to reach,” write the authors. “In contrast to this pessimistic view, we [show] that a relatively brief conversation with a generative AI model can produce a large and lasting decrease in conspiracy beliefs, even among people whose beliefs are deeply entrenched.” Conspiracy theories – beliefs that some secret but influential malevolent organization is responsible for an event or phenomenon – are notoriously persistent and pose a serious threat to democratic societies. Yet despite their implausibility, a large fraction of the global population has come to believe in them, including as much as 50% of the United States population by some estimates. The persistent belief in conspiracy theories despite clear counterevidence is often explained by social-psychological processes that fulfill psychological needs and by the motivation to maintain identity and group memberships.  Current interventions to debunk conspiracies among existing believers are largely ineffective.

 

Thomas Costello and colleagues investigated whether Large Language Models (LLMs) like GPT-4 Turbo can effectively debunk conspiracy theories by using their vast information access and by using tailored counterarguments that respond directly to specific evidence presented by believers. In a series of experiments encompassing 2,190 conspiracy believers, participants engaged in several personalized interactions with an LLM, sharing their conspiratorial beliefs and the evidence they felt supported them. In turn, the LLM responded by directly refuting these claims through tailored, factual and evidence-based counterarguments. A professional fact-checker hired to evaluate the accuracy of the claims made by GPT-4 Turbo reported that, of these claims, 99.2% were rated as “true,” 0.8% as “misleading,” and 0 as “false”; and none were found to contain liberal or conservative bias. Costello et al. found that these AI-driven dialogues reduced participants’ misinformed beliefs by an average of 20%. This effect lasted for at least 2 months and was observed across various unrelated conspiracy theories, as well as across demographic categories. According to the authors, the findings challenge the idea that evidence and arguments are ineffective once someone has adopted a conspiracy theory. They also question social-psychological theories that focus on psychological needs and motivations as the main drivers of conspiracy beliefs. “For better or worse, AI is set to profoundly change our culture,” write Bence Bago and Jean-François Bonnefon in a related Perspective. “Although widely criticized as a force multiplier for misinformation, the study by Costello et al. demonstrates a potential positive application of generative AI’s persuasive power.”

 

A version of the chatbot referenced in this paper can be visited at https://www.debunkbot.com/conspiracies.

 


Can AI talk us out of conspiracy theories?


New MIT Sloan research shows that conversations with large language models can successfully reduce belief in conspiracy theories



Peer-Reviewed Publication

MIT Sloan School of Management




Have you ever tried to convince a conspiracy theorist that the moon landing wasn’t staged? You likely didn’t succeed, but ChatGPT might have better luck, according to research by MIT Sloan School of Management professor David Rand and American University professor of psychology Thomas Costello, who conducted the research during his postdoctoral position at MIT Sloan.

In a new paper, “Durably reducing conspiracy beliefs through dialogues with AI,” published in Science, the researchers show that large language models can effectively reduce individuals’ beliefs in conspiracy theories — and that these reductions last for at least 2 months — a finding that offers new insights into the psychological mechanisms behind the phenomenon, as well as potential tools to fight the spread of conspiracies.

Going down the rabbit hole

Conspiracy theories — beliefs that certain events are the result of secret plots by influential actors — have long been a subject of fascination and concern. Their persistence in the face of counter-evidence has led to the conclusion that they fulfill deep-seated psychological needs, rendering them impervious to facts and logic. According to this conventional wisdom, once someone “falls down the rabbit hole,” it’s virtually impossible to pull them back out.

But for Rand, Costello, and their co-author professor Gordon Pennycook from Cornell University, who have conducted extensive research on the spread and uptake of misinformation, that conclusion didn’t ring true. Instead, they suspected a simpler explanation was at play.

“We wondered if it was possible that people simply hadn’t been exposed to compelling evidence disproving their theories,” Rand explained. “Conspiracy theories come in many varieties — the specifics of the theory and the arguments used to support it differ from believer to believer. So if you are trying to disprove the conspiracy but haven’t heard these particular arguments, you won’t be prepared to rebut them.”

Effectively debunking conspiracy theories, in other words, would require two things: personalized arguments and access to vast quantities of information — both now readily available through generative AI.

Conspiracy conversations with GPT-4 Turbo

To test their theory, Costello, Pennycook, and Rand harnessed the power of GPT-4 Turbo, OpenAI’s most advanced large language model, to engage over 2,000 conspiracy believers in personalized, evidence-based dialogues.

The study employed a unique methodology that allowed for deep engagement with participants' individual beliefs. Participants were first asked to identify and describe a conspiracy theory they believed in using their own words, along with the evidence supporting their belief.

GPT-4 Turbo then used this information to generate a personalized summary of the participant's belief and initiate a dialogue. The AI was instructed to persuade users that their beliefs were untrue, adapting its strategy based on each participant’s unique arguments and evidence.

These conversations, lasting an average of 8.4 minutes, allowed the AI to directly address and refute the specific evidence supporting each individual’s conspiratorial beliefs, an approach that was impossible to test at scale prior to the technology’s development.

A significant — and durable — effect

The results of the intervention were striking. On average, the AI conversations reduced participants’ belief in their chosen conspiracy theory by about 20%, and about 1 in 4 participants, all of whom believed the conspiracy beforehand, disavowed it after the conversation. This impact proved durable, with the effect remaining undiminished even two months post-conversation.

The AI conversation’s effectiveness was not limited to specific types of conspiracy theories. It successfully challenged beliefs across a wide spectrum, including conspiracies that potentially hold strong political and social salience, like those involving COVID-19 and fraud during the 2020 U.S. presidential election.

While the intervention was less successful among participants who reported that the conspiracy was central to their worldview, it did still have an impact, with little variance across demographic groups.

Notably, the impact of the AI dialogues extended beyond mere changes in belief. Participants also demonstrated shifts in their behavioral intentions related to conspiracy theories. They reported being more likely to unfollow people espousing conspiracy theories online, and more willing to engage in conversations challenging those conspiratorial beliefs.

The opportunities and dangers of AI

Costello, Pennycook, and Rand are careful to stress the need for continued responsible AI deployment, since the technology could be used to convince users to believe in conspiracies as well as to abandon them.

Nevertheless, the potential for positive applications of AI to reduce belief in conspiracies is significant. For example, AI tools could be integrated into search engines to offer accurate information to users searching for conspiracy-related terms.

“This research indicates that evidence matters much more than we thought it did — so long as it is actually related to people’s beliefs,” Pennycook said. “This has implications far beyond just conspiracy theories: Any number of beliefs based on poor evidence could, in theory, be undermined using this approach.”

Beyond the specific findings of the study, its methodology also highlights the ways in which large language models could revolutionize social science research, said Costello, who noted that the researchers used GPT-4 Turbo to not only conduct conversations but also to screen respondents and analyze data.

“Psychology research used to depend on graduate students interviewing or conducting interventions on other students, which was inherently limiting,” Costello said. “Then, we moved to online survey and interview platforms that gave us scale but took away the nuance. Using artificial intelligence allows us to have both.”

These findings fundamentally challenge the notion that conspiracy believers are beyond the reach of reason. Instead, they suggest that many are open to changing their views when presented with compelling and personalized counter-evidence.

“Before we had access to AI, conspiracy research was largely observational and correlational, which led to theories about conspiracies filling psychological needs,” said Costello. “Our explanation is more mundane — much of the time, people just didn’t have the right information.”