Thursday, December 19, 2019


What conspiracy theories have in common with fiction – and why it makes them compelling stories
Storytelling: the power of a good narrative can be persuasive. Credit: Billion Photos/Shutterstock
In an era dominated by "fake news" and disinformation, conspiracy theories are coming to play an increasingly influential role in modern politics. During the recent impeachment hearings in the US, for example, former National Security Council official Fiona Hill warned that "fictional narratives" pushed by Russia were undermining American security.

But what's the difference exactly between a conspiracy theory and a legitimate news story? Does "fictional" in this sense simply mean fabricated? My ongoing research suggests there is more to it than this—something which can explain why conspiracy theories can gain such a powerful hold over the public imagination.
The narrative that Hill was referring to in her impeachment testimony is what's known as "Crowdstrike", a conspiracy theory named after a US cybersecurity company, which alleges that it was Ukraine rather than Russia that hacked the Democratic National Committee's email server in 2016, and that Ukraine, along with the Democrats, subsequently went about framing Russia for interfering in the election.
A day after Hill's testimony, the US president, Donald Trump, again trotted out precisely these same allegations in an interview with the TV show Fox & Friends. In doing so he made a string of assertions that are provably false. Reports from both the US intelligence community and special counsel Robert Mueller have, after all, concluded that it was Russia that actively interfered in the 2016 election, while there is no evidence that Ukraine played any part in it.
As Hill noted, the whole Crowdstrike theory seems to be a clear "effort to legitimise an alternative narrative that the Ukrainian government is a US adversary, and that Ukraine—not Russia—attacked us in 2016".
Powerful forms of narrative
Conspiracy theories are used in disinformation campaigns in two main ways. On the one hand, simply labelling something a conspiracy theory can be a way of delegitimising views you don't like. For instance, the British journalist Carole Cadwalladr's investigations into various shady tactics used by the Leave campaign in the 2016 EU referendum are regularly dismissed as nothing more than conspiracy theories by her enemies.
But conspiracy theories are also used as counter-narratives to confuse the actual nature of events and, in doing so, push a particular ideological view of the world.
It's worth noting that all explanations operate as a type of narrative. A basic dramatic narrative has three steps to it: (1) a person embarks upon a (2) journey into a hostile environment which (3) ultimately leads to self-knowledge.
This same basic structure applies to explanations: (1) you want to discover some information; (2) you find a way of discovering it; and (3) your world is changed as a result.
But, as recent research I've been doing shows, there are several ways in which conspiracy theories draw directly on elements of storytelling that are found in fiction rather than factual narratives.
As in fictional narratives, all the elements in a conspiracy theory are linked through clear lines of cause and effect. There's a reason for everything and, if that reason isn't immediately forthcoming, it's because it's being purposefully hidden as part of the conspiracy. This differs from real life of course, where events often include large amounts of happenstance, inexplicable phenomena and a general murkiness and confusion.
Same story
Then there's the way that conspiracy theories are all underpinned by the same basic archetype: what the writer Christopher Booker calls the "overcoming the monster" story. In this, a lone rebel or a small group of rebels takes on the overwhelming forces of a corrupt and malevolent establishment which is threatening the wellbeing of society.
Crowdstrike slots snugly into this formula. Corrupt forces within the political establishment (in this case the Democratic Party) are presented as betraying the will of the people—represented by the election of Trump in 2016. The ongoing impeachment process against the president therefore threatens the welfare of the US as an independent democratic nation. As the political theorist Jan-Werner Muller has noted, this type of narrative is structurally embedded in the logic of all populist movements in the way their leaders regularly argue that the will of the people can only be denied by underhand and corrupt means.
Conspiracy theories always fixate on a very simple story which acts as a fable for their overarching worldview. They usually take an issue of real significance—such as foreign influence in domestic elections—but, in order to explain it, they latch on to one succinct story which bypasses the complexities and messiness of real-life phenomena and instead satisfies the logic of their overarching ideological narrative.
For Trump's supporters, the Crowdstrike story feels true because it's another example of the establishment's great witch hunt against him. As a story, it also has a coherent logic which the expanse and messiness of the facts lack. So, in both these ways, our familiarity with the way the world is mediated via fiction helps cast doubt on the way the world actually is.

Study shows there's nothing wacky about conspiracy theorists
Credit: Pixabay
Researchers at The Australian National University (ANU) have delved into the world of online conspiracy theories, showing most of the people behind them are actually pretty ordinary.

The study looked at eight years of content, sifting through more than two billion comments posted on Reddit, including everything posted to the subreddit r/conspiracy.
The r/conspiracy forum covers everything from UFOs and 9/11 to political conspiracy theories like 'pizzagate', which took off during the 2016 US election campaign.
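For a sense of what sifting a corpus of that size involves, here is a minimal Python sketch. It assumes the comments are stored as newline-delimited JSON with "subreddit" and "body" fields, in the style of the public Pushshift archives; the file name and field names are assumptions for illustration, not details taken from the study.

```python
import json

# A minimal sketch, assuming a newline-delimited JSON dump of Reddit
# comments with "subreddit" and "body" fields (as in Pushshift-style
# archives). This is illustrative only, not the study's actual pipeline.
def iter_subreddit_comments(path, subreddit="conspiracy"):
    """Stream comment bodies posted to one subreddit from a large dump."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            comment = json.loads(line)
            if comment.get("subreddit", "").lower() == subreddit:
                yield comment.get("body", "")

# Hypothetical usage, with an assumed file name:
# for body in iter_subreddit_comments("RC_2016-11.ndjson"):
#     ...
```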
But despite the subject matter, lead author of the study Dr. Colin Klein says conspiracy theorists aren't always a bunch of "crackpots wearing tinfoil hats".
"In the past before the rise of online forums like Reddit, we tended to only hear about the most , and those people tended to naturally be wary about talking to someone else about their beliefs," Dr. Klein said.
"These massive online forums paint a very different picture.
"The enormous set of comments we examined show many r/conspiracy users actually have more 'sensible' interests.
"For example conspiracy theories about police abuse of power are common. That's not so crazy.
"These people might believe false things, but with good reason—because similar things have happened in the past."
Professor Klein and his team also found that while there are subtle differences in the language used by those who end up posting on r/conspiracy, it's not necessarily enough to set them apart from other Reddit users.
"You might find they talk more about power or power structures, but their language is not that different from what ordinarily goes on in a  like r/politics. You can't distinguish them that way.
"It's very easy to look at conspiracy theories and think they're super wacky, and the people who believe in them are crazy, but it's actually much more continuous with a lot of things we do every day.
"Low level theorising goes on a lot in , I'm inclined to think the stuff you see online is just a strong outgrowth of that."
According to Dr. Klein, forums like r/conspiracy can also be driven by current events.
"For example, Reddit attracted a whole new set of users following the election of US President Donald Trump.
"He also generates quite a lot of in-fighting amongst users. This is what makes it such great way to study social dynamics."
The data also reveals how people come to start posting on the r/conspiracy forum. The rise of Internet echo chambers is a factor—but there's much more at play.
"We followed people who started using Reddit and posted for about six months before they ended up on r/conspiracy," Dr. Klein said.
"You find two people who, for example, both started on the popular 'ask me anything' Reddit, and one ends up talking about conspiracies and one doesn't.
"People who go on to post on r/conspiracy also tend to be over-represented in the political forums, but it's not like they're hyper-focused.
"This suggests a more active process where people are seeking out sympathetic communities. This process of finding like-minded people is something we see a lot of on the Internet."

The research has been published in the journal PLOS ONE.

Don't (just) blame echo chambers. Conspiracy theorists actively seek out their online communities
The term illuminati has been used since the late 15th century, and applied to various groups since then. It's often discussed by conspiracy theorists, and is heavily referenced in pop culture. Credit: Lettuce./Flickr, CC BY-NC-ND
Why do people believe conspiracy theories? Is it because of who they are, what they've encountered, or a combination of both?

The answer is important. Belief in conspiracy theories helps fuel climate change denial, anti-vaccination stances, racism, and distrust of the media and science.
In a paper published today, we shed light on the online world of conspiracy theorists, by studying a large set of user comments.
Our key findings are that people who eventually engage with conspiracy forums differ from those who don't in both where and what they post. The patterns of difference suggest they actively seek out sympathetic communities, rather than passively stumbling into problematic beliefs.
We looked at eight years of comments posted on the popular website Reddit, a platform hosting millions of individual forums called subreddits.
Our aim was to find out the main differences between users who post in r/conspiracy (a subreddit dedicated to conspiracy theories) and other Reddit users.
Using a technique called sentiment analysis we examined what users said, and where they said it, during the months before their first post in r/conspiracy.
We compared these posts to those of other users who started posting on Reddit at the same time, and in the same subreddits, but without going on to post in r/conspiracy.
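To illustrate roughly what that kind of comparison looks like in practice, here is a minimal Python sketch using the off-the-shelf VADER sentiment model. The record format (dicts with "author" and "body" fields) and both helper functions are assumptions for illustration; this is not the authors' actual pipeline.

```python
# A minimal sketch of per-user sentiment scoring over Reddit comments,
# assuming each comment is a dict with "author" and "body" keys.
# Illustrative only; not the paper's actual methodology.
from collections import defaultdict
from statistics import mean

from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()

def sentiment_by_user(comments):
    """Return each author's mean VADER compound score (ranges from -1 to 1)."""
    scores = defaultdict(list)
    for c in comments:
        scores[c["author"]].append(analyzer.polarity_scores(c["body"])["compound"])
    return {author: mean(vals) for author, vals in scores.items()}

def compare_groups(future_conspiracy_comments, matched_user_comments):
    """Compare mean sentiment for future r/conspiracy posters vs. matched users."""
    future = sentiment_by_user(future_conspiracy_comments)
    matched = sentiment_by_user(matched_user_comments)
    return mean(future.values()), mean(matched.values())
```

In the study itself, each future r/conspiracy poster was paired with a user who joined Reddit at the same time and posted in the same subreddits; the matched group in the sketch simply stands in for that control set.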
We then constructed a network of the subreddits through which r/conspiracy posters traveled. In doing so, we were able to discover how and why they reached their destination.
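Below is a minimal sketch of how such a subreddit network could be assembled, assuming each user's posting history is available as a time-ordered list of subreddit names. That input format, and the simple transition-count weighting, are assumptions for illustration rather than the study's data model.

```python
# A minimal sketch of building a directed network of subreddit
# transitions from users' posting histories, using networkx.
import networkx as nx

def build_transition_network(user_histories):
    """user_histories: iterable of lists like ["AskReddit", "politics", "conspiracy"]."""
    G = nx.DiGraph()
    for history in user_histories:
        for src, dst in zip(history, history[1:]):
            if src == dst:
                continue  # ignore consecutive posts in the same subreddit
            weight = G.get_edge_data(src, dst, default={"weight": 0})["weight"]
            G.add_edge(src, dst, weight=weight + 1)
    return G

# Toy example: which subreddits most often lead directly into r/conspiracy?
histories = [
    ["AskReddit", "politics", "conspiracy"],
    ["AskReddit", "worldnews", "politics", "conspiracy"],
]
G = build_transition_network(histories)
inbound = sorted(G.in_edges("conspiracy", data="weight"),
                 key=lambda edge: edge[2], reverse=True)
print(inbound)
```

Weighting the edges by transition counts makes it straightforward to ask which communities most often fed directly into r/conspiracy, which is the kind of "pathway" question the study explores.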
Seeking the like-minded
Our research suggests there is evidence for the "self-selection" of conspiracy theorists. This means users appear to be seeking communities of people who share their views.
Users followed clear pathways to eventually reach r/conspiracy.
For example, these users were over-represented in subreddits focused on politics, drugs and internet culture, and engaged with such topics more often than their matched pairs.
We were also surprised by the diversity of pathways taken to get to r/conspiracy. The users were not as concentrated on one side of the political spectrum as people might expect. Nor did we find more anxiety in their posts, compared with other users.
Our previous research also indicated online conspiracy theorists are more diverse and ordinary than most people assume.
Where do the beliefs come from?
To dig deeper, we examined the interactions between where and what r/conspiracy users posted.
In political subreddits, the language used by them and their matched pairs was quite similar. However, in Reddit's very popular general-purpose subreddits, the linguistic differences between the two groups were striking.
So far, psychologists, sociologists, and philosophers have struggled to find anything distinct about conspiracy believers or their environments.
Social media can play a role in spreading conspiracy theories, but it mostly entrenches beliefs among those who already have them. Thus it can be challenging to measure and understand how conspiracy beliefs arise.
Traditional survey and interview approaches don't always give reliable responses. This is because conspiracy theorists often frame their lives in narratives of conversion and awakening, which can obscure the more complex origins of their beliefs.
Furthermore, as philosopher David Coady pointed out, some conspiracy theories turn out to be true. Insiders do sometimes uncover evidence of malfeasance and cover-ups, as recent debates over the need for whistleblower protections in Australia reflect.
Echo chambers worsen the problem
Philosophical research on online radicalization has focused on the passive effects of technologies such as recommender algorithms and their role in creating online echo chambers.
Our research instead suggests individuals seem to have a more active role in finding like-minded communities, before their interactions in such communities reinforce their beliefs.
These "person-situation interactions" are clearly important and under-theorized.
As the psychologist David C. Funder puts it: "Individuals do not just passively find themselves in the situations of their lives; they often actively seek and choose them. Thus, while a certain kind of bar may tend to generate a situation that creates fights around closing time, only a certain kind of person will choose to go to that kind of bar in the first place."
We suspect a similar process leads users to conspiracy forums.
A complex web of interactions
Our data indicates that conspiracy beliefs, like most beliefs, are not adopted in a vacuum. They are actively mulled over, discussed, and sought out by agents in a social (and increasingly online) world.
And when forums like 8chan and Stormfront are pushed offline, users often look for other ways to communicate.
These complex interactions are growing in number, and technology can amplify their effects.
YouTube radicalization, for example, is likely driven by interactions between algorithms and self-selected communities.
When it comes to conspiracy beliefs, more work needs to be done to understand the interplay between a person's social environment and their information seeking behavior.
And this becomes even more pressing as we learn more about the risks that come with conspiracy theorizing.

More information: Colin Klein et al. Pathways to conspiracy: The social and linguistic precursors of involvement in Reddit's conspiracy theory forum, PLOS ONE (2019). DOI: 10.1371/journal.pone.0225098
