Social media users’ actions, rather than biased policies, could drive differences in platform enforcement
New MIT Sloan research has found that politically conservative users tend to share misinformation in greater volume than politically liberal users, which could explain why conservative accounts were suspended more frequently
MIT Sloan School of Management
A new paper, “Differences in misinformation sharing can lead to politically asymmetric sanctions,” published today in Nature, suggests that the higher rate of social media policy enforcement (such as account suspensions) against conservative users could be explained by the higher volume of misinformation those users share, and therefore does not in itself constitute evidence of inherent bias in social media companies’ policies or in the definition of what constitutes misinformation.
The paper was written by researchers from MIT Sloan School of Management, the University of Oxford, Cornell University, and Yale University; its co-authors are Mohsen Mosleh, Qi Yang, Tauhid Zaman, Gordon Pennycook, and David G. Rand.
The spread of misinformation has become an increasing concern, especially as the 2024 presidential election in the United States approaches. Many Americans who disagree on political issues agree that the sharing of false information is a substantial problem; 65% of Americans say that technology companies should take action to restrict the spread of false information. However, there is strong disagreement about whether tech companies are actually moderating their platforms fairly.
“Accusations of political bias are often based largely on anecdotes or noteworthy cases, such as the suspension from Twitter and Facebook of former President Trump,” said MIT Sloan professor Rand. “This study allows us to systematically evaluate the data and better understand the differential rates of policy enforcement.”
The asymmetry of conservative sanctions versus liberal sanctions should not be attributed to partisan bias on the part of social media companies and those determining what counts as misinformation, Rand and the co-authors noted.
The research began by looking at Twitter’s suspension of users following the 2020 U.S. presidential election. Researchers identified 100,000 Twitter users from October 2020 who shared hashtags related to the election, and randomly sampled 9,000 — half of whom shared at least one #VoteBidenHarris2020 hashtag and half of whom shared at least one #Trump2020 hashtag. Researchers analyzed each user’s data from the month before the election to quantify their tendency to share news from low-quality domains (as well as other potentially relevant characteristics), and then checked nine months later to determine which users were suspended by Twitter.
Accounts that had shared #Trump2020 before the election were 4.4 times more likely to have been subsequently suspended than those who shared #VoteBidenHarris2020. Only 4.5% of the users who shared Biden hashtags had been suspended as of July 2021, while 19.6% of the users who shared Trump hashtags had been suspended.
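The 4.4-times figure follows directly from the two reported suspension rates. As a quick arithmetic check (the percentages are taken straight from the study as reported above):

```python
# Quick check of the headline comparison: suspension rates reported in the
# study for each hashtag group (July 2021 snapshot of the 9,000-user sample).
biden_rate = 0.045   # 4.5% of #VoteBidenHarris2020 sharers suspended
trump_rate = 0.196   # 19.6% of #Trump2020 sharers suspended

ratio = trump_rate / biden_rate
print(f"Relative suspension rate: {ratio:.1f}x")  # -> 4.4x
```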
“We found that there were political differences in behavior, in addition to the political differences in enforcement,” said Rand. “The fact that the social media accounts of conservatives are suspended more than those of liberals is therefore not evidence of bias on the part of tech companies, and shouldn’t be used to pressure tech companies to abandon policies meant to reduce the sharing of misinformation.”
To better understand this difference, the researchers assessed the reliability of the news sources shared by these politically active Twitter users using two different methods. They used a set of 60 news domains (the 20 highest-volume sites in each of three categories: mainstream, hyper-partisan, and fake news) and collected trustworthiness ratings for each domain from eight professional fact-checkers. To address concerns about potential bias on the part of journalists and fact-checkers, the researchers also collected ratings from politically balanced groups of laypeople. Both approaches indicated that people who used Trump hashtags shared four times more links to low-quality news outlets than those who used Biden hashtags.
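To illustrate how such domain ratings can be aggregated into per-domain and per-user quality scores, here is a minimal Python sketch. The domain names and rating values are invented for illustration; the study's actual list covered 60 real domains rated by eight professional fact-checkers and, separately, by politically balanced groups of laypeople.

```python
# Illustrative sketch of aggregating trustworthiness ratings into per-domain
# quality scores. Domain names and rating values below are invented.
from statistics import mean

fact_checker_ratings = {  # eight ratings per domain, 0 = least trustworthy
    "example-mainstream.com":    [0.90, 0.85, 0.95, 0.90, 0.80, 0.90, 0.85, 0.90],
    "example-hyperpartisan.com": [0.40, 0.35, 0.30, 0.45, 0.40, 0.35, 0.30, 0.40],
    "example-fakenews.com":      [0.10, 0.05, 0.10, 0.15, 0.10, 0.05, 0.10, 0.10],
}

domain_quality = {domain: mean(scores)
                  for domain, scores in fact_checker_ratings.items()}

def user_news_quality(shared_domains):
    """Mean quality of the rated domains a user shared; None if none rated."""
    rated = [domain_quality[d] for d in shared_domains if d in domain_quality]
    return mean(rated) if rated else None

print(user_news_quality(["example-fakenews.com", "unrated-blog.net"]))
```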
“Prior work identifying political differences in misinformation sharing has been criticized for relying on the judgment of professional fact-checkers. But we show that conservative Twitter users shared much lower quality news, even when relying on ratings from politically-balanced groups of laypeople,” said co-author Dr Mohsen Mosleh, Associate Professor, Oxford Internet Institute, part of the University of Oxford. “This can’t be written off as the result of political bias in the ratings, and means that preferential suspension of conservative users is not necessarily the result of political bias on the part of social media companies.”
The study also found that similar associations between conservatism and low-quality news sharing (based on both expert and politically balanced layperson ratings) were present in seven other datasets from Twitter, Facebook, and survey experiments, spanning 2016 to 2023 and including data from 16 different countries. For example, the researchers found cross-cultural evidence of conservatives sharing more unambiguously false claims about COVID-19 than liberals, and conservative political elites in the U.K. and Germany shared links to lower-quality news sources than liberal political elites.
“The social media users analyzed in this research are not representative of Americans more broadly, so these findings do not necessarily mean that conservatives in general are more likely to spread misinformation than liberals. Also, we’re just looking at this particular period in time,” said Rand. “Our basic point would be the same if it was found that liberal users shared more misinformation and were getting suspended more. Such a pattern of suspension would not be enough to show bias on the part of the companies, because of the differences in users’ behavior.”
Even under politically neutral anti-misinformation policies, the researchers expect that there would be political asymmetries in enforcement. While the analyses do not rule out the possibility of any bias on the part of platforms, the inequality of sanctions is not diagnostic of bias one way or the other. Policy-makers need to be aware that even if social media companies are working in an unbiased way to manage misinformation on their platforms, there will still be some level of differential treatment across groups.
Journal
Nature
Article Title
Differences in misinformation sharing can lead to politically asymmetric sanctions
Solidarity drives online virality in a nation under attack, study of Ukrainian social media reveals
The first major study of social media behaviour during wartime has found that posts celebrating national and cultural unity in a country under attack receive significantly more online engagement than derogatory posts about the aggressors.
University of Cambridge psychologists analysed a total of 1.6 million posts on Facebook and Twitter (now X) from Ukrainian news outlets in the seven months prior to February 2022, when Russian forces invaded, and the six months that followed.
Once the attempted invasion had begun, posts classified as expressing Ukrainian “ingroup solidarity” were associated with 92% more engagement on Facebook, and 68% more on Twitter, than similar posts had achieved prior to Russia’s full-scale attack.
Posts expressing “outgroup hostility” towards Russia, by contrast, received only an extra 1% engagement on Facebook after the invasion, with no significant difference on Twitter.
“Pro-Ukrainian sentiment, phrases such as Glory to Ukraine and posts about Ukrainian military heroism, gained huge amounts of likes and shares, yet hostile posts aimed at Russia barely registered,” said Yara Kyrychenko, from Cambridge’s Social Decision-Making Lab (SDML) in its Department of Psychology.
“The vast majority of research on social media uses US data, where divisive posts often go viral, prompting some scholars to suggest that these platforms drive polarisation. In Ukraine, a country under siege, we find the reverse,” said Kyrychenko, lead author of the study published today in Nature Communications.
“Emotions that appeal to ingroup identity can empower people and boost morale. These emotions may be more contagious, and prompt greater engagement, during a time of active threat – when the motivation to behave beneficially for one’s ingroup is heightened.”
Previous research from the same Cambridge lab found that going viral on US social media is driven by hostility: posts that mock and criticise the opposing sides of ideological divides are far more likely to get engagement and reach larger audiences.
The new study initially used the same techniques, finding that, prior to the invasion, social media posts from pro-Ukrainian as well as pro-Russian news sources containing keywords of the ‘outgroup’ (opposing politicians, place names, and so on) did indeed generate more traction than posts containing ‘ingroup’ keywords.*
However, researchers then trained a large language model (LLM) – a form of language-processing AI, similar to ChatGPT – to better categorise sentiment and the motivation behind the post, rather than simply relying on keywords, and used this to analyse Facebook and Twitter posts of Ukrainian news outlets before and after the invasion.**
This deeper dive revealed a consistently strong engagement rate for solidarity posts, higher than for ‘outgroup hostility’, in the lead-up to Russia’s attack; the rate leapt even further after the invasion, while interactions with derisive posts about Russia flatlined.
Lastly, a separate dataset of 149,000 post-invasion Tweets that had been geo-located to Ukraine was fed into a similar LLM, to test this effect on social media posts from the Ukrainian population, rather than only news sources.***
Tweets – now X posts – from the Ukrainian public containing messages of “ingroup solidarity” championing Ukraine were likely to get 14% more engagement, while those expressing antagonism to Russians were likely to gain only a 7% increase.****
“Social media platforms allow expressions of the national struggle that would otherwise have been private to reach millions,” said Kyrychenko.
“These moments echo solidarity and resistance from a first-person account, which can make them more powerful than traditional media rooted in impersonal reporting.”
Researchers acknowledge these trends may result from algorithms used by social media companies, but say the fact that similar effects were detected on two separate platforms, and with posts from both Ukraine’s news sources and its citizenry, suggests much of this information-sharing dynamic is driven by people.
“The Kremlin has long tried to sow division in Ukraine, but fails to understand that the Euromaidan revolution and Russia’s attempted invasion have only spurred Ukrainian identity towards national unity,” said Dr Jon Roozenbeek, study senior author from Cambridge’s SDML as well as King’s College London.
“We can trace through social media posts this fortification of Ukrainian group identity in the face of extreme Russian aggression,” said Roozenbeek, who published the book Propaganda and Ideology in the Russian–Ukrainian War earlier this year.
Kyrychenko, a Cambridge Gates Scholar born and raised in Kyiv, recalls the critical role Facebook and Twitter played in the Euromaidan protests in 2014, some of which she participated in as a teenager, and her surprise at the attitude towards social media she encountered in the US after moving there to study in 2018, during the Trump presidency.
“By the time I arrived in the US, social media was seen as toxic and divisive, whereas my experience of these platforms in Ukraine had been as a force for positive political unity in the fight for democracy,” said Kyrychenko.
While Kyrychenko points out that hate speech and conspiracy theories still thrive online in Ukraine, she argues that the solidarity fostered on social media reflects some of the early promise these platforms held for uniting people against tyranny.
“The Ukrainian experience reminds us that social media can be used for good, pro-social causes, even in the direst of situations.”
NOTES:
* Facebook and Twitter were banned in Russia following the invasion. As such, this initial element of the study was the only one to feature Russian social media posts.
** The team manually labelled 1600 Ukrainian social media posts as either “ingroup solidarity” or “outgroup hostility” based on whether they praised Ukraine and promoted national unity or attacked Russia as immoral warmongers, and fed these into the LLM to train it to read and categorise Ukrainian social media posts. The researchers also provided the LLM guideline definitions for “ingroup solidarity” and “outgroup hostility”.
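The study fine-tuned a large language model on these 1,600 hand-labelled posts. As a much simpler stand-in that illustrates the same supervised train-then-classify pipeline, the sketch below uses a TF-IDF bag-of-words representation with logistic regression from scikit-learn; the example posts and labels are invented for illustration and are not from the study's dataset.

```python
# Minimal stand-in for the supervised labelling pipeline described above.
# The study fine-tuned a large language model on 1,600 hand-labelled posts;
# here a TF-IDF + logistic-regression classifier plays the same role.
# All example posts and labels below are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_posts = [
    "Glory to Ukraine! Our soldiers are heroes.",
    "Proud of our people standing together as one nation.",
    "The invaders are immoral warmongers.",
    "These aggressors shell hospitals and schools.",
]
train_labels = [
    "ingroup_solidarity", "ingroup_solidarity",
    "outgroup_hostility", "outgroup_hostility",
]

# Fit the classifier on the labelled examples...
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(train_posts, train_labels)

# ...then apply it to unlabelled posts, as the study did at scale.
label = clf.predict(["Together we will rebuild our country."])[0]
print(label)
```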
*** The researchers only used pro-Ukrainian Tweets: posts that were supportive of Ukraine, whether through attacking Russia or championing Ukraine.
**** For example, if the LLM labelled a post as ‘ingroup solidarity’, it was likely to get 14% more engagement than if it was not labelled as ‘ingroup solidarity’, controlling for other variables such as: if the post has media or a URL, if it mentioned the ingroup, the outgroup, the number of ‘positive’ words, and so on.
Examples of social media posts that were part of the study’s dataset:
Ingroup solidarity:
- "Thanks to the KALUSH ORCHESTRA band for their support! Glory to Ukraine! 🇺🇦" got 4,434 retweets.
- "Our flag will fly over all of Ukraine, said General Valery Zaluzhnyi." got 5,577 favorites and 767 retweets.
- "Ukrainian soldiers congratulate students on September 1 and remember their first bells" ... got 92,381 shares and 482,896 likes on Facebook.
- "In a Polish church, they decided to sing the song "Oh, there's a red viburnum in the meadow" right during the service! 🇺🇦 ❤️🇵🇱" ... got 34,897 shares and 68,847 likes on Facebook.
A further description from lead author Yara Kyrychenko of an example of Ukrainian ‘ingroup solidarity’ social media content:
“On New Year’s Eve 2022, a family in the then recently de-occupied Ukrainian city of Kherson watched Volodymyr Zelensky’s presidential address over WhatsApp with their relatives in the still-occupied territories.
“A video of the entire family crying – as Zelensky states Ukraine will liberate and rebuild – quickly went viral across platforms. It captured something so powerful and deeply emotional that watching it makes many cry, even months later.
“The sense of unity despite barriers, the tender cherishing of the national tradition, and the human connection—all distilled into one TikTok. Posts like these evoke similar feelings of solidarity in countless Ukrainians, even though each has seen a different face of the war.”
Outgroup Hostility:
- "Boris Johnson: negotiating with Putin is like negotiating with a crocodile" got 425 retweets and 4,957 favorites.
- "It hurts to understand that these bastards shoot absolutely everything. It doesn't matter if the military is there or not. Hospitals, schools...." got 21,728 shares and 25,125 likes.
- "❗️Russians don't want to fight for Putin. The story of a soldier captured in Kharkov. "Bastards! I hate them! They are making propaganda!"" ... got 65,409 shares and 79,735 likes.
Journal
Nature Communications
Article Title
Social identity correlates of social media engagement before and after the 2022 Russian invasion of Ukraine
Article Publication Date
1-Oct-2024