
TikTok's extremism problem: Platform recommends jihad, white supremacy and anti-Semitism to teen accounts

LONG READ
RAWSTORY.COM
November 03, 2021

TikTok's logo. (Shutterstock)

"You will fight the Jews!" "Welcome to the life of jihad." "Your mum will smile at you burning in the hellfire!"

While these might sound like quotes delivered at a madrasa in Pakistan, they're not. They're quotes TikTok recommended to an account Raw Story set up as if it were 13 years old, the age of a typical American eighth grader, within 24 hours of signing up for the app.

TikTok is a social media app that streams short user-uploaded videos, recommending additional clips based on what users stop to watch. Prior Raw Story reports have revealed that TikTok's algorithm recommends firearms accessories to minors, as well as a plethora of videos about suicide and self-harm.

Raw Story's investigation also found that TikTok can introduce teens to extremist content. A teenage account that simply searched the word "Muslim" was eventually shown propaganda sympathizing with the Islamic State, calls for jihad, and videos championing anti-Semitism and child marriage. Some content also contained black flag emojis, symbolizing the flag of the Islamic State.

TikTok also recommended white nationalist videos that glorified the Confederate flag and spread misinformation about the flag's origin. One video showed a man standing next to a Confederate flag referring to the flag design as "a Christian symbol." TikTok also recommended videos supportive of Nazi Germany. One clip referenced rising right-wing nationalism in Japan, with a caption showing the German and Italian flags saying, "Guys the gang is back! We're back!" World War II's Axis powers used their ideologies to justify genocide.

Radical Islamist content on social media has long been a gateway to terrorist organizations like Al-Qaeda and the Islamic State. With new social media apps on the rise, and platforms like Facebook revising algorithms to keep users as engaged as possible, bad actors have embraced social media as a tried-and-true recruiting tool. In these cases, recruitment no longer depends on the individual finding their cause; the cause is advertised to the individual, effortlessly.

Other extremist groups Raw Story found included white nationalists, conspiracy theorists, and cults. TikTok recommended the videos to teen accounts even if the user was not actively searching for them.

Others have discovered even more odious content. In August, the London-based Institute for Strategic Dialogue uncovered more than two dozen TikTok videos recreating the massacre of Muslims and 29 videos denying the Holocaust. The group also found ISIS propaganda, including a video of two Japanese hostages moments before their death and aerial drone footage of ISIS car bomb attacks.

Have tips about TikTok or internal documents about tech companies? Email techtips@rawstory.com.

TikTok did not comment when Raw Story asked about Islamist extremist content on its platform.

"This is an industry-wide challenge complicated by bad actors who actively seek to circumvent protective measures, but we have a team dedicated to aggressively protecting against malicious behavior on TikTok," a spokeswoman said in 2019.

"We permanently ban any such accounts and associated devices as soon as identified, and we continuously develop ever-stronger controls to proactively detect suspicious activity," the spokeswoman added.

TikTok's response came after a 2019 report that Islamic State militants were posting on its platform. In July, TikTok said it removed more than 300,000 videos for spreading "violent extremism" during a three-month period earlier this year.

Raw Story created a TikTok account registered with a birthdate indicating the user was 13 years old and searched the word "Muslim." Soon after liking innocuous videos of Gordon Ramsay judging a crème brûlée and of dad jokes, TikTok's algorithm began recommending videos from extremist Islamist preachers located all over the globe.

"Allah does not look for lip service. Allah looks for actions," said one preacher. "I'll die for Allah's path to righteousness."

"The one that's not wearing the hijab and the one that's not covering up, the prophet says, they would never enter the (sic) paradise," shouted another.

The first preacher, Mohamed Hoblos, came under fire for suggesting that rape and murder are lesser sins than missing prayer. The second, Australian Sheikh Shady Alsuleiman, spreads anti-gay conspiracy theories. TikTok's "Mohamed Hoblos" page says his videos have 53 million views.

TikTok also recommended a video from blogger and preacher Imran Ibn Mansur, better known as "Dawa Man." In one video TikTok suggested to Raw Story's teen account, Ibn Mansur stands over what appears to be a shrouded cadaver in an Islamic mortuary, the kind of footage that became a common sight on Raw Story's account.

"Let me show you something," he says. "This is a dead man."


A London university banned Ibn Mansur from speaking in 2014 for referring to homosexuality as a "filthy disease" which must be "suppressed."

TikTok often recommended that Raw Story watch videos tagged #jihad. Liking or lingering on such videos triggered TikTok to recommend more extreme content. Although the word is on TikTok's list of blocked search terms, hashtags like #jihad surface organically in its suggestions.

One video TikTok recommended, now taken down, showed Islamic State propaganda with messages to "welcome a life of Jihad" and to "get a gun to welcome my fate" set to Islamic chanting for holy war. More Islamic State propaganda stated, "One word which we have said is that the return of the caliphate is the promise of Allah!"

TikTok also suggested anti-Semitic videos. "Muhammad said you will fight the Jews," one unidentified preacher declared. "Take not the Jews and the Christians as friends," read text in another.

Another TikTok video showed a group of Jewish men congregating in a street in what appeared to be Israel. "Boy, you just wait until they open the borders, bro," the video's narrator said. "I swear to God I'm coming for you."


"Israel and America… fuck you," one woman yelled in a video montage of anti-Semitic messages. "Let us spit on this flag," another declared before setting an Israeli flag on fire. A third video showed a defaced photo of Israeli TikTok influencer Yael Deri. Many accounts had black flag emojis in their bios.



Experts expressed concern about TikTok recommending extremist content to minors. Seamus Hughes, Deputy Director of the Program on Extremism at George Washington University, noted that social media has been fertile ground for extremist recruiting. ISIS, he said, has a track record of recruiting very young teens.

"We have seen as young as 15-years-old," Hughes said.

In 2015, a 15-year-old boy was arrested by the FBI outside of Philadelphia for allegedly plotting to target Pope Francis in an ISIS-inspired attack. While recruitment is down, Hughes noted, it still continues.

"We are still seeing an average of one arrest per month for ISIS related cases," he said.

Devorah Margolin, Director of Strategic Initiatives at GW's extremism program, said she doesn't think the TikTok videos involve actual Islamic State recruiters. It's more likely that they're fan accounts that reupload ISIS propaganda to the platform, she said. But she stressed that fan content can still lead users to extremist causes.

"Islamic State fan content has the possibility of radicalizing and recruiting people to join the Islamic State's cause," she said. "Research shows that people typically radicalize with others, including friends, family, and peers. As such, having this content online risks like-minded individuals finding each other and radicalizing together."

Social media algorithms have been found to radicalize users. In 2019, the New York Times reported that after YouTube implemented a new artificial intelligence-based recommendation system, the platform fed Brazilian users conspiracy theories and extreme right-wing content. The effects echoed across Brazilian life. Teachers bemoaned unruly students who quoted from YouTube conspiracy videos. YouTube-sourced health misinformation undermined efforts to fight Zika. A little-known lawmaker who became a YouTube star — Jair Bolsonaro — is now Brazil's president.

Imran Ahmed, founder and CEO of the Center for Countering Digital Hate, says that conspiracy theories can also be a gateway to extremist content on apps like TikTok.

"There are people who are willing to exploit people's vulnerabilities and weaknesses to bring them into their thrall, and they understand the power of social media," said Ahmed. "They know how to create content that exploits the algorithm."

Social media companies "have engaged in a program of systematic deceit of the public, journalists and lawmakers," Ahmed said.

TikTok has found itself caught between a rock and a hard place when dealing with controversial content. After taking down videos, it has sometimes been accused of censorship. In 2019, some users accused it of removing content to appease China, where its parent company, ByteDance, is based. The firm subsequently hired two former U.S. legislators to review its content moderation policy.

TikTok's website provides a guardian's guide offering parents safety suggestions for minors. The features include Family Pairing, where parents can control privacy settings, limit what content teens see, and control how much time is spent on the app. These safety suggestions are far from foolproof, however, since children can easily create a new account in seconds using a new email address.

Extremist recruitment often takes the form of inserting propaganda into existing online conversations in order to direct traffic to radical Islamist causes. In 2014, ISIS notoriously manipulated the Twitter algorithm by using trending hashtags in an attempt to hijack the World Cup tournament with radical Islamist propaganda.

"We see a hijacking of hashtags," said Hughes. "At one point, ISIS was tweeting the World Cup."

TikTok provides a unique opportunity for extremists, because most of its users are young. The short video format also offers sympathizers a different way to build excitement.

"The rhyme, beat, evocative lyrics and punchy delivery are especially appealing to youth," Elisabeth Kendall, an Oxford University expert on extremism, recently said. "This catchy sing-along method for propagating ISIS ideology means it spreads quickly and sticks in the collective memory. It tends to be far more effective than sermons or theological debate and treatises."

Extremists on TikTok also target other groups. A report this year from the Institute for Strategic Dialogue found rampant anti-Muslim extremism, including videos depicting the 2019 Christchurch mosque shootings, in which an Australian white supremacist killed 51 people after opening fire at two mosques.

The Institute found more than two dozen TikTok videos expressing support for the killer, "including 13 videos containing content [he] originally produced" and "three videos that feature video game footage designed to recreate the events."

They also found a panoply of other extremist videos. The most-viewed of them, the report said, was an anti-Asian clip referencing Covid-19 that TikTok played two million times.

"Three of the top ten most-viewed videos, with a collective 3.5 million views, featured content first produced by jailed white supremacist Paul Miller," the report added. "Two of the top ten most-viewed videos... feature comments that mock the victims and deny the existence of the Bosnian genocide and the Holocaust."

Miller, a vocal white nationalist, has been banned from YouTube and other social media platforms.

While the Islamic State has faded from U.S. view, the threat from global terrorist networks is far from over. Just yesterday, the Islamic State claimed responsibility for an attack on a Kabul hospital that killed at least 23 people. The UN Security Council released reports this year suggesting ISIS plans to step up violent attacks to make up for time lost during the COVID-19 pandemic.

In August, the U.S. Department of Homeland Security noted that Al-Qaeda released its magazine Inspire in English for the first time in four years, indicating that English-language recruitment remains on the terrorist group's agenda.

Three months ago, Imran Ali Rasheed of Garland, Texas shot and killed his Lyft driver, Isabella Lewis, after she drove him to a police station where he attempted to carry out a mass shooting. Rasheed left a note in Lewis' car attributing his acts to an unnamed radical Islamist terror group. He was killed by police.

As COVID-19 restrictions ease and more people congregate in large groups, the threat of terrorist attacks has grown. Despite TikTok's community guidelines banning extremist content, the platform continues to serve as a network for like-minded extremists. Margolin, the George Washington University extremism expert, said TikTok should focus on scrubbing extremist content from its platform.

"If content on TikTok is advocating active harm, travel to join a terrorist organization, or dehumanizing those they deem to be part of an 'out-group,' it should be removed," she said.
