Saturday, June 11, 2022

White supremacists are riling up thousands on social media



The Associated Press
Published: 10 June, 2022

The social media posts are of a distinct type. They hint darkly that the CIA or the FBI is behind mass shootings. They traffic in racist and sexist tropes. They revel in the prospect of a “white boy summer.”

White nationalists and supremacists, on accounts often run by young men, are building thriving, macho communities across social media platforms like Instagram, Telegram and TikTok, evading detection with coded hashtags and innuendo.

Their snarky memes and trendy videos are riling up thousands of followers over divisive issues like abortion, guns and immigration.

The Department of Homeland Security warned Tuesday that such skewed framing of those subjects could drive extremists to violently attack public places across the US in the coming months.

These types of threats and racist ideology have become so commonplace on social media that it’s nearly impossible for law enforcement to separate internet ramblings from dangerous, potentially violent people, Michael German, who infiltrated white supremacist groups as an FBI agent, told the Senate Judiciary Committee on Tuesday.

“It seems intuitive that effective social media monitoring might provide clues to help law enforcement prevent attacks,” German said. “After all, the white supremacist attackers in Buffalo, Pittsburgh and El Paso all gained access to materials online and expressed their hateful, violent intentions on social media.”

But, he continued, “so many false alarms drown out threats.”

DHS and the FBI are also working with state and local agencies to raise awareness about the increased threat around the US in the coming months.

The heightened concern comes just weeks after an 18-year-old white man entered a supermarket in Buffalo, New York, with the goal of killing as many Black patrons as possible. He gunned down 10.

That shooter claims to have been introduced to neo-Nazi websites and a livestream of the 2019 Christchurch, New Zealand, mosque shootings on the anonymous online messaging board 4Chan.

In 2018, the white man who gunned down 11 at a Pittsburgh synagogue shared his antisemitic rants on Gab, a site that attracts extremists. The following year, a 21-year-old white man who killed 23 people at a Walmart in the largely Hispanic city of El Paso, Texas, shared his anti-immigrant hate on the messaging board 8Chan.

References to hate-filled ideologies are harder to spot on mainstream platforms like Twitter, Instagram, TikTok and Telegram. To avoid detection by artificial intelligence-powered moderation, users steer clear of obvious terms like “white genocide” or “white power” in conversation.

They signal their beliefs in other ways: a Christian cross emoji in their profile or words like “anglo” or “pilled,” a term embraced by far-right chatrooms, in usernames.

Most recently, some of these accounts have borrowed the pop song “White Boy Summer” to cheer on the leaked Supreme Court draft opinion on Roe v. Wade, according to an analysis by Zignal Labs, a social media intelligence firm.

Meta, the owner of Facebook and Instagram, banned praise and support for white nationalist and separatist movements across its platforms in 2019, but the shift to subtlety makes such posts difficult to moderate.

“We know these groups are determined to find new ways to try to evade our policies, and that’s why we invest in people and technology and work with outside experts to constantly update and improve our enforcement efforts,” David Tessler, the head of dangerous organizations and individuals policy for Meta, said in a statement.

A closer look reveals hundreds of posts steeped in sexist, antisemitic, and racist content.

In one Instagram post identified by The Associated Press, an account called White Primacy shared a photo of a billboard that describes a common way Jewish people were exterminated during the Holocaust.

“We’re just 75 years since the gas chambers. So no, a billboard calling out bigotry against Jews isn’t an overreaction,” the pictured billboard said.

The caption of the post, however, denied gas chambers were used at all. The post’s comments were even worse: “If what they said really happened, we’d be in such a better place,” one user commented. “We’re going to finish what they started someday,” another wrote.

The account, which had more than 4,000 followers, was removed Tuesday after the AP asked Meta about it. Meta has banned posts that deny the Holocaust on its platforms since 2020.

US extremists are mimicking the social media strategy used by ISIS, which turned to subtle language and images across Telegram, Facebook and YouTube a decade ago to evade the industry-wide crackdown on the terrorist group’s online presence, said Mia Bloom, a communications professor at Georgia State University.

“They’re trying to recruit,” said Bloom, who has researched social media use by both ISIS and far-right extremists. “We’re starting to see some of the same patterns with ISIS and the far-right. The coded speech, the ways to evade AI. The groups were appealing to a younger and younger crowd.”

For example, on Instagram, one of the most popular apps for teens and young adults, white supremacists amplify each other’s content daily and point their followers to new accounts.
