Andrew Tate's been banned from social media. But his harmful content still reaches young men
Brock Wilson - CBC
Last month, controversial influencer Andrew Tate was banned from several social media platforms for violating their policies.
Despite being banned from TikTok for nearly two weeks, a quick search produces countless videos of Andrew Tate. (@benleavitt/TikTok)
But nearly two weeks into these bans, platforms are still inundated with clips of Tate making derogatory comments about women — highlighting what some media experts describe as a dangerous system whose algorithms can be manipulated to radicalize young men into adopting harmful views about women and the LGBTQ community.
And as Tate's case shows, banning controversial figures can actually make the problem worse.
Tate, a former kickboxer, gained notoriety after appearing on the U.K. reality show Big Brother in 2016. He was removed from the show after a video was made public that appeared to show him assaulting a woman with a belt. Tate has said the incident was consensual.
Recently, he went viral for soundbites shared on platforms like TikTok. These clips feature Tate, often clad in sunglasses sans shirt, making offensive comments about women. In one notable clip, Tate says that if a woman is dating a man, she "belongs" to him. In another, he suggests women in relationships who have their own social media accounts are cheating.
In a video posted to Vimeo on Aug. 23, Tate responded to the bans saying he's been "unfairly vilified" and his comments were taken out of context.
Tate did not respond to a request from CBC News for comment.
Content like Tate's often starts in a way that seems relatively harmless, but then it slowly becomes more nefarious, says Joanna Schroeder, a writer whose work focuses on gender and media representation.
For example, she says, young boys often visit sites like YouTube to search for videos related to Minecraft, a wildly popular video game. But the YouTube algorithm will often guess their age and gender — and Schroeder says it might then push harmful content at them.
"There are people who want to target this demographic who start showing them content that becomes more and more racy."
Schroeder says Tate's appeal stems, in part, from how his views are framed. The idea that what he's saying is an "unpopular opinion that nobody else will say out loud" might suggest to a young person that it has value, she says.
And since "edgy" content often presents itself as something a younger demographic should see as normal — or even funny — its problematic ideas can take hold slowly.
An example of that is the Pepe the Frog meme, something that started as a harmless cartoon frog and devolved into a symbol of hate.
It began as an apolitical meme popular on sites like Myspace and 4chan in the 2000s. But as its popularity grew, it was appropriated by the alt-right movement.
Schroeder says Pepe began to represent "anti-gay" and "anti-women" sentiments. And she says teens might initially perceive the memes as jokes, but over time they can influence how and what young people think.
And clips like Tate's are a common way people are radicalized, says Ellen Chloë Bateman, a documentary and podcast producer who's researched online radicalization among young men and incel subculture.
Violence against women gets normalized, she says, embedding itself into the psyches of young men through images and memes, in what she calls a "culture of intense competition and one-upmanship."
Schroeder says this can often be seen on TikTok. Videos featuring clips of creators like Tate often share the screen with footage from games like Minecraft or Call of Duty to keep teens engaged.
At this point, she says, some social media algorithms notice the user's high levels of engagement — and then begin to serve them more "overtly racist" content.
"Algorithms push content that is often extreme. Extreme views, hate-filled views get a lot of traction on places like YouTube … and TikTok," Schroeder says.
The parts of the internet where these memes — and often more overtly racist or misogynistic content — circulate are what Bateman calls the "manosphere."
She describes it as a space where "men's rights activists, male separatists, nihilists, sexual predators and trolls — who often share membership with neo-Nazi and alt-right groups — congregate."
"What unites them all is an extreme anti-feminist world view," Bateman says.
And alt-right groups often use this space to target young and impressionable men, she says.
Social media companies say they're actively working to remove this kind of content — as studies have found that online hate speech correlates with increases in physical violence and hate crimes.
In Tate's case, TikTok, Facebook and Instagram removed his content.
A TikTok spokesperson said "misogyny is a hateful ideology that is not tolerated on TikTok," and it continues to investigate other accounts and videos that violate its policies.
The spokesperson also said that TikTok is looking at ways to "strengthen enforcement" against this type of harmful content.
That includes a partnership with UN Women and other non-governmental organizations seeking to stop violence against women and girls, which will launch a new in-app hub to educate users about gender-based violence.
Bateman says partnerships like these are essential to making social media spaces safer and more educational, especially for young people.
Twitter has also taken action against controversial creators. The platform has issued temporary bans to creators like Jordan Peterson, Matt Walsh and Steven Crowder. (Each creator was later allowed back on the app.)
But Schroeder says bans can sometimes be counterproductive. In Tate's case, they may, in some ways, have actually helped him.
"The bans are just drawing more attention to him," she said. "It's given him a very big microphone."
Bateman agrees, pointing out that these creators often migrate to other platforms, like Reddit, Gab, Telegram and Discord, to post their content.
She says some of these platforms are actually harder to monitor because of their closed-group structures or registration requirements, making it more difficult to study and track the content. A site for incel subculture, which promotes misogyny and violence, has upwards of 17,000 users, she found.
"It's such a complicated online world. It's fluid … it's moving. It's spreading around and these groups are interconnecting basically into one big cesspool of hate."