The murder of a young girl in Ethiopia reveals TikTok’s content moderation failures
In July 2023, the city of Bahir Dar in Ethiopia's Amhara Region was left reeling after the brutal rape and murder of a seven-year-old girl named Heaven. The tragedy cast a deep shadow over the community, and the perpetrator — her mother's landlord — was sentenced to 25 years in prison without the possibility of parole. Nearly a year later, as the convicted man sought to appeal the sentence, the victim’s mother, Abekyelesh Adeba, turned to YouTube to share the agonizing story of her daughter’s death.
This crime unfolded in a region already burdened by escalating ethnic tensions. From 2020 to 2022, Ethiopia was engulfed in a brutal conflict between the federal government and the Tigray People’s Liberation Front (TPLF). The war led to widespread suffering and displacement, intensifying long-standing animosities between the Amhara and Tigrayan communities — neighboring groups with a complicated and often troubled history.
In the aftermath of the conflict, the region has continued to grapple with an armed insurgency, with some Amhara political leaders accusing the federal government of deliberately targeting their people.
These simmering tensions have increasingly spilled into the digital realm, where leaders and influencers from both the region and the diaspora frequently disseminate graphic content and incendiary rhetoric on social media.
In this volatile environment, the mother’s emotional testimony went viral across Ethiopian social media, sparking widespread outrage. The hashtag #JusticeforHeaven, named in memory of the victim, quickly amassed millions of views on TikTok.
These videos, in which users expressed their anger over the crime and demanded justice for Heaven, became a focal point of the online conversation. Many were outraged by the convicted man’s attempt to appeal his sentence, criticizing the judicial process and pointing out perceived flaws in the case.
The situation garnered further attention when American rapper Cardi B urged her followers on X to sign a petition on Change.org demanding justice.
However, the responses online were not unanimous. Some questioned the authenticity of Adeba’s account, suggesting it was part of a conspiracy to distract the Amhara people from their ongoing struggles against the Ethiopian federal government. One of the most prominent voices casting doubt on her story was “Mota Keraniyo,” a U.S.-based TikTok user. The day after her testimony, Mota posted inflammatory videos filled with Amhara-nationalist rhetoric, attacking both the victim and her mother.
In a shocking turn, he even claimed that the mother deserved to be raped because her daughter was fathered by a Tigrayan.
In response to Mota’s inflammatory remarks, a collective backlash emerged on TikTok, both within Ethiopia and among diaspora communities. As tensions grew, many TikTok users, motivated by longstanding grievances against Mota, organized coordinated reporting campaigns and hosted online gatherings that drew thousands of participants.
The atmosphere was marked by anger and frustration as users sought to have Mota’s content removed from the platform.
Despite the scale of the reporting campaigns, these efforts were only partially effective.
While TikTok reportedly removed some of Mota's most extreme posts, he continued to reach his 420.9k followers. In a follow-up video, he apologized for his earlier remarks about Heaven Awot, framing the backlash as an attempt to silence his advocacy for the Amhara community.
He urged his followers to support him by subscribing to his account and engaging with more of his content. Shortly thereafter, he shifted focus, advocating violence against the perpetrator, his relatives, and anyone associated with him, including his lawyer.
TikTok’s algorithm, which is designed to amplify content that garners engagement — whether positive or negative — further propelled Mota’s violent rhetoric, leading to its widespread dissemination across the platform.
His messages rapidly spread beyond TikTok, appearing on other social media platforms such as YouTube, Facebook, X (formerly Twitter), and Telegram, and even making their way onto podcasts, talk radio, and television. Among this content were AI-generated videos posing as news reports, which falsely claimed that authorities were investigating the perpetrator, Mota, and others who had spread conspiracy theories. Some users took these AI-generated videos at face value, believing their online activism had effectively brought about justice.
Despite TikTok's policy prohibiting AI-generated content featuring individuals under 18 — whether real or fictional — videos featuring the likeness of the victim, Heaven Awot, proliferated across TikTok and other platforms without being removed. Throughout this surge in content, TikTok provided minimal moderation.
Mota’s rhetoric did not exist in isolation. Other figures in the diaspora and within Ethiopia followed suit, spreading misinformation and conspiracy theories. One such figure was Bethlehem Dagnachew, a former singer living in Switzerland, who combined nationalist rhetoric with false claims about the incident. Her disinformation campaign persisted until public pressure compelled her to retract her claims and issue condolences to the victim's mother.
In Addis Ababa, Ethiopia’s capital, another TikTok user who denied the crime’s authenticity and insulted the victim was violently attacked by locals, who compelled him to issue an apology.
The spread of Mota’s violent messages, the actions of his followers, and the proliferation of fake AI-generated videos underscore a broader issue: the rapid metastasis of harmful content in African languages across social media platforms. Despite years of effort by the media, academics, and the social media companies themselves to address this problem, such content is more pervasive than ever, posing a significant challenge to content moderation and platform accountability in under-resourced languages across Sub-Saharan Africa.