Company Should Address Root Causes of Censorship of Palestine Content
Deborah Brown
Deputy Director, Technology, Rights & Investigations
Crowds gathered outside of Meta headquarters to protest the company’s censorship of posts on social platforms related to Palestine, in Menlo Park, California, US, November 3, 2023.
© 2023 Tayfun Coskun/Anadolu via Getty Images
Earlier this month, Meta's Oversight Board found that three Facebook posts containing the phrase “From the River to the Sea” did not violate Meta’s content rules and should remain online.
The majority of the Oversight Board members concluded that the phrase, widely used at protests to show solidarity with Palestinians, is not inherently a violation of Meta’s policies on Hate Speech, Violence and Incitement, or Dangerous Organizations and Individuals (DOI). In line with Human Rights Watch’s submission, the board affirmed that while the phrase can have different meanings, it amounts to protected speech under international human rights law and should not, on its own, be a basis for removal, enforcement, or review of content under Meta’s policies. Meta created the board as an external body to review appeals of content moderation decisions and to provide non-binding policy guidance.
A minority of board members recommended a blanket ban on use of the phrase unless there are clear signals that it does not constitute glorification of Hamas. Such a ban would be inconsistent with international human rights standards, amounting to an excessive restriction on protected speech.
The board’s decision upholds free expression, but Meta has a broader problem of censoring protected speech about Palestine on its platforms. A 2023 Human Rights Watch report found that Meta was systemically censoring Palestine content and that broad restrictions on content relating to groups that Meta puts on its DOI list often resulted in the censorship of protected speech. Meta has said that core human rights principles have guided its crisis response measures since October 7. But its heavy reliance on automated detection systems fails to accurately assess context, even when posts explicitly oppose violence.
For instance, on July 19, Human Rights Watch posted a video on Instagram and Facebook with a caption in Arabic that read: “Hamas-led armed groups committed numerous war crimes and crimes against humanity against civilians during the October 7 assault on southern Israel.” Meta’s automated tools “incorrectly” removed the post for violating its DOI policy. Formal appeals were unsuccessful, and the content was restored only after informal intervention.
Meta should address the systemic issues at the heart of its wrongful removal of protected speech about Palestine. Amending its flawed policies, strengthening context-based review, and providing more access to data to facilitate independent research are essential to protecting free expression on its platforms.