Meta’s oversight board urges Facebook, Instagram to lift ban on ‘shaheed’
The advisory board has told Meta the ban on the word, meaning ‘martyr’, restricts free expression and is unnecessary.
By Abid Hussain
Published On 30 Mar 2024
The oversight board of Meta, the social media giant which owns Facebook, Instagram and WhatsApp, has ruled that a ban on the use of the word “shaheed” – “martyr” in Arabic – should be lifted. Meta has acknowledged that the term “shaheed” accounts for more content removals under the company’s content moderation policy than any other single word or phrase on its platforms.
In a policy advisory note, the company’s oversight board stated: “The Board has found that Meta’s current approach disproportionately restricts free expression, is unnecessary, and that the company should end this blanket ban.”
Meta’s oversight board was established in 2020. It is funded by Meta but operates independently of the company. When Facebook and Instagram make decisions to remove certain content from their platforms, Meta can ask the board to review those decisions, particularly when they cause controversy. The board effectively acts as an ombudsman which makes recommendations and issues rulings either endorsing or overruling such decisions made by Meta.
Here is what we know about the recommendation made by the oversight board and how it came to its decision.
Why does Meta remove content containing the word ‘shaheed’?
Meta’s current content moderation policy treats the term “shaheed” as “praise” when it is mentioned in relation to organisations included on its Dangerous Organizations and Individuals (DOI) list.
The top tier of this list includes what it terms “hate organisations; criminal organisations, including those designated by the United States government”. According to Meta, these are individuals and organisations which are deemed to be engaging in “serious offline harm”.
Why did Meta ask for advice about the word ‘shaheed’?
The policy advisory from the oversight board comes after repeated criticism levelled against Meta over its approach towards content posted by Palestinian and Arabic speakers.
Most recently, in December last year, Human Rights Watch issued a report which concluded that Meta’s content moderation policies amounted to censorship of content relating to the continuing Israel-Palestine conflict.
In a 51-page report, the human rights group said that Meta had misused its DOI policy to “restrict legitimate speech around hostilities between Israel and Palestinian armed groups”.
Meta began its own internal dialogue in 2020 over its approach to the use of the term “shaheed” on its platforms but failed to reach a consensus.
An independent investigation launched by the company in 2021 found that its content moderation policies “appear to have had an adverse human rights impact on the rights of Palestinian users”, and were adversely affecting “the ability of Palestinians to share information and insights about their experiences as they occurred”.
In February last year, therefore, Meta asked the oversight board to provide a policy advisory about whether it should continue to remove content using the Arabic term in reference to individuals or groups designated under its DOI policy.
Palestinian activists and journalists protest against what they consider censorship of Palestinian content by Facebook, in the occupied West Bank city of Hebron, on November 24, 2021 [Hazem Bader/AFP]
How did the oversight board go about considering this issue?
Nighat Dad, a member of the oversight board, told Al Jazeera that Meta suggested multiple options for the board to consider, including maintaining the status quo, but the board was not bound by those options and also explored other avenues after an “extensive, more than a yearlong deliberation”.
She added that the group’s discussion on the usage of “shaheed” involved testing out the recommendations in real-life situations after the war started in October last year.
“We wanted to see how people will use Meta platforms and did our research to see people’s usage. We found out that our recommendations held up even under the conditions of the current conflict,” she said.
What did the oversight board recommend?
In its report, which was issued on March 26, the oversight board said Meta’s current approach to the term “shaheed” is “over-broad, and substantially and disproportionately restricts free expression”.
The report added that Meta had failed to comprehend the term’s “linguistic complexity”, saying its content moderation policies treated it only as the equivalent of the English word “martyr”.
The board observed that Meta operated on a presumption that reference to any individual or organisation on the designated list “always constitutes praise” under the company’s DOI policy, leading to a blanket ban.
“Doing so substantially affects freedom of expression and media freedoms, unduly restricts civic discourse and has serious negative implications for equality and non-discrimination,” it added.
Dad said discussions within the board were extensive as the group explored the use of the term in different contexts and “paid extremely close attention to potential for real-world harm with any policy change”.
“We, as board, ultimately decided that Meta’s approach to tackle the word was counterproductive, which often affected journalists from reporting on armed groups as well as limited people’s ability to debate and condemn violence,” she said.
Are recommendations from the oversight board binding?
Meta said it would review the board’s recommendations and respond within 60 days. However, the board’s recommendations in this matter are not binding.
“Our decisions on any matter related to Meta are binding, but when it comes to policy advisory which is sought by Meta itself, they are not,” Dad explained.
However, she added, the board has a “robust mechanism” through which it can follow up and ensure that implementation of the recommendation is considered.
“We have an implementation committee, and we regularly reach out to Meta to follow up on what they have done with our advisory opinion,” she said.
SOURCE: AL JAZEERA