AFP
Tue, December 19, 2023
An independent oversight board says Meta's removal of posts related to the war in the Middle East could be eliminating evidence of human rights violations in the process (Kirill KUDRYAVTSEV)
Meta's independent oversight board on Tuesday criticized the social media titan for removing posts that showed human suffering in the Middle East conflict.
The board, set up by Meta in 2020 as a supreme court of sorts for the company, overturned two post-removal decisions and urged it to respond more quickly to changing circumstances in the war between Hamas and Israel.
One case involved the removal by Instagram of a video showing what appeared to be the aftermath of a strike on or near Al-Shifa Hospital in Gaza City during a ground offensive by Israel.
The video showed Palestinians, including children, injured or killed, according to the board.
A second case involved Facebook's decision to remove a video of an Israeli woman begging her kidnappers not to kill her as she is taken hostage during Hamas raids on Israel on October 7, the board said.
"These decisions were very difficult to make," oversight board co-chair Michael McConnell said in a release.
"The board focused on protecting the right to the freedom of expression of people on all sides about these horrific events, while ensuring that none of the testimonies incited violence or hatred."
The board urged Meta to preserve any removed posts that might contain evidence of human rights violations.
Meta told the board that it temporarily lowered thresholds for automatic removal of posts with potentially harmful content after the Hamas-led attack on Israel, according to the overseers.
The use of automated tools for content moderation at Facebook and Instagram increases the likelihood of removing posts showing the harsh reality of what is happening in the conflict, according to the board.
"These testimonies are important not just for the speakers, but for users around the world who are seeking timely and diverse information about ground-breaking events," McConnell said.
"Some of which could be important evidence of potential grave violations of international human rights and humanitarian law."
Content decisions by the oversight board are binding, but its recommendations are not, according to Meta.
The conflict between Israel and Hamas has claimed many lives and aroused intense emotions around the world.
Social networks have been flooded with violent imagery along with fabricated content intended to misinform, in a challenge to online platforms.
The European Union in October sent Meta a request for information about the dissemination of violent and terrorist content on its platforms.
Similar investigations are targeting TikTok, owned by China-based ByteDance, and X, formerly known as Twitter.
Meta Oversight Board overturns decisions on removal of Israel-Gaza videos
Clyde Hughes
Tue, December 19, 2023
People walk by a sign on the Meta campus in Menlo Park, California on October 28, 2022. Meta's Oversight Board overruled Facebook's automated system on two videos connected with the Israel-Hamas conflict on Tuesday. File Photo by Terry Schmitt/UPI
Dec. 19 (UPI) -- Meta's Oversight Board ruled on Tuesday, in its first expedited review, that the company's automated tools unnecessarily removed two videos related to the Israel-Hamas war from its social media platforms.
The Oversight Board, an independent organization that reviews Meta's content moderation decisions, overturned Meta's initial decisions to remove the posts from its platforms but approved its later moves to return the posts to the platforms with a warning screen.
The board said the first case involved a video posted to Facebook showing an Israeli woman begging her kidnappers not to kill her as she was taken hostage during the terrorist raids on Israel on Oct. 7. The second video posted to Instagram showed what appears to be the aftermath of a strike on or near Al-Shifa Hospital in Gaza City during Israel's ground offensive where Palestinians, including children, were killed or injured.
Meta said that because of an "exceptional surge" in violent and graphic content since Oct. 7, it temporarily lowered the confidence thresholds for the automatic classification systems that identify potentially violating content under its violence and incitement, hate speech, and bullying and harassment policies.
As a result, the board said, Meta used its automated tools "more aggressively" to remove content that may violate those policies.
"While this reduced the likelihood that Meta would fail to remove violating content that might otherwise evade detection or where capacity for human review was limited, it also increased the likelihood of Meta mistakenly removing non-violating content related to the conflict," the board said.
The Oversight Board said the Al-Shifa case specifically showed that "insufficient human oversight of automated moderation during crisis response" could lead to posts that may be of "significant public interest" being incorrectly removed.
"Both the initial decision to remove this content as well as the rejection of the user's appeal were taken automatically based on a classifier score, without any human review," the board said, noting the response may have been "exacerbated" by Meta's decision to lower the removal threshold following the Oct. 7 attacks.
Additionally, the board said that both cases led Meta to demote the content from being recommended to other Facebook and Instagram users "even though the company had determined that the posts intended to raise awareness."
In the hostage video case, the board agreed that Meta's "default approach" should be to protect the safety and dignity of hostages by removing such videos, but said the company was justified in its later decision to allow the content with a warning screen for the purposes of condemning the actions depicted, raising awareness, news reporting or calling for release.
"Indeed, given the fast-moving circumstances and the high costs to freedom of expression and access to information for removing this kind of content, Meta should have moved more quickly to adapt its policy," the board said.
The board noted that Meta began allowing hostage-taking content to be shared on its platforms on Oct. 20, but only from accounts on the company's "cross-check lists"; the allowance was later extended to all accounts on Nov. 16, applying to posts shared after that date.
It said, however, that the practice highlighted concerns about the cross-check program, "including the unequal treatment of users, lack of transparent criteria for inclusion and the need to ensure greater representation of users whose content is likely to be important from a human-rights perspective on Meta's cross-check lists."
"The use of the cross-check program in this way also contradicts how Meta has described and explained the purpose of the program, as a mistake prevention system and not a program that provides certain privileged users with more permissive rules," the Oversight Board said.
The case was the first expedited review taken by the board, and was completed in 12 days, less than half of the 30-day limit for a decision required by the expedited process.
Meta Oversight Board says Israel-Hamas videos should not have been removed
Tue, December 19, 2023
FILE PHOTO: The logo of Meta Platforms' business group
By Sheila Dang
(Reuters) - Meta Platforms' Oversight Board said on Tuesday that the social media company erred in removing two videos depicting hostages and injured people in the Israel-Hamas conflict, saying the videos were valuable to understanding human suffering in the war.
Since Hamas' attack in Israel on Oct. 7, social media platforms have seen renewed scrutiny over their content moderation practices due to a surge in misinformation and accusations that the companies have promoted certain viewpoints about the conflict.
The rulings mark the first time that the Oversight Board, an independent body that reviews content decisions on Meta's Facebook and Instagram, has examined cases on an expedited basis. The board announced the faster review process earlier this year to respond more quickly to urgent events.
One of the cases concerned a video posted on Instagram, which showed the aftermath of an airstrike near Al Shifa hospital in Gaza, including children who appeared injured or dead.
The second case involved a video of the Oct. 7 attack, which showed an Israeli woman begging her kidnappers not to kill her as she is taken hostage.
In both instances, after the Oversight Board selected the content takedowns for review, Meta reversed its decision and restored the videos with a screen that warned viewers before viewing, the board said.
The board said it approved of the move to restore the content, but disagreed with Meta's decision to restrict the videos from being recommended to users, and in a statement urged Meta to "respond more quickly to changing circumstances on the ground, which affect the balance between the values of voice and safety."
A Meta spokesperson said the company welcomed the board's decisions, adding no further action would be taken on the cases since the board did not make any policy recommendations.
(Reporting by Sheila Dang in Austin; Editing by Stephen Coates)
Meta's initial decisions to remove 2 videos of Israel-Hamas war reversed by Oversight Board
Associated Press
Updated Tue, December 19, 2023
Meta's logo is seen on a sign at the company's headquarters in Menlo Park, Calif., Nov. 9, 2022. A quasi-independent review board is recommending that Facebook parent company Meta overturn two decisions it made this fall to remove posts “informing the world about human suffering on both sides” of the Israel-Hamas war. (AP Photo/Godofredo A. Vásquez, File)
MENLO PARK, Calif. (AP) — A quasi-independent review board has ruled that Facebook parent company Meta should overturn two decisions it made this fall to remove posts “informing the world about human suffering on both sides” of the Israel-Hamas war.
In both cases, Meta ended up reinstating the posts — one showing Palestinian casualties and the other, an Israeli hostage — on its own, although it added warning screens to both due to violent content. This means the company isn't obligated to do anything about the board's decision.
That said, the board also said it disagrees with Meta's decision to bar the posts in question from being recommended by Facebook and Instagram, “even in cases where it had determined posts intended to raise awareness.” And it said Meta's use of automated tools to remove “potentially harmful” content increased the likelihood of taking down “valuable posts” that not only raise awareness about the conflict but may contain evidence of human rights violations. It urged the company to preserve such content.
The Oversight Board, established three years ago by Meta, issued its decisions Tuesday in what it said was its first expedited ruling — taking 12 days rather than the usual 90.
In one case, the board said, Instagram removed a video showing what appears to be the aftermath of a strike on or near Al-Shifa Hospital in Gaza City. The post shows Palestinians, including children, injured or killed. Meta's automated systems removed the post, saying it violated its rules against violent and graphic content. While Meta eventually reversed its decision, the board said, it placed a warning screen on the post and demoted it, which means it was not recommended to users and fewer people saw it. The board said it disagrees with the decision to demote the video.
The other case concerns video posted to Facebook of an Israeli woman begging her kidnappers not to kill her as she is taken hostage during the Hamas raids on Israel on Oct. 7.
Users appealed Meta's decision to remove the posts and the cases went to the Oversight Board. The board said it saw an almost three-fold increase in the daily average of appeals marked by users as related to the Middle East and North Africa region in the weeks following Oct. 7.
Meta said it welcomes the board's decision.
“Both expression and safety are important to us and the people who use our services. The board overturned Meta’s original decision to take this content down but approved of the subsequent decision to restore the content with a warning screen. Meta previously reinstated this content so no further action will be taken on it,” the company said. “There will be no further updates to this case, as the board did not make any recommendations as part of their decision.”
In a briefing on the cases, the board said Meta confirmed it had temporarily lowered thresholds for automated tools to detect and remove potentially violating content.
“While reducing the risk of harmful content, it also increased the likelihood of mistakenly removing valuable, non-violating content from its platforms,” the Oversight Board said, adding that as of Dec. 11, Meta had not restored the thresholds to pre-Oct. 7 levels.
Meta, then called Facebook, launched the Oversight Board in 2020 in response to criticism that it wasn’t moving fast enough to remove misinformation, hate speech and influence campaigns from its platforms. The board has 22 members, a multinational group that includes legal scholars, human rights experts and journalists.
The board’s rulings, such as in these two cases, are binding but its broader policy findings are advisory and Meta is not obligated to follow them.
Meta removed two Israel-Hamas videos unnecessarily, Oversight Board says
Clare Duffy, CNN
Tue, December 19, 2023
Facebook-parent Meta’s automated tools to police potentially harmful content unnecessarily removed two videos related to the Israel-Hamas war, the Meta Oversight Board said in a statement Tuesday. The moderation technology may have prevented users from viewing content related to human suffering on both sides of the conflict, it said.
The comments are the result of the Oversight Board’s first “expedited review,” highlighting the intense scrutiny facing social media companies over their handling of content related to the conflict.
The board overturned Meta’s original decision to remove two pieces of content. As part of the decision, the group urged Meta to respect users’ rights to “freedom of expression … and their ability to communicate in this crisis.”
“The Board focused on protecting the right to the freedom of expression of people on all sides about these horrific events, while ensuring that none of the testimonies incited violence or hatred,” Michael McConnell, a co-chair of the board, said in a statement. “These testimonies are important not just for the speakers, but for users around the world who are seeking timely and diverse information about ground-breaking events.”
In response to the board’s decision, Meta said that because it had already reinstated the two pieces of content prior to the board’s decision, it would take no further action. “Both expression and safety are important to us and the people who use our services,” the company said in a blog post.
Meta’s Oversight Board is an entity made up of experts in areas such as freedom of expression and human rights. It is often described as a kind of Supreme Court for Meta, as it allows users to appeal content decisions on the company’s platforms. The board makes recommendations to the company about how to handle certain content moderation decisions, as well as broader policy suggestions.
The board said earlier this month that it decided to take up a faster review in this case because content decisions related to the war could have “urgent real-world consequences.” In the weeks after the Israel-Hamas conflict broke out, the board said it saw a nearly three-fold increase in daily average user appeals of decisions on content “related to the Middle East and North Africa region.”
Meta told CNN in October that it had established “a special operations center staffed with experts, including fluent Hebrew and Arabic speakers, to closely monitor and respond to this rapidly evolving situation,” and that it was coordinating with third-party fact checkers in the region.
The Oversight Board said Tuesday that following the conflict’s outbreak, Meta put in place temporary measures to address potentially dangerous content, including lowering the thresholds for automatic removal of content that could violate its hate speech, violence and incitement, and bullying and harassment policies.
“In other words, Meta used its automated tools more aggressively to remove content that might be prohibited,” the board said, adding that the company took those steps to prioritize safety but that the move also “increased the likelihood of Meta mistakenly removing non-violating content related to the conflict.” The board said that as of December 11, Meta had not returned the content moderation thresholds for its automated systems to normal levels.
The board’s review looked at two pieces of content: a video posted to Instagram that appeared to show the aftermath of a strike outside the Al-Shifa Hospital in Gaza City and another video posted to Facebook showing two hostages being kidnapped by Hamas militants.
The first video appeared to show “people, including children, injured or dead, lying on the ground and/or crying.” A caption under the video in Arabic and English referenced the Israeli army, stating that the hospital had been “targeted by the ‘usurping occupation,’” the board said.
Meta’s automated systems initially removed the post for violating its rules on graphic and violent content. A user appealed the decision, asking for the video to be restored, but the appeal was automatically rejected by Meta’s systems after they determined with “a high confidence level” that the content violated its rules. After the board decided to take up the case, Meta made the video viewable with a warning that the content is disturbing; the warning also prevents the video from being viewed by minors and from being recommended to adult users.
The Oversight Board said Tuesday that the video should not have been removed in the first place, and criticized Meta’s move to limit the video’s circulation, saying it “does not accord with the company’s responsibilities to respect freedom of expression.”
The second video reviewed by the board showed a woman on a motorbike and a man being marched away by Hamas militants, with a caption urging people to watch to gain a “deeper understanding” of the October 7 attack on Israel.
Meta initially removed the post for violating its dangerous organizations and individuals policy, which prohibits imagery of terror attacks on visible victims even if shared to raise awareness of such an attack. (Meta designates Hamas as a dangerous organization under its policy and labeled the October 7 attack as a terrorist attack.)
The company reinstated the video with a warning screen after the board took up the case, part of a larger move to allow limited exemptions to its dangerous organizations and individuals policy in cases where content was meant to condemn, raise awareness of or report on the kidnappings, or to call for the release of the hostages. As with the other video, the warning screen restricted the video’s visibility for minors and prevented it from being recommended to other Facebook users.
Here, too, the board said the content should not have been removed, adding that preventing the video from being recommended puts Meta out of compliance with its human rights responsibilities.
Meta said Tuesday that it would not change its limits on recommending both videos reviewed by the board, because the board disagreed with the limits but did not make a formal recommendation about how they should be handled.
“The Board finds that excluding content raising awareness of potential human-rights abuses, conflicts, or acts of terrorism from recommendations is not a necessary or proportionate restriction on freedom of expression, in view of the very high public interest in such content,” it said in its decision.