Wednesday, June 23, 2021


Rights group: Facebook amplified Myanmar military propaganda





FILE - In this March 29, 2018, file photo, the logo for Facebook appears on screens at the Nasdaq MarketSite in New York's Times Square. (AP Photo/Richard Drew, File)


BARBARA ORTUTAY
Tue, June 22, 2021

Facebook's recommendation algorithm amplifies military propaganda and other material that breaches the company's own policies in Myanmar, even though the social media giant says it is treating the situation there as an emergency following February's military coup, a new report by the rights group Global Witness says.

A month after the military seized power in Myanmar and imprisoned elected leaders, Facebook's algorithms were still prompting users to view and “like” pro-military pages with posts that incited and threatened violence, pushed misinformation that could lead to physical harm, praised the military and glorified its abuses, Global Witness said in the report, published late Tuesday.

That's despite the company's vow to remove such content following the coup, when it announced it would ban Myanmar military and military-controlled pages from its site and from Instagram, which it also owns. It has since enacted other measures intended to reduce offline harm in the country.

Facebook said Tuesday its teams “continue to closely monitor the situation in Myanmar in real-time and take action on any posts, Pages or Groups that break our rules.”

Days after the Feb. 1 coup, the military temporarily blocked access to Facebook because it was being used to share anti-coup comments and organize protests. Access was later restored. In the following weeks, Facebook continued to tighten its policies against the military, banning all military entities from its platforms and saying it would remove content praising or supporting violence against civilians or their arrest.

“Once again, Facebook shows that it’s good at making broad sweeping announcements and bad at actually enforcing them. They’ve had years to improve their work in Myanmar but once again they are still failing,” said Sophie Zhang, a former Facebook data scientist and whistleblower who found evidence of political manipulation in countries such as Honduras and Azerbaijan while she worked there.

The struggle between the military regime that deposed Aung San Suu Kyi’s elected government and those opposing it has sharpened in recent months.

Soldiers and police have killed hundreds of protesters. Last week, the United Nations’ office in Myanmar expressed concern about escalating human rights abuses after reports that a group opposed to the junta may have executed 25 civilians it captured and allegations that troops had burned down a village.

Myanmar, also known as Burma, had over 22.3 million Facebook users in January 2020, more than 40% of its population, according to social media management platform NapoleonCat.

“What happens on Facebook matters everywhere, but in Myanmar that is doubly true,” the report says. As in many countries outside the Western Hemisphere, mobile phones in Myanmar often come pre-loaded with Facebook, and many businesses do not have a website, only a Facebook page. For many people in the country, Facebook effectively is the internet.

On March 23, just before the peak of military violence against civilians, Global Witness said it set up a new, clean Facebook account with no history of liking or following specific topics and searched for “Tatmadaw,” the Burmese name for the armed forces. It filtered the search results to show pages, and selected the top result — a military fan page whose name translates as “a gathering of military lovers.”

Older posts on this page showed sympathy for Myanmar's soldiers and at least two advertised for young people to join the military — but none of the newer posts since the coup violated Facebook's policies. However, when Global Witness's account “liked” the page, Facebook began recommending related pages with material inciting violence, false claims of interference in last year's election and support of violence against civilians.

A March 1 post, for instance, includes a death threat against protesters who vandalize surveillance cameras.

“Those who threaten female police officers from the traffic control office and violently destroy the glass and destroy CCTV, those who cut the cables, those who vandalize with color sprays, (we) have been given an order to shoot to kill them on the spot,” reads part of the post in translation, according to the report. “Saying this before Tatmadaw starts doing this. If you don’t believe and continue to do this, go ahead. If you are not afraid to die, keep going.”

Facebook said its ban of the Tatmadaw and other measures have “made it harder for people to misuse our services to spread harm. This is a highly adversarial issue and we continue to take action on content that violates our policies to help keep people safe.”

Global Witness said its findings show that Facebook fails to uphold the “very basics” of its own guidelines.

“The platform operates too much like a walled garden, its algorithms are designed, trained, and tweaked without adequate oversight or regulation,” said Naomi Hirst, head of the digital threats campaign at Global Witness. “This secrecy has to end, Facebook must be made accountable.”
