Wednesday, February 03, 2021

After Fueling A Genocide, Facebook Is Taking A Stand Against A Myanmar Coup

In an internal post, the company outlined how it will try to protect people opposing Myanmar’s military coup

Posted on February 2, 2021

SOPA Images / LightRocket via Getty Images
A soldier stands on guard outside a Hindu temple in Yangon, Myanmar.

After failing to stem the hate speech and misinformation that fueled a genocide in Myanmar, Facebook now says it plans to take proactive content moderation steps following the military coup in the country.

In an internal message posted late on Monday and viewed by BuzzFeed News, Rafael Frankel, a director of public policy in the Asia-Pacific region, told employees that the social network was watching the “volatile situation” in Myanmar “with grave concern” and outlined a series of measures to crack down on people who used it to spread misinformation or threaten violence.

As part of those measures, Facebook has designated Myanmar as a “Temporary High-Risk Location” for two weeks, allowing the company to remove content and events in the country that include “any calls to bring armaments.” The social network previously applied that designation to Washington, DC, following the insurrection at the US Capitol on Jan. 6.

The social network, which had touted its efforts to protect the integrity of Myanmar’s national elections in November, also said it would protect posts that criticized the military and its coup, and would track reports of pages and accounts being hacked or taken over by the military.

“Myanmar’s November election was an important moment in the country’s transition toward democracy, although it was not without its challenges, as highlighted by international human rights groups,” Frankel wrote. “This turn of events hearkens us to days we hoped were in Myanmar’s past and reminds us of fundamental rights that should never be taken for granted.”


Facebook’s moves come after General Min Aung Hlaing, the head of Myanmar’s military, took control of the country’s government and detained its elected leader Aung San Suu Kyi and other members of her National League for Democracy (NLD) party on Monday. Following the election in which the NLD won a majority of seats in Myanmar’s parliament, military-backed opposition groups called the results fraudulent and demanded a revote.

On Tuesday, the US State Department officially designated the military’s takeover in Myanmar as a coup, triggering financial sanctions.

“After a review of all the facts, we have assessed that the Burmese military’s actions on February 1st, having deposed the duly elected head of government, constituted a military coup d’état,” a State Department official said in a briefing, employing the name the US government uses to refer to the country.

In a statement to BuzzFeed News, Facebook confirmed the actions it outlined in Frankel’s post and said it would be removing content that praises or supports the coup.

“We’re putting the safety of people in Myanmar first and are removing content that breaks our rules on violence, hate speech and harmful misinformation,” Frankel said. “This includes removing misinformation that delegitimizes the outcome of November’s election.”

Facebook is taking action in a country where it has previously faced international condemnation for its handling of the displacement and genocide of Rohingya Muslims that began in 2016. In 2018, United Nations investigators found that senior military officials in Myanmar had used Facebook, which did not have content moderators in the country, to foment fear and spread hate speech.

The “extent to which Facebook posts and messages have led to real-world discrimination must be independently and thoroughly investigated,” the UN investigators concluded in their report.

In Monday’s post, Frankel said Facebook was using “a number of product interventions that were used in the past in Myanmar and during the US elections, to ensure the platform isn’t being used to spread misinformation, incite violence, or coordinate harm.”

The company is working to secure the accounts of activists and journalists “who are at risk or who have been arrested” and removing content that threatens or calls for violence against them, Frankel wrote. The company will also protect “critical information about what’s happening on the ground,” given the restrictions imposed on news outlets in the country.

Facebook’s work is an ongoing effort. Late on Monday, it removed a page for Myanmar’s military television network following inquiries from the Wall Street Journal. While the company had banned a page for the Myawaddy television network in 2018 during a crackdown on hundreds of accounts tied to Myanmar’s military, a new page had since appeared and garnered 33,000 likes.

Facebook has frequently come under fire for facilitating the growth of violent and extremist groups and its ineffectiveness in stemming misinformation. Most recently, a tech watchdog group accused the company of fomenting the unrest that led to the deadly attempted coup in the United States.

“[Facebook] has spent the past year failing to remove extremist activity and election-related conspiracy theories stoked by President Trump that have radicalized a broad swath of the population and led many down a dangerous path,” the Tech Transparency Project (TTP) said in a report.

The report uncovered specific threats made in pro-Trump and militant groups on Facebook both before and after Joe Biden’s election victory in November.





Tasneem Nashrulla is a reporter for BuzzFeed News and is based in New York.


Ryan Mac is a senior tech reporter for BuzzFeed News and is based in San Francisco.

