

‘Disinfo kills’: protesters demand Facebook act to stop vaccine falsehoods

Activists place body bags at company’s Washington HQ

False information on vaccines shared widely on network


Protesters are urging Facebook’s shareholders to ban misinformation ‘superspreaders’. Photograph: Eric Kayne/AP

Kari Paul in San Francisco
Wed 28 Jul 2021 

Activists descended on Facebook’s Washington headquarters on Wednesday to demand the company take stronger action against vaccine falsehoods spreading on its platform, covering the area in front of Facebook’s office with body bags that read “disinfo kills”.



The day of protest, which comes as Covid cases surge in the US, has been organized by a group of scholars, advocates and activists calling themselves the Real Facebook Oversight Board. The group is urging Facebook’s shareholders to ban so-called misinformation “superspreaders” – the small number of accounts responsible for the majority of false and misleading content about the Covid-19 vaccines.

“People are making decisions based on the disinformation that’s being spread on Facebook,” said Shireen Mitchell, a member of the Real Facebook Oversight Board and founder of Stop Online Violence Against Women. “If Facebook is not going to take that down, or if all they’re going to do is put out disclaimers, then fundamentally Facebook is participating in these deaths as well.”

In coordination with the protest, the group has released a new report analyzing the spread of anti-vaccine misinformation on Facebook during the company’s most recent financial quarter. The report and protest also come as Facebook prepares to announce its earnings for that same quarter.

The report references a March study from the Center for Countering Digital Hate (CCDH) that found a small group of accounts – known as the “disinformation dozen” – is responsible for more than 73% of anti-vaccine content across social media platforms, including Facebook. That report recently drew attention from the White House, and Joe Biden has condemned Facebook and other tech companies for failing to take action.

Facebook banned misinformation about vaccines from the platform in February 2021, but critics say many posts slip through its filters and reach audiences of millions without being removed.

At Facebook’s Washington DC headquarters, activists lay body bags that read “disinfo kills”. Photograph: Eric Kayne/AP

It has also introduced a number of rules relating to Covid-19 specifically, banning posts that question the severity of the disease, deny its existence, or argue that the vaccine is riskier than the virus itself. Still, the Real Oversight Board found that such content has often been able to remain on the platform and even make its way into the most-shared posts.

According to the Real Oversight Board’s report, a large share of the misinformation about Covid vaccines comes from a few prolific accounts and continues to be among the platform’s best-performing and most widely shared content. The group analyzed the top 10 posts on each weekday over the last quarter and found that the majority originated from just five identified “superspreaders” of misinformation.

“When it comes to Covid disinformation, the vast majority of content comes from an extremely small group of highly visible users, making it far easier to combat it than Facebook admits,” the board said, concluding that Facebook is “continuing to profit from hate and deadly disinformation”.

The group has called on Facebook to remove the users from the platform or alter its algorithm to disable engagement with the offending accounts. Facebook did not immediately respond to a request for comment, but has said in the past that it has removed more than 18m pieces of Covid misinformation.

Congress has also taken note of the spread of vaccine misinformation on Facebook and other platforms, with the Democratic senator Amy Klobuchar introducing a bill that would target platforms whose algorithms promote health misinformation related to an “existing public health emergency”.

The bill, called the Health Misinformation Act, would remove the protections of Section 230 – the internet law that shields platforms from being sued over content posted by their users – in such cases.

“For far too long, online platforms have not done enough to protect the health of Americans,” Klobuchar said in a statement on the bill. “These are some of the biggest, richest companies in the world, and they must do more to prevent the spread of deadly vaccine misinformation.”
