Facebook’s Preferential Treatment Of US Conservatives Puts Its Fact-Checking Program In Danger
Facebook’s employees and fact-checking partners say they are left in the dark about how the company decides what content stays up and what comes down.

Posted on August 13, 2020


On May 8, Prager University, a nonprofit conservative media outlet, published a video on Facebook that incorrectly claimed “there is no evidence that CO2 emissions are the dominant factor” in climate change. Within days, Climate Feedback, a nonpartisan network of scientists and a member of Facebook’s global fact-checking partnership, rated the content as false — a designation that was supposed to result in serious consequences.

It was PragerU’s second strike for false content that month, which under Facebook’s own policies should have triggered “repeat offender” penalties, including the revocation of advertising privileges and the possibility of having its page deleted.

But it didn’t. As first reported by BuzzFeed News last week, a Facebook employee intervened on PragerU’s behalf and asked for a reexamination of the judgment, citing “partner sensitivity” and the amount of money the organization had spent on ads. Eventually, while the false labels on PragerU’s posts remained, Facebook quietly removed the strikes from its internal records, and no one — not the public, the fact-checkers, or Facebook’s own employees — was informed of the decision.

Meanwhile, PragerU cashed in on the fact-checks of its climate misinformation. On May 19 — the day after it received its first strike for false content — it launched a fundraiser. “Facebook is using biased 3rd party fact-checkers to flag content and censor conservatives,” the organization told its more than 4.2 million followers. “Is Facebook now the arbiter of truth?”

The campaign raised $21,637.

Since at least late 2016, CEO Mark Zuckerberg has defended Facebook by insisting it should not be “an arbiter of truth,” while creating a third-party fact-checking program to fill that umpire role. But journalists and researchers at the dozens of organizations that make up Facebook’s fact-checking operation say the company often acts as exactly that. Some told BuzzFeed News they were surprised to learn their verdicts had been ignored or overruled by Facebook in a closed-door process with little transparency, and warned that this risks undermining the program’s credibility.

“They are the arbiters of the consequences for publishing false or misleading information,” said one fact-checker who asked not to be named for fear of repercussions from Facebook.

“If people want to evade consequences, it’s easy to do it,” they added.


Some employees at the social network agreed. Last week, after BuzzFeed News revealed Facebook executives and staff were intervening in fact-check disputes on behalf of right-wing publishers, workers wondered if the company was caving to loud critics and political pressure.

“Mark likes to say how Facebook should not be ‘the arbiter of truth,’” one person wrote in an internal Facebook employee group. “But escalations... that are focused on partners who are ‘sensitive’ sound exactly like us meddling in this area. How do we reconcile this contradiction?”

Evelyn Douek, a lecturer at Harvard Law School, said that even though Facebook doesn’t want to be in the business of declaring what is true and false, it still makes a lot of choices in how it structures its policies and fact-checking program that leave it “in the driver seat.”

“There will be a pretty big reckoning around fact-checking,” she said. “People don’t really understand it either and they see it as a panacea for problems on social media platforms.”

Facebook did not answer a list of specific questions about the preferential treatment of right-wing pages that had received misinformation strikes. Previously, the company said that while it “defer[s] to third-party fact-checkers on the rating that a piece of content receives,” it alone is “responsible for how we manage our internal system for repeat offenders.”

"Facebook is the only company to partner with more than 70 fact-checking organizations worldwide to fight the spread of viral misinformation,” Facebook spokesperson Liz Bourgeois said in a statement for this story. “There is no playbook for a program like ours and we’re constantly working to improve it based on feedback from our partners and what we see on our platform.”








“The Arbiter of Truth”

Four days after the 2016 US presidential election, Zuckerberg posted a mea culpa to Facebook addressing mounting evidence that misinformation spread and amplified on the platform might have influenced the result. In the post, the Facebook CEO — who had previously called the notion that misinformation on the social network influenced the election a “pretty crazy idea” — acknowledged that hoaxes and fake news did exist on the platform.

“This is an area where I believe we must proceed very carefully though. Identifying the ‘truth’ is complicated,” he wrote. “While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted. … I believe we must be extremely cautious about becoming arbiters of truth ourselves.”

Facebook launched its third-party fact-checking program a month after Zuckerberg’s post, and its CEO’s sentiment has become a common refrain from the social network’s executives. Debates over whether Facebook is a media company responsible for the content on its platform have shifted to whether the company should play a role in identifying — and subsequently removing — misinformation.

Now, as the US heads toward another contentious presidential election, Zuckerberg is again rolling out the “arbiter of truth” talking point. “I just believe strongly that Facebook shouldn’t be the arbiter of truth of everything that people say online,” he told Fox News in May. “Private companies probably shouldn’t be, especially these platform companies, shouldn’t be in the position of doing that.”

"They’re saying, ‘We don’t want to be the arbiters of truth,’ but they do want to be the arbiter of whether pages are taken down or not."

But internal discussions and documents recently obtained by BuzzFeed News and NBC News revealed that Facebook has been doing just that: deciding whether to penalize pages after they’ve received strikes for misinformation and, at times, casting aside judgments made by its fact-checking partners, seemingly for fear of political blowback or lost revenue. Unlike fact-checkers, Facebook does not disclose the reasons and evidence behind its decisions to remove misinformation strikes or otherwise overrule its fact-checking partners.

“The problem with what they’re doing is they’re saying, ‘We don’t want to be the arbiters of truth,’ but they do want to be the arbiter of whether pages are taken down or not,” said Peter Cunliffe-Jones, the founder of Africa Check, who now serves as an adviser to the International Fact-Checking Network. “And there is no transparency about that decision making process.”

Cunliffe-Jones suggested Facebook take a page from its fact-checkers and publish its rationale for removing misinformation strikes and for other reversals, lest it undermine trust in the entire process.

“I do understand that even Facebook can’t provide granular detail on every decision made across millions of decisions, but on the significant ones they can,” he said. “I think it would help to develop trust in the process. And if the process [at Facebook] is not trustworthy, that would expose it.”

The lack of transparency to which Cunliffe-Jones referred has roiled the social network’s own employees. Following BuzzFeed News’ story that revealed preferential treatment for conservative pages and personal interventions from executives including vice president of global public policy Joel Kaplan, an internal message board called “Let’s Fix Facebook (the company),” which has about 10,000 members, exploded in conversation.

“The silence on this is unacceptable. Both the pipeline intervention and the firing,” wrote one employee, referring to the removals of misinformation strikes and the dismissal of an engineer who gathered and revealed evidence of it.

“Can we also use this as an opportunity to be more transparent about fact-checking in general?” asked another employee. “Not just internally but also to our users? How escalations and appeals work, who can do them, who is doing them, aggregated statistics about posts labeled...”


The lack of transparency has also upset those being penalized for misinformation rulings, including the executives at PragerU. Craig Strazzeri, PragerU’s chief marketing officer, said his organization appealed the fact-check decisions to Climate Feedback in May but was not aware of any Facebook employees working on its behalf to get the strikes removed.

“If someone at Facebook is trying to help us, they’re not doing a very good job,” Strazzeri said, noting that PragerU’s organic reach was cut after the two “false” ratings. PragerU was still able to run ads during that period, according to Strazzeri, who noted that the organization’s page is once again under scrutiny after receiving two more misinformation strikes since May.

Emmanuel Vincent, the director of Science Feedback, the not-for-profit organization that runs Climate Feedback, said PragerU disputed only one of the two May verdicts, a rating Science Feedback upheld as false. He’s concerned that PragerU has found a way to use fact-checks to its advantage.

"Each time they get a strike, they can raise money,” he said.


Bloomberg / Getty Images
Mark Zuckerberg speaks via videoconference during a House Judiciary Subcommittee hearing in Washington, DC, July 29.


Repeat Offenders

The attention Facebook pays to concerns from American conservatives on its platform is unlike that given to any other constituency in the world, according to the fact-checkers who spoke to BuzzFeed News. Some worry that focus is detrimental to international efforts to police the platform for misinformation. Cunliffe-Jones told BuzzFeed News that Facebook has failed to act against repeat-offender financial scam pages in countries like Kenya.

“It’s been clear from quite early on that there’s a problem with the way that some pages that are repeat offenders seem to keep on operating,” Cunliffe-Jones said. “I’ve noted this from pages that have been repeatedly fact-checked [in Kenya]. With these pages, I don’t think there’s a political game. I think it’s a lack of attention — and that’s also harmful.”

Other international fact-checkers said Facebook’s emphasis on the US election causes it to be less responsive to issues in other parts of the world. One journalist at a fact-checking outlet in a G7 country said Facebook’s lack of local staff means the company at times asks its fact-checking partners to help with decisions about how to apply company policy.

“They're not as well-staffed in some of the other markets outside the US. So they actually do lean on the fact-checkers a little bit more about some of the policy things,” said the journalist.


Bourgeois said Facebook values the expertise of local checkers. “We frequently speak to fact-checkers in different regions to understand how misinformation is spreading in their countries and languages. This is not in lieu of relying on the expertise of our own teams, it’s in addition.”

"...there’s not much bandwidth for anything outside the US at the moment at Facebook."

Agence France-Presse has fact-checkers in more than 30 countries, making it Facebook’s biggest partner in the program. Phil Chetwynd, the global news director of AFP, said his organization has not felt pressure from Facebook about its ratings, though anything having to do with the US is given a different level of scrutiny by the company.

“You can feel there’s not much bandwidth for anything outside the US at the moment at Facebook,” he said. “There’s just a tremendous concentration of resources on the US.”

Chetwynd said one of the values of participating in the fact-checking program for organizations like AFP is knowing that repeat offenders face consequences. If Facebook fails to enforce its policies, it reduces the value of fact-checkers’ work. And the lack of transparency around the internal decision-making process for removing misinformation strikes only makes things worse.

“Most fact-checkers have developed a very constructive relationship with Facebook, but this issue around transparency has always been a point of frustration,” he said. “It is something we have voiced to them many times both privately and publicly — we do not get nearly enough precise data on the impact of our fact-checks on the platform. Our feeling is that making this data public would only add to the credibility of our work and the fact-checking program in general.”


The Opinion Exemption

Over the past two months, Facebook and its fact-checking partners have been discussing one of the more contentious issues in their partnership: the so-called opinion exemption. Facebook’s policy is that opinion articles, like statements from political leaders, are exempt from review.

Internal documents obtained by BuzzFeed News show that Facebook employees have cited this policy as justification for removing misinformation strikes from PragerU and pro-Trump activists Diamond and Silk after fact-checkers issued them. A Facebook partner manager for PragerU said the false claims the publication made about the climate crisis should fall under the “opinion loophole.”

On Tuesday, Facebook seemingly closed this loophole, explaining in a policy update that if “content is presented as opinion but is based on underlying false information - even if it’s an op-ed or editorial - it’s still eligible to be fact-checked.”

"It’s not the best thing for a fact-checker to learn their flag has been removed by Facebook from the media.”


Baybars Örsek, director of the International Fact-Checking Network, said the policy update will help ensure that misinformation can’t be embedded in op-ed pieces. But he noted that Facebook can still decide whether a piece of fact-checked content counts as opinion, and that the company can remove misinformation strikes without informing a fact-checker of its decision. That lack of transparency, he said, is a problem.

“At the end of the day, it’s not the best thing for a fact-checker to learn their flag has been removed by Facebook from the media,” he said.

Other fact-checkers echoed Örsek’s concerns. One said they wished Facebook had to publicly disclose when it removes a misinformation strike. Such an obligation would make it difficult for the company to give preferential treatment to “the partners who pay a lot of money to Facebook, or the partners who could say something and be heard at the very high levels of the company.”

That appeared to be the case with PragerU, which has managed to avoid severe punishment — and to monetize the fact-checks against it.

PragerU has raised at least $400,000 from Facebook fundraisers protesting alleged censorship on Facebook and YouTube since late 2018, according to a review by BuzzFeed News. On Aug. 5, it launched a fundraiser on Facebook in response to the company’s removal of a PragerU post featuring a now-infamous video initially circulated by Breitbart News of a group of doctors making false and potentially harmful claims about COVID-19.

It raised $66,844. ●



Craig Silverman is a media editor for BuzzFeed News and is based in Toronto.

Ryan Mac is a senior tech reporter for BuzzFeed News and is based in San Francisco.

