Facebook AI catches 95% of hate speech; company still wants mods back in office
Facebook wants butts in seats to enforce rules. Workers want not to get COVID.
KATE COX - 11/19/2020
Facebook's Menlo Park, California, headquarters as seen in 2017.
Facebook's software systems get ever better at detecting and blocking hate speech on both the Facebook and Instagram platforms, the company boasted today—but the hardest work still has to be done by people, and many of those people warn that the world's biggest social media company is putting them in unsafe working conditions.
About 95 percent of hate speech on Facebook gets caught by algorithms before anyone can report it, Facebook said in its latest community-standards enforcement report. The remaining 5 percent of the roughly 22 million flagged posts in the past quarter were reported by users.
That report also tracks a new hate-speech metric: prevalence. To measure it, Facebook takes a sample of content views and calculates how often the content being measured, in this case hate speech, appears as a percentage of all viewed content. Between July and September of this year, the figure was between 0.10 percent and 0.11 percent, or about 10 to 11 views of hate speech out of every 10,000 content views.
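To make the arithmetic concrete, here is a minimal sketch of how such a view-weighted figure could be computed. The function, data, and labels below are hypothetical illustrations, not Facebook's published methodology:

    # Hypothetical sketch of a view-weighted "prevalence" metric: the share of
    # sampled content views (not posts) that landed on hate speech. All names
    # and numbers here are invented for illustration.
    def prevalence(sampled_views):
        """Return the fraction of sampled views labeled as hate speech."""
        if not sampled_views:
            return 0.0
        # Each element is one sampled content view; True means reviewers
        # labeled the viewed content as hate speech.
        return sum(sampled_views) / len(sampled_views)

    # 10,000 sampled views, 11 of which were views of hate speech
    sample = [True] * 11 + [False] * 9_989
    print(f"{prevalence(sample):.2%}")  # 0.11%, about 11 of every 10,000 views

At that rate, an average user would scroll past roughly a thousand pieces of content before encountering one piece of hate speech.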
Facebook also stressed—in both its news release and in a call with press—that while its in-house AI is making strides in several categories of content enforcement, COVID-19 is having a continued effect on its ability to moderate content.
"While the COVID-19 pandemic continues to disrupt our content-review workforce, we are seeing some enforcement metrics return to pre-pandemic levels," the company said. "Even with a reduced review capacity, we still prioritize the most sensitive content for people to review, which includes areas like suicide and self-injury and child nudity."Advertisement
Secondhand workforce
The reviewers are critical, Facebook Vice President of Integrity Guy Rosen told press in a call. "People are an important part of the equation for content enforcement," he said. "These are incredibly important workers who do an incredibly important part of the job."
Full-time employees of Facebook itself, by contrast, are being told to work from home until July 2021, or perhaps even permanently.
In the call with reporters, Rosen stressed that Facebook employees who must come in to work physically, such as those who manage essential functions in data centers, are brought in under strict safety precautions, with protective supplies such as hand sanitizer made available.
Moderation, Rosen said, is one of those jobs that can't always be done at home. Some content is simply too sensitive to review anywhere but a dedicated workspace, he explained, since family members might otherwise see it. Some Facebook content moderators are therefore being brought back into offices "to ensure we can have that balance of people and AI working on those areas" that need human judgment applied.
The majority of Facebook's content moderators, however, do not work for Facebook. They work for third-party contract firms worldwide, often with woefully insufficient support to do their jobs. Reporters from The Guardian, The Verge, The Washington Post, and BuzzFeed News, among others, have spoken to these contract workers around the world, who describe relentless expectations and widespread trauma at work. Earlier this year, Facebook agreed to a $52 million settlement in a class-action suit filed by former content moderators who alleged the job gave them "debilitating" post-traumatic stress disorder.
All of that was before COVID-19 spread around the world. In the face of the pandemic, the situation looks even worse. More than 200 moderators have signed an open letter accusing Facebook of "needlessly risking moderators' lives" by ordering them back into the office without even providing hazard pay.
"Now, on top of work that is psychologically toxic, holding onto the job means walking into a hot zone," the letter reads. "In several offices, multiple COVID cases have occurred on the floor. Workers have asked Facebook leadership, and the leadership of your outsourcing firms like Accenture and CPL, to take urgent steps to protect us and value our work. You refused. We are publishing this letter because we are left with no choice."
"This raises a stark question," the letter adds. "If our work is so core to Facebook's business that you will ask us to risk our lives in the name of Facebook's community—and profit—are we not, in fact, the heart of your company?"
Scrutiny grows
Meanwhile, state and federal scrutiny of Facebook only keeps growing. This week, company CEO Mark Zuckerberg testified before the Senate for the second time in just three weeks. Members of the House are also complaining that Facebook has failed to moderate content properly or safely amid rampant election-related disinformation.
Other regulatory bodies are likely coming for Facebook—and soon. Many of the antitrust investigations that began in 2019 are drawing to a conclusion, according to media reports. The Federal Trade Commission is reportedly planning to file a suit within the next two weeks, and a coalition of nearly 40 states, led by New York Attorney General Letitia James, is likely to follow in December. Those suits are likely to argue that Facebook unfairly stifles competition through its acquisition and data strategies, and they may seek to force the company to divest Instagram and WhatsApp.
KATE COX covers tech policy issues, including privacy, antitrust, and other shenanigans, from Washington, DC.