By ALAN SUDERMAN and JOSHUA GOODMAN
WASHINGTON (AP) — As supporters of Donald Trump stormed the U.S. Capitol on Jan. 6, battling police and forcing lawmakers into hiding, an insurrection of a different kind was taking place inside the world’s largest social media company.
Thousands of miles away, in California, Facebook engineers were racing to tweak internal controls to slow the spread of misinformation and inciteful content. Emergency actions — some of which were rolled back after the 2020 election — included banning Trump, freezing comments in groups with a record for hate speech, filtering out the “Stop the Steal” rallying cry and empowering content moderators to act more assertively by labeling the U.S. a “Temporary High Risk Location” for political violence.
At the same time, frustration inside Facebook erupted over what some saw as the company’s halting and often reversed response to rising extremism in the U.S.
“Haven’t we had enough time to figure out how to manage discourse without enabling violence?” one employee wrote on an internal message board at the height of the Jan. 6 turmoil. “We’ve been fueling this fire for a long time and we shouldn’t be surprised it’s now out of control.”
It’s a question that still hangs over the company today, as Congress and regulators investigate Facebook’s part in the Jan. 6 riots.
New internal documents provided by former Facebook employee-turned-whistleblower Frances Haugen provide a rare glimpse into how the company appears to have simply stumbled into the Jan. 6 riot. It quickly became clear that even after years under the microscope for insufficiently policing its platform, the social network had missed how riot participants spent weeks vowing — on Facebook itself — to stop Congress from certifying Joe Biden’s election victory.
The documents also appear to bolster Haugen’s claim that Facebook put its growth and profits ahead of public safety, opening the clearest window yet into how Facebook’s conflicting impulses — to safeguard its business and protect democracy — clashed in the days and weeks leading up to the attempted Jan. 6 coup.
This story is based in part on disclosures Haugen made to the Securities and Exchange Commission and provided to Congress in redacted form by Haugen’s legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.
What Facebook called “Break the Glass” emergency measures put in place on Jan. 6 were essentially a toolkit of options designed to stem the spread of dangerous or violent content that the social network had first used in the run-up to the bitter 2020 election. As many as 22 of those measures were rolled back at some point after the election, according to an internal spreadsheet analyzing the company’s response.
“As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety,” Haugen said in an interview with “60 Minutes.”
An internal Facebook report following Jan. 6, previously reported by BuzzFeed, faulted the company for having a “piecemeal” approach to the rapid growth of “Stop the Steal” pages, related misinformation sources, and violent and inciteful comments.
Facebook says the situation is more nuanced and that it carefully calibrates its controls to react quickly to spikes in hateful and violent content, as it did on Jan. 6. The company said it’s not responsible for the actions of the rioters and that having stricter controls in place prior to that day wouldn’t have helped.
Facebook’s decisions to phase certain safety measures in or out took into account signals from the Facebook platform as well as information from law enforcement, said spokeswoman Dani Lever. “When those signals changed, so did the measures.”
Lever said some of the measures stayed in place well into February and others remain active today.
Some employees were unhappy with Facebook's handling of problematic content even before the Jan. 6 riots. One employee who departed the company in 2020 left a long note charging that promising new tools, backed by strong research, were being constrained by Facebook for “fears of public and policy stakeholder responses” (translation: concerns about negative reactions from Trump allies and investors).
“Similarly (though even more concerning), I’ve seen already built & functioning safeguards being rolled back for the same reasons,” wrote the employee, whose name is blacked out.
Research conducted by Facebook well before the 2020 campaign left little doubt that its algorithm could pose a serious danger of spreading misinformation and potentially radicalizing users.
One 2019 study, titled “Carol’s Journey to QAnon—A Test User Study of Misinfo & Polarization Risks Encountered through Recommendation Systems,” described results of an experiment conducted with a test account established to reflect the views of a prototypical “strong conservative” — but not extremist — 41-year-old North Carolina woman. This test account, using the fake name Carol Smith, indicated a preference for mainstream news sources like Fox News, followed humor groups that mocked liberals, embraced Christianity and was a fan of Melania Trump.
Within a single day, page recommendations for this account generated by Facebook itself had evolved to a “quite troubling, polarizing state,” the study found. By day 2, the algorithm was recommending more extremist content, including a QAnon-linked group, which the fake user didn’t join because she wasn’t innately drawn to conspiracy theories.
A week later the test subject’s feed featured “a barrage of extreme, conspiratorial and graphic content,” including posts reviving the false Obama birther lie and linking the Clintons to the murder of a former Arkansas state senator. Much of the content was pushed by dubious groups run from abroad or by administrators with a track record for violating Facebook’s rules on bot activity.
Those results led the researcher, whose name was redacted by the whistleblower, to recommend safety measures ranging from removing content with known conspiracy references and disabling “top contributor” badges for misinformation commenters to lowering the threshold number of followers required before Facebook verifies a page administrator’s identity.
Among the Facebook employees who read the research, the response was almost universally supportive.
“Hey! This is such a thorough and well-outlined (and disturbing) study,” one user wrote, their name blacked out by the whistleblower. “Do you know of any concrete changes that came out of this?”
Facebook said the study was one of many examples of its commitment to continually studying and improving its platform.
Another study turned over to congressional investigators, titled “Understanding the Dangers of Harmful Topic Communities,” discussed how like-minded individuals embracing a borderline topic or identity can form “echo chambers” for misinformation that normalizes harmful attitudes, spurs radicalization and can even provide a justification for violence.
Examples of such harmful communities include QAnon and hate groups promoting theories of a race war.
“The risk of offline violence or harm becomes more likely when like-minded individuals come together and support one another to act,” the study concludes.
Charging documents filed by federal prosecutors against those alleged to have stormed the Capitol have examples of such like-minded people coming together.
Prosecutors say a reputed leader in the Oath Keepers militia group used Facebook to discuss forming an “alliance” and coordinating plans with another extremist group, the Proud Boys, ahead of the riot at the Capitol.
“We have decided to work together and shut this s—t down,” Kelly Meggs, described by authorities as the leader of the Florida chapter of the Oath Keepers, wrote on Facebook, according to court records.
The whistleblower alleged that top officials shot down concerns regarding misinformation and hate speech, saying it was only 'a flash in the pan'
The former employee also lamented Facebook's failure to curb hate speech in Myanmar, which propagated the 2017 ethnic killings of the Rohingya
'I, working for Facebook, had been a party to genocide,' the whistleblower wrote
The allegations come after whistleblower Frances Haugen's bombshell testimony before Congress on Facebook's failings
By RONNY REYES FOR DAILYMAIL.COM
PUBLISHED: 22 October 2021
A new whistleblower affidavit submitted by a former Facebook employee accuses the social media giant of prioritizing profits over its due diligence to combat hate speech, misinformation and other threats to the public.
The new allegations, submitted anonymously under penalty of perjury, echoed the claims made by fellow whistleblower Frances Haugen, who delivered a scathing testimony before Congress earlier this month on Facebook's moral failings.
In the most dramatic line of the affidavit, the former employee anguished over Facebook's inability to act quickly to help curb racial killings in Myanmar in 2017 as military officials used the site to spread hate speech.
'I, working for Facebook, had been a party to genocide,' the whistleblower wrote.
Building on Haugen's statements, the anonymous whistleblower, who worked on Facebook's Integrity Team, also shared a story about a top company official brushing aside worries of election interference, the Washington Post reports.
As the company was looking to quell political controversy following Russian interference in the 2016 presidential election, Facebook communications official Tucker Bounds allegedly said, 'It will be a flash in the pan.'
'Some legislators will get pissy. And then in a few weeks they will move onto something else. Meanwhile we are printing money in the basement, and we are fine.'
The whistleblower explained that Bounds' alleged viewpoint was common among the leadership in the company, and other former employees have said that Facebook CEO Mark Zuckerberg and Chief Operating Officer Sheryl Sandberg were well aware of the problems.
Bounds has denied the claim and both he and Facebook chastised the Washington Post's reporting of the affidavit.
'It sets a dangerous precedent to hang an entire story on a single source making a wide range of claims without any apparent corroboration,' Facebook spokeswoman Erin McPike said in a statement.
This is the first time the company has faced such accusations since Haugen released internal memos and testified before Congress.
Haugen and the new whistleblower also submitted the allegations to the Securities and Exchange Commission, which oversees all publicly traded companies.
In the SEC affidavit, the anonymous ex-employee alleges that Facebook officials routinely undermined efforts within the company to fight misinformation and hate speech out of fear of angering then-President Donald Trump and his allies.
The former employee said that on one occasion, Facebook's Public Policy team defended a 'white list' that exempted the alt-right media company Breitbart News and other Trump-aligned publishers from Facebook's ordinary rules against spreading fake news.
When an employee questioned this policy, Joel Kaplan, Facebook's vice president of global policy, reportedly shot down any concerns.
'Do you want to start a fight with Steven Bannon?' Kaplan allegedly said.
Kaplan, who had been previously criticized by former employees for allegedly seeking to protect conservative interests on Facebook, denies he ever showed bias.
'There has never been a whitelist that exempts publishers, including Breitbart, from Facebook's rules against misinformation.'
The whistleblower also complained that Facebook had not been aggressive enough when it came to military officials in Myanmar using the platform to spread hate speech during the mass killings of the Rohingya ethnic group.
Although Facebook had previously acknowledged its failure to act swiftly during the mass killings of the Rohingya people, the company said it no longer makes such mistakes.
'Facebook's approach in Myanmar today is fundamentally different from what it was in 2017, and allegations that we have not invested in safety and security in the country are wrong,' McPike said in a statement.
The whistleblower went on to accuse Facebook of failing to properly police the secret groups set up on the site.
The former employee said the secret groups enable 'terrifying and aberrant behaviors' and are poorly monitored, adding that Facebook Groups have become havens for crime.
These are claims echoed by Gretchen Peters, of the Alliance to Counter Crime Online, who has filed a series of complaints against Facebook for its alleged failings since 2017.
Those failings include permitting terrorist content and drug sales and allowing hate speech and misinformation to flourish, all while inadequately warning investors about the potential risks.
'Zuckerberg and other Facebook executives repeatedly claimed high rates of success in restricting illicit and toxic content — to lawmakers, regulators and investors — when in fact they knew the firm could not remove this content and remain profitable,' Peters said in a new complaint filed on Friday.
During a Senate Commerce subcommittee hearing on October 5, whistleblower Frances Haugen called for transparency about how Facebook entices its users to keep scrolling on its apps, and about the harmful effect it can have on users.
'As long as Facebook is operating in the shadows, hiding its research from public scrutiny, it is unaccountable,' said Haugen, a former product manager on Facebook's civic misinformation team. She left the nearly $1 trillion company with tens of thousands of confidential documents.
'The company's leadership knows how to make Facebook and Instagram safer, but won't make the necessary changes because they have put their astronomical profits before people. Congressional action is needed,' Haugen said.
Haugen revealed she was the person who provided documents used in a Wall Street Journal and a Senate hearing on Instagram's harm to teenage girls. She compared the social media services to addictive substances like tobacco and opioids.
Before the hearing, she appeared on CBS television program '60 Minutes,' revealing her identity as the whistleblower who provided the documents.
'There were conflicts of interest between what was good for the public and what was good for Facebook,' she said during the interview. 'And Facebook over and over again chose to optimize for its own interests like making more money.'
Haugen, who previously worked at Google and Pinterest, said Facebook has lied to the public about the progress it made to clamp down on hate speech and misinformation on its platform.
She added that Facebook was used to help organize the Capitol riot on January 6, after the company turned off safety systems following the U.S. presidential elections.
While she believed no one at Facebook was 'malevolent,' she said the company had misaligned incentives.
In response to Haugen's bombshell comments, a Facebook executive accused her of stealing company documents and claimed she is 'not an expert' on the company's content algorithms.
Facebook Vice President of Content Policy Monika Bickert spoke out in an interview with Fox News, slamming Haugen a day after she testified to Congress.
Bickert said that Haugen 'mischaracterized' the internal studies regarding the harmful impacts of content on Facebook, Instagram and WhatsApp, which she presented to Congress.
The complaints come after Haugen's testimony before Congress in early October, where she claimed Facebook promoted divisiveness as a way to keep people on the site and said the documents showed the company had failed to protect young users.
The documents also showed that the company knew Instagram harmed young girls' body image and that it even tried to brainstorm ways to appeal to toddlers by 'exploring playdates as a growth lever.'
Haugen, who anonymously filed eight complaints about her former employer with the US Securities and Exchange Commission, told 60 Minutes earlier this month: 'Facebook, over and over again, has shown it chooses profit over safety.'
She claimed that a 2018 change prioritizing divisive posts, which made Facebook users argue, was found to boost user engagement.
That in turn helped bosses sell more online ads that have seen the social media giant's value pass $1 trillion.
'You are forcing us to take positions that we don't like, that we know are bad for society. We know if we don't take those positions, we won't win in the marketplace of social media,' Haugen said.
She also blamed Facebook for spurring the January 6 Capitol riot.
Meanwhile, the senator leading a probe of Facebook's Instagram and its impact on young people is asking Zuckerberg to testify before the panel that has heard far-reaching criticisms from a former employee of the company.
Sen. Richard Blumenthal, D-Conn., who heads the Senate Commerce subcommittee on consumer protection, called in a sharply worded letter Wednesday for the Facebook founder to testify on Instagram's effects on children.
'Parents across America are deeply disturbed by ongoing reports that Facebook knows that Instagram can cause destructive and lasting harms to many teens and children, especially to their mental health and wellbeing,' Blumenthal said in the letter addressed to Zuckerberg.
'Those parents, and the twenty million teens that use your app, have a right to know the truth about the safety of Instagram.'
In the wake of Haugen's testimony early this month, Blumenthal told Zuckerberg, 'Facebook representatives, including yourself, have doubled down on evasive answers, keeping hidden several reports on teen health, offering noncommittal and vague plans for action at an unspecified time down the road, and even turning to personal attacks on Ms. Haugen.'
Blumenthal did offer, however, that either Zuckerberg or the head of Instagram, Adam Mosseri, could appear before his committee.
'It is urgent and necessary for you or Mr. Adam Mosseri to testify to set the record straight and provide members of Congress and parents with a plan on how you are going to protect our kids,' he told Zuckerberg.
A spokesman for Facebook, based in Menlo Park, California, confirmed receipt of Blumenthal's letter but declined any comment.
Haugen, who buttressed her statements with tens of thousands of pages of internal research documents she secretly copied before leaving her job in the company's civic integrity unit, accused Facebook of prioritizing profit over safety and being dishonest in its public fight against hate and misinformation.
'In the end, the buck stops with Mark,' Haugen said in her testimony. 'There is no one currently holding Mark accountable but himself.'
By HANNAH TOWEY (htowey@insider.com)
Facebook whistleblower Frances Haugen is backed by the billionaire founder of eBay, Politico reported.
Pierre Omidyar is a well-known critic of Big Tech, donating $150,000 to Whistleblower Aid last year.
From PR to legal aid, here's how Haugen's expensive fight against Facebook is reportedly funded.
Fighting one of the most powerful companies in the world can be expensive, but Facebook whistleblower Frances Haugen has the billionaire founder of eBay in her corner, according to a Wednesday Politico report.
His name is Pierre Omidyar, a well-known critic of Big Tech whose previous advocacy efforts have supported independent journalism in Hawaii, anti-monopoly campaigns, and employee activism.
Omidyar's foundation donated $150,000 to Whistleblower Aid last year - the same nonprofit responsible for Haugen's legal representation.
Omidyar's philanthropic organization, Luminate, is also providing Haugen's PR operations in Europe, Politico's Emily Birnbaum wrote. Bill Burton, Haugen's top PR representative in the US, comes from the Center for Humane Technology, a group that Omidyar also funds.
Neither Haugen nor Omidyar has responded to the claims of financial support detailed in Politico's report, and neither was immediately available for comment when contacted by Insider.
Compared to previous tech whistleblowers, Haugen has been able to capture the nation's attention due in part to advanced press relations, starting with The Wall Street Journal's Facebook Files and her "60 Minutes" appearance.
Sophie Zhang, another Facebook whistleblower who first spoke out about issues at the company in April 2021, has received less public attention.
After Haugen denounced Facebook for putting "profits before people" during her testimony to Congress, Zhang tweeted, "If Congress wishes for me to testify, I will fulfill my civic duty, as I've publicly stated for the past half year."
In an interview with Insider, Zhang said although she and Haugen saw completely different sides of the company, Haugen's testimony felt familiar.
"There is basically no overlap between any of our details. What overlaps is our overall message," she said.

Following Politico's report, The Omidyar Network posted a blog post titled "In Support of Tech Whistleblowers Who are Holding Tech to Account," whose author is not listed.
The blog post said the foundation supports Pinterest whistleblower Ifeoma Ozoma and her project, The Tech Worker Handbook, a resource guide for employees considering speaking out about workplace issues "in the public interest."
"We are grateful to the brave people who have called out Big Tech for its bad behavior. They are an important part of creating systemic checks and balances for Big Tech," the post continues. "Because of them, policymakers are taking notice and taking action to rein in their excessive power and restore trust and balance in digital markets."