It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way (K.Marx, Letter to F.Engels on the Indian Mutiny)
Monday, October 25, 2021
Desperate Algerians ready to die at sea to reach Spain
The number of Algerians arriving on Spain's southeastern coast or the Balearic Isles has soared
Issued on: 26/10/2021
Almería (Spain) (AFP)
"I'd rather die at sea than stay in Algeria," says Khaled Dih, his eyes dark, and his Nike trainers soaked and full of sand, after landing on a beach at Almeria following a six-hour crossing from Oran.
His boat was one of around 50 that day to make the dangerous crossing from the Algerian coast, which lies 200 kilometres (120 miles) to the south, officials and NGOs said.
"There is nothing there, no work in the bled," Dih said, using a North African word meaning "the old country", dragging on a cigarette as he adjusted his ponytail.
Dih chose his 21st birthday to leave, following in the footsteps of thousands of others prepared to risk everything, who torch their identity papers on arrival to avoid being sent back.
Small boats used by migrants to cross sea are stored in an open-air warehouse in Almeria
In Arabic, they are known as "harraga" -- or "those who burn".
At least 309 migrants, 13 of them minors, have died in the western Mediterranean since the start of the year, according to the International Organization for Migration.
Dih says the boat was travelling at high speed, bouncing off the water, leaving him stiff and bruised after a night crossing in the freezing cold.
"You couldn't do anything... so I just thought about my parents and my friends," says the amateur boxer and fan of French rappers PNL.
Bruised, exhausted, but alive
Sat outside the bus station, he is shattered. He has not slept for three nights since leaving Annaba, his hometown on the Algerian coast near the Tunisian border.
From there he travelled 900 kilometres (560 miles) west to the port city of Oran and paid 4,500 euros ($5,200) for passage to Spain -- the equivalent of many months' salary.
Police escort newly-arrived migrants to a Centre for Temporary Assistance for Foreigners (CATE) in Almeria
Now he's waiting for a bus to Barcelona from where he will try and enter France, like most of the harraga.
"I don't speak Spanish... I have family and friends in France so I can't stay here all on my own," he explains.
The number of Algerians arriving on Spain's southeastern coast or the Balearic Isles has soared in recent months.
A confidential document compiled by the Spanish authorities and seen by AFP shows 9,664 Algerians have illegally entered Spain since the start of the year, or 20 percent more than last year.
According to Frontex, the agency which protects the EU's external borders, Algerians are the largest group of foreigners entering Spain illegally -- and the third-largest in Europe.
On the Algerian side, 4,704 migrants have been stopped trying to leave in 2021, more than half of them in September, the Algerian Defence Ministry says.
And in a new twist, more and more women and children are risking their lives to make the crossing.
"Harraga families is a new phenomenon," says Said Salhi, deputy head of the Algerian League for the Defence of Human Rights.
Among recent travellers were "mothers and babies, pregnant women and people with disabilities... which shows us something about the extent of despair" in Algeria, he said.
Save the Children said it had taken in more than 100 minors who had arrived in September on the coast of Andalusia, the southern region around Almeria.
Family anguish
Back home in Algeria, there is anguish for the families left behind, says Francisco Jose Clemente Martin, 24, who works for CIPIMD, an NGO that tries to trace migrants who have died or gone missing at sea.
Every day, he is in contact with relatives, sometimes having to send them photos of the dead so they can be identified.
A Spanish coast guard boat patrols off the coast of Almeria
"The families are desperate for news," he told AFP, saying the calls are often marked by "screams, weeping and desperation. Many mothers end up in hospital from the stress. It's a lot for them to take in".
Ahmed Bensafia, a 28-year-old who arrived in Almeria a year ago from a town near Algiers, said he didn't tell his family he was leaving "so as not to worry them".
He said he had no other choice but to leave because "wages are so low" that even after working all day, "you can't be sure you can afford to eat dinner".
But with hindsight he admits he would advise other youngsters "not to risk their lives" as he did.
Even if they manage to avoid the Spanish police, migrants still face a long journey fraught with hardship before reaching family and friends in France.
In early October, three were killed and one seriously hurt when they were hit by a train near Saint-Jean-de-Luz, southwestern France.
They were resting on the tracks to avoid the police when they were caught unawares by the first train of the day, local officials said.
Two days after landing on the Almeria beach, Khaled Dih managed to reach Spain's northern border and cross into France.
Rescued from extinction, bison rediscover Romania's mountains
Issued on: 26/10/2021
Armenis (Romania) (AFP)
Traces left by the animals in the forest are signs of the success of a project to reintroduce bison to this region after a centuries-long absence, key to keeping the hairy giants off the list of critically endangered species.
Bison had all but been driven out of Europe by hunting and the destruction of their habitat, but their reappearance in Romania has brought back a key component of the region's ecosystem.
Under gentle autumn sunshine on the edge of a centuries-old wood, young forest warden Matei Miculescu is on the lookout for members of the Carpathian herd.
The animals can be hard to spot, having been tempted further into the forest by the abundant vegetation and the possibility of extending their habitat.
Miculescu says the animals are thriving in the forest, in contrast to captivity which "creates the risk of inbreeding" and weakens their chances of survival.
Nowadays, around 6,000 bison, Europe's largest mammal and a distant cousin of the American buffalo, can be found on the continent.
Most of them are on the Polish-Belarusian border, where efforts to revive the population got underway in the 1950s.
Romania welcomed bison back in 2014 in the southwestern Armenis region, more than 200 years after the species was last seen there.
Born in captivity in other parts of Europe -- where they had been given names like Kiwi, Bilbo and Mildred -- they were transferred to Romania in 16 separate stages.
Cutting human links
Thanks to successful reproduction in the wild, "around 105 bison now live freely in the Tarcu mountains and have settled in well," says Marina Druga, head of the project led jointly by the WWF and Rewilding Europe.
"In the past two years, there haven't been any deaths in their ranks," says Druga, explaining that the goal is to get to a population "of 250 individuals in five years' time".
The programme is well established: first the animals spend several weeks being re-acclimatised to life in the wild and are only then released and left to fend for themselves.
They can currently be found making use of around 8,000 hectares in a protected area which stretches over 59,000 hectares.
The southern Carpathians present ideal conditions: "a vast region with a thinly spread human population and no intensive agriculture," says Wanda Olech-Piasecka from the International Union for Conservation of Nature (IUCN).
Since 2014 there have been 38 bison calves born in the area.
"Without them, the project would have no future," says Miculescu, who recognises each of the creatures by their horns of the colour of their fur.
But those running the project have resisted giving the calves names.
Since they have been born in the wild, all links with humans should be cut, explains Druga.
Architects of the forest
The WWF says the next step to make the population viable in the long term will be to introduce bison into other parts of the Carpathians and establish a network of populations.
Over the long term, the animals need a large habitat in order to avoid conflict over territory with human populations or within their own herds.
Along with benefitting the bison themselves, advocates say that this example of "rewilding" is also a boon for the wider ecosystem, bringing benefits for some 600 species from microorganisms to large carnivores.
"They change the landscape and architecture of the forest by stopping the spread of invasive tree species, spreading seeds for hundreds of plants and creating paths smaller animals use to access food," explains Druga.
Weaker or sick members of the herd can themselves serve as prey for wolves or bears, who in turn will be less likely to stray into human settlements in search of food, a problem which has grown in recent years in Romania.
Even those who watch them closely have sometimes been surprised by the effects the bison's presence can have.
"Birds collect discarded bits of fur to isolate their nests while frogs can use bison hoof prints to jump from one pond to another," says Miculescu.
Social Democrat Baerbel Bas, shown next to Finance Minister Olaf Scholz, is expected to become the third woman speaker of the Bundestag
Berlin (AFP)
Angela Merkel will remain caretaker chancellor until a new government is in place, while the body that will elect her successor, the Bundestag lower house of parliament, will convene having swollen to a record 736 members.
The September 26 general election left the centre-left Social Democrats as the biggest party, whose candidate Olaf Scholz is working toward cobbling together a ruling coalition by early December.
But while the top job is expected to pass from Germany's first female chancellor to a man, the Bundestag's powerful speaker Wolfgang Schaeuble is set to hand off the podium to Baerbel Bas -- only the third woman to hold the post.
Meanwhile the new-look Bundestag boasts a number of firsts for the EU's most populous nation, although activists say it still has far to go to truly reflect the rich tapestry of German society.
The first-ever black woman MP, Eritrean-born Awet Tesfaiesus, 47, will take her seat among the Greens' parliamentary group.
Having arrived in Germany at age 10, Tesfaiesus went on to become a lawyer and has devoted her career to defending the rights of immigrants and asylum seekers.
"We need diversity in this country," she told AFP. "We need people who have been victims of racism to be better represented." - 'Latent racism' -
During her mandate, Tesfaiesus said she wants to fight the "label" of foreigner that sticks to her despite her German passport.
"You feel latent racism everywhere," she said.
"When I'm looking for an apartment, when the postman comes into my law office and talks with my secretary because he automatically thinks she's my superior..."
Tesfaiesus told local media she launched her political career as a consequence of a racist attack in Hanau near Frankfurt in February 2020, when a far-right gunman shot dead nine people at a shisha bar and a cafe.
She joins Senegalese-born Karamba Diaby, a Social Democrat who was until now Germany's only black MP, and his party colleague Armand Zorn, who emigrated from Cameroon at the age of 12 and just won his first direct mandate in parliament.
The number of foreign-born deputies or those with at least one parent born abroad has climbed to 83, making up 11.3 percent of the Bundestag, up from 8.2 percent in the last assembly.
"We are helping to wake up the political scene," Deniz Nergiz, head of the Federal Council on Immigration and Integration, which promotes political participation in immigrant communities, told AFP. "There is also for the first time a refugee elected in the (former communist) east of the country," where the number of foreign-born Germans is significantly smaller. Also among the newly elected is Lamya Kaddor, who teaches religion courses about Islam in schools in the Ruhr Valley -- a subject hotly debated in recent years in Germany. At the same time, the Greens, who are expecting to serve as junior partners in the new government, touted at least three dozen of their deputies under the age of 35. MP Ricarda Lang tweeted a photo of them on the steps of the parliament building with the tagline: "There's some new kids in town."
'Far behind'
But despite its broader representation, the Bundestag, with its overwhelmingly white and male makeup, still lags in mirroring the modern face of Germany.
Nergiz said parliament was still "far behind" the 26 percent of people of foreign origin "across German society".
The same is true for women in the ranks of the MPs, even after 16 years with Merkel as the first woman chancellor at the helm. They make up only 24 percent of deputies, up from 20 percent previously.
The gender figures vary widely among the parties with seats in the Bundestag, with the Greens boasting a 59 percent majority of women, including two transgender women: Tessa Ganserer and Nyke Slawik.
By contrast, the far-right Alternative for Germany has only 13 percent women among its ranks.
Damaging Facebook whistleblower disclosures detailed in media reports
Several media reports based on analysis of leaked internal documents include details of how Facebook's apps allegedly enabled curbs on free speech and failed to stop content connected to human trafficking.
Facebook has been hit by a trove of revelations documenting allegations of harm caused by its suite of apps
Frances Haugen testified on Monday before a UK parliamentary committee that is examining draft legislation intended to crack down on harmful online content.
In her testimony, Haugen claimed Facebook causes harm by enabling the amplification of online hate speech.
She also said the algorithms pushed users toward political extremes. "So someone center left, they'll be pushed to radical left, someone center right will be pushed to radical right," she said.
Earlier this month, Haugen testified before the US Senate on the harm she believes Facebook causes. She cited internal documents she secretly copied before leaving the company.
What disclosures were made in the reports?
A CNN report outlines allegations that Facebook is aware its apps enable "all three stages of the human exploitation lifecycle [recruitment, facilitation, exploitation]," but has failed to place proper safeguards to halt human trafficking.
The report alleged Facebook only responded to reputation and commercial pressures when pulling accounts facilitating the trade in human beings.
The Washington Post reported Facebook CEO Mark Zuckerberg personally approved a plan to censor government critics in Vietnam, one of Facebook's most lucrative profit centers in Asia, at the request of the ruling Communist Party.
According to Bloomberg, Facebook has not been fully forthcoming to advertisers about the decline in the number of teenage users on its platforms, as well as a drop in the amount of time young users spend on its suite of apps.
Facebook content related to human trafficking
Included among the vast array of social problems Facebook stands accused of profiting from is the trade and sale of human beings, often for domestic servitude, by human traffickers.
According to internal documents, the company has been aware of the problem since at least 2018. A year later, on the heels of a BBC report documenting the practice, the problem was said to be so severe that Apple threatened to pull the Facebook and Instagram apps from its app store.
Internally, Facebook rushed to remove content related to human trafficking and make emergency policy changes to avoid commercial consequences described as "potentially severe" by the company.
However, an internal company report detailed that the company did not take action prior to the public disclosure and threats from Apple.
"Was this issue known to Facebook before the BBC enquiry and Apple escalation? Yes," the internal company report said.
In a complaint to the US Securities and Exchange Commission (SEC) first reported by The Wall Street Journal last month, whistleblower Haugen wrote: "Investors would have been very interested to learn the truth about Facebook almost losing access to the Apple App Store because of its failure to stop human trafficking on its products."
And yet despite company concerns over risks and commercial consequences, CNN was able to locate multiple Instagram accounts offering the sale of domestic servants last week.
One of those accounts had a name that translates to "Offering domestic workers" with photos and physical descriptions down to body measurements such as weight and height and other very personal information.
As with the BBC report, Facebook only moved to pull the accounts when CNN reached out.
A January 2020 internal report details how Facebook "enables all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via complex real-world networks."
A key source country for women allegedly trafficked on the platform, the Philippines, has not had proper safeguards on Instagram stories to prevent the practice.
Free speech curbed in Vietnam
Facebook CEO Zuckerberg, who tends to present himself as a free speech advocate in the US, personally approved a company plan to censor government critics in Vietnam at the request of the country's ruling Communist Party, The Washington Post reported.
In the past, Zuckerberg has defended morally objectionable posts by Holocaust deniers as unintentional errors made without malice, whereas scholars of genocide and war crimes note that denial is often the final stage of genocide.
However, when deciding to censor opposition-minded posts in Vietnam, Facebook said it wanted "to ensure our services remain available for millions of people who rely on them every day."
Ahead of the Communist Party's congress in January 2021, the company's own transparency report shows that more than 2,200 posts by Vietnamese users were censored between July and December 2020. This was in contrast to 834 blocked posts in the period from January to June of 2020.
Zuckerberg has been noted by former employees as being obsessed with metrics, growth and neutralizing competition at all costs.
Shareholders left in the dark about decline in young users
According to a leaked report for senior executive Chris Cox, Facebook's chief product officer, the company faces a steep decline in popularity among teenagers and young adults, Bloomberg reported.
Year-on-year, US teenagers spent 16% less time on Facebook and young adults 5% less time.
Additionally, data shows Facebook faces a decline in the number of new users. The data follows a yearslong decline in growth as the company finds people are much less willing to share information than previously on its suite of apps.
Young adults in particular find "content on Facebook as negative, fake and misleading" and would rather be "uplifted and motivated," which is not happening for them on Facebook, the company's own internal report notes.
"If this holds at scale, teens could be creating sharing barriers for upcoming generations," it said.
In her letter to the US Securities and Exchange Commission, Haugen wrote, "Facebook's stock valuation is based almost entirely on predictions of future advertising growth."
While Facebook has a market share of 23.7% of the global advertising market, according to Bloomberg, company executives have not been forthcoming about these concerns either publicly or to shareholders.
Facebook has dodged questions from Wall Street analysts about the social networking site's popularity with younger users.
EU antitrust official Vestager: We must act now against Facebook
European Competition Commissioner Margrethe Vestager told DW in an interview that legal action against the tech giant could be endless, but regulation now could limit its damage.
Vestager said taking Facebook to antitrust court was 'not enough,' and tighter regulations were also needed
European Commissioner for Competition Margrethe Vestager told DW on Tuesday that breaking up Facebook could take "years," and that leaders should act quickly before the platform does any more damage.
Democracies need to 'stand together' against Facebook
Asked if Facebook was too large and powerful to even attempt to force changes from the outside, Vestager said "not for our democracy."
She said democracies standing together can make real changes to how Facebook runs its platforms, and said regulations needed to be tightened up when dealing with companies that could wield "immense influence" over democracy, and over people's mental health.
"If we stand together, then we can make real changes," Vestager said. "And just to make Facebook assess the risk that they could pose to young people's mental health and to have, you know, external people looking into that assessment to see if things have been corrected, that would be an important step forward."
"We also need to regulate much tighter if we are dealing with someone who can have an immense influence both on mental health, but also how our democracy develops," she added, explaining that the EU Commission had prepared legislation that balances the need to preserve freedom of speech but to get rid of things that "are illegal offline" such as incitement to violence. 'No end' to legal battles
Vestager admitted, however, that any legal battle with Facebook would be protracted.
The social media giant "would take us to court for years with no end," she said. But she was hopeful that if the EU can "take action now so that things change," then "smaller businesses can get full access to the market," and social media giants like Facebook would have to take responsibility for the harms they've caused.
The Danish politician's comments came at a time when Facebook is under immense scrutiny. Earlier this month, whistleblower Frances Haugen detailed the ways in which Facebook knew it was being used to promote violence and discord and did nothing to stop it.
Senate to grill execs from TikTok, Snapchat, YouTube on harm to teens, children
Sen. Marsha Blackburn, R-Tenn., speaks during a Senate Commerce, Science and Transportation Subcommittee on Consumer Protection, Product Safety and Data Security hearing on children's online safety and mental health on September 30.
Oct. 25 (UPI) -- Top executives from Snapchat, TikTok and YouTube will testify in Congress this week on how their platforms, which have become tremendously popular among young people, can cause harm to children online.
The Senate commerce committee's panel on consumer protection, product safety and data security will convene a hearing on Tuesday titled, "Protecting Kids Online." It was the same committee that put Facebook under the spotlight earlier this month by questioning whistleblower Frances Haugen.
Facebook and other popular social platforms have been under scrutiny recently over their impact on children, due to multiple studies that have concluded that they actively cause harm.
A month ago, Facebook paused the rollout of its planned Instagram Kids -- a child version of its popular photo-sharing app -- due to such studies and a desire to "work with parents, experts, policymakers and regulators to listen to their concerns."
On Oct. 5, Haugen, a former data scientist at Facebook, appeared before the committee and told the panel that the company has long known about the platform's negative influence on children.
Haugen was scheduled to testify in British Parliament on Monday to help lawmakers scrutinize proposed legislation to rein in the power of social media companies and crack down on those that fail to protect younger users.
Similar platforms like TikTok, Snapchat and YouTube are also popular with kids and teens.
"While social media can provide entertainment and educational opportunities, these apps have also been misused to harm kids and promote destructive acts, such as vandalism in schools, deadly viral challenges, bullying, eating disorders, manipulative influencer marketing and grooming," the Senate committee said of this week's hearing, in a statement.
Witnesses at Tuesday's hearing include Jennifer Stout, Snap vice president of global public policy; Michael Beckerman, TikTok vice president and head of public policy for the Americas; and Leslie Miller, YouTube vice president of government affairs and public policy.
"This hearing will examine how tech companies treat young audiences, including how algorithms and product design choices can amplify harms, addiction and intrusions into privacy. The hearing will also explore needed improvements to our laws to protect children and teenagers online."
For an idea of TikTok's rising popularity, data last month from analytics company App Annie showed that the short-form video platform surpassed YouTube in the average time users with Android smartphones spent watching videos. The TikTok average was more than 24 hours per month, compared to less than 23 hours for Google-owned YouTube.
The platforms' ages show the impact of younger users. TikTok was started in 2016 while YouTube, which recently launched YouTube Shorts to better compete with TikTok, has been around since 2005. Snapchat, which allows users to send time-limited images and videos, has also launched features that allow users to create short-form content.
Facebook-owned Instagram also includes features that allow users to create the same kinds of content commonly found on TikTok, Snapchat and YouTube.
Last month, Sens. Richard Blumenthal, D-Conn., and Marsha Blackburn, R-Tenn., of the Senate commerce committee said the panel on consumer protection and product safety would investigate Haugen's claims that Facebook knew about its platforms' negative impact on young users.
"Recent revelations about harm to kids online show that Big Tech is facing its Big Tobacco moment -- a moment of reckoning," Blumenthal said in an Oct. 19 statement.
"TikTok is an especially egregious offender, both because they make the personal information of all TikTok users available to the communist Chinese government," Blackburn added.
"And because the app pushes sexually explicit and drug-related content onto children."
Senator: YouTube, TikTok, Snap offering only small changes
by Marcy Gordon
YouTube, TikTok and Snapchat are offering only "tweaks and minor changes" in their operations to ensure young users' safety amid rising concern over the platforms' potential harm to children, the head of a Senate panel told the companies' executives Tuesday.
"Everything you do is to add more eyeballs, especially kids', and keep them on your platforms for longer," Sen. Richard Blumenthal, D-Conn., said at the start of a hearing by the Senate Commerce subcommittee on consumer protection that he heads.
The panel took testimony recently from a former Facebook data scientist, who laid out internal company research showing that the company's Instagram photo-sharing service appears to seriously harm some teens. The subcommittee is widening its focus to examine other tech platforms, with millions or billions of users, that also compete for young people's attention and loyalty.
"We're hearing the same stories of harm" caused by YouTube, TikTok and Snapchat, Blumenthal said.
"This is for Big Tech a Big Tobacco moment ... It is a moment of reckoning," he said. "There will be accountability. This time is different."
The three executives—Michael Beckerman, a TikTok vice president and head of public policy for the Americas; Leslie Miller, vice president for government affairs and public policy of YouTube's owner Google; and Jennifer Stout, vice president for global public policy of Snapchat parent Snap Inc.—testified at the subcommittee hearing.
"We took action on more than 7 million accounts in the first three quarters of 2021 when we learned they may belong to a user under the age of 13—3 million of those in the third quarter alone—as we have ramped up our automated removal efforts," Miller said.
TikTok has tools in place, such as screen time management, to help young people and parents moderate how long children spend on the app and what they see, Beckerman said. "We are determined to work hard and keep the platform safe," he said.
The company says it focuses on age-appropriate experiences, noting that some features, such as direct messaging, are not available to younger users. The video platform, wildly popular with teens and younger children, is owned by the Chinese company ByteDance. In only five years since launching, it has gained an estimated 1 billion monthly users.
The three platforms are woven into the fabric of young people's lives, often influencing their dress, dance moves and diet, potentially to the point of obsession. Peer pressure to get on the apps is strong. Social media can offer entertainment and education, but platforms have been misused to harm children and promote bullying, vandalism in schools, eating disorders and manipulative marketing, lawmakers say.
The panel wants to learn how algorithms and product designs can magnify harm to children, foster addiction and intrusions of privacy. The aim is to develop legislation to protect young people and give parents tools to protect their children.
The company says it stores all TikTok U.S. data in the United States. The company also rejects criticisms of promoting harmful content to children.
Early this year after federal regulators ordered TikTok to disclose how its practices affect children and teenagers, the platform tightened its privacy practices for the under-18 crowd.
A separate House committee has investigated video service YouTube Kids this year. Lawmakers said the YouTube offshoot feeds children inappropriate material in a "wasteland of vapid, consumerist content" so it can serve ads to them. The app, with both video hosting and original shows, is available in about 70 countries.
A panel of the House Oversight and Reform Committee told YouTube CEO Susan Wojcicki that the service doesn't do enough to protect children from potentially harmful material. Instead it relies on artificial intelligence and self-policing by content creators to decide which videos make it onto the platform, the panel's chairman said in a letter to Wojcicki.
Parent company Google agreed to pay $170 million in 2019 settlements with the Federal Trade Commission and New York state over allegations that YouTube collected personal data on children without their parents' consent.
Despite changes made after the settlements, the lawmaker's letter said, YouTube Kids still shows ads to children.
YouTube says it has worked to provide children and families with protections and parental controls like time limits, to limit viewing to age-appropriate content. It emphasizes that the 2019 settlements involved the primary YouTube platform, not the kids' version.
Snap Inc.'s Snapchat service allows people to send photos, videos and messages that are meant to quickly disappear, an enticement to its young users seeking to avoid snooping parents and teachers. Hence its "Ghostface Chillah" faceless (and word-less) white logo.
Only 10 years old, Snapchat says an eye-popping 90% of 13- to 24-year-olds in the U.S. use the service. It reported 306 million daily users in the July-September quarter.
The company agreed in 2014 to settle the FTC's allegations that it deceived users about how effectively the shared material vanished and that it collected users' contacts without telling them or asking permission. The messages, known as "snaps," could be saved by using third-party apps or other ways, the regulators said.
Snapchat wasn't fined but agreed to establish a privacy program to be monitored by an outside expert for the next 20 years—similar to oversight imposed on Facebook, Google and Myspace in privacy settlements in recent years.
Facebook froze as anti-vaccine comments swarmed users
by David Klepper and Amanda Seitz
In March, as claims about the dangers and ineffectiveness of coronavirus vaccines spun across social media and undermined attempts to stop the spread of the virus, some Facebook employees thought they had found a way to help.
By subtly altering how posts about vaccines are ranked in people's newsfeeds, researchers at the company realized they could curtail the misleading information individuals saw about COVID-19 vaccines and offer users posts from legitimate sources like the World Health Organization.
"Given these results, I'm assuming we're hoping to launch ASAP," one Facebook employee wrote in March, responding to the internal memo about the study.
Instead, Facebook shelved some suggestions from the study. Other changes weren't made until April.
When another Facebook researcher suggested disabling comments on vaccine posts in March until the platform could do a better job of tackling anti-vaccine messages lurking in them, that proposal was ignored.
Critics say Facebook was slow to act because it worried it might impact the company's profits.
"Why would you not remove comments? Because engagement is the only thing that matters," said Imran Ahmed, the CEO of the Center for Countering Digital Hate, an internet watchdog group. "It drives attention and attention equals eyeballs and eyeballs equal ad revenue."
In an emailed statement, Facebook said it has made "considerable progress" this year with downgrading vaccine misinformation in users' feeds.
Facebook's internal discussions were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by former Facebook employee-turned-whistleblower Frances Haugen's legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.
The trove of documents shows that in the midst of the COVID-19 pandemic, Facebook carefully investigated how its platforms spread misinformation about life-saving vaccines. They also reveal rank-and-file employees regularly suggested solutions for countering anti-vaccine misinformation on the site, to no avail. The Wall Street Journal reported on some of Facebook's efforts to deal with antivaccine comments last month.
The inaction raises questions about whether Facebook prioritized controversy and division over the health of its users.
"These people are selling fear and outrage," said Roger McNamee, a Silicon Valley venture capitalist and early investor in Facebook who is now a vocal critic. "It is not a fluke. It is a business model."
Typically, Facebook ranks posts by engagement—the total number of likes, dislikes, comments and reshares. That ranking scheme may work well for innocuous subjects like recipes, dog photos or the latest viral singalong. But Facebook's own documents show that when it comes to divisive, contentious issues like vaccines, engagement-based ranking only emphasizes polarization, disagreement and doubt.
To study ways to reduce vaccine misinformation, Facebook researchers changed how posts are ranked for more than 6,000 users in the U.S., Mexico, Brazil and the Philippines. Instead of seeing posts about vaccines that were chosen based on their engagement, these users saw posts selected for their trustworthiness.
The results were striking: a nearly 12% decrease in content that made claims debunked by fact-checkers and an 8% increase in content from authoritative public health organizations such as the WHO or U.S. Centers for Disease Control.
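To picture the mechanism the researchers were testing, here is a minimal, purely illustrative sketch in Python of the two ranking approaches the article describes. The post fields, numbers and trust scores are invented for the example; this is not Facebook's code or data, only a toy contrast between ordering posts by raw engagement and ordering them by source trustworthiness.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    reshares: int
    source_trust: float  # hypothetical 0-1 trustworthiness score for the post's source

def engagement_score(p: Post) -> float:
    # Engagement as described in the article: a tally of interactions
    # (likes, dislikes, comments, reshares); dislikes omitted here for brevity.
    return p.likes + p.comments + p.reshares

def trust_score(p: Post) -> float:
    # The alternative the researchers tested: rank by how trustworthy
    # the source is rather than by raw engagement.
    return p.source_trust

posts = [
    Post("Debunked vaccine claim", likes=900, comments=400, reshares=300, source_trust=0.1),
    Post("WHO vaccine guidance", likes=120, comments=30, reshares=40, source_trust=0.95),
]

by_engagement = sorted(posts, key=engagement_score, reverse=True)
by_trust = sorted(posts, key=trust_score, reverse=True)

print([p.text for p in by_engagement])  # the high-engagement debunked claim comes first
print([p.text for p in by_trust])       # the authoritative post comes first

The toy example only shows why the two orderings diverge on contentious topics: divisive posts tend to win on raw interaction counts even when their sources are untrustworthy.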
Employees at the company reacted with exuberance, according to internal exchanges.
"Is there any reason we wouldn't do this?" one Facebook employee wrote in response.
Facebook said it did implement many of the study's findings—but not for another month, a delay that came at a pivotal stage of the global vaccine rollout.
In a statement, company spokeswoman Dani Lever said the internal documents "don't represent the considerable progress we have made since that time in promoting reliable information about COVID-19 and expanding our policies to remove more harmful COVID and vaccine misinformation."
The company also said it took time to consider and implement the changes.
Yet the need to act urgently couldn't have been clearer: At that time, states across the U.S. were rolling out vaccines to their most vulnerable—the elderly and sick. And public health officials were worried. Only 10% of the population had received their first dose of a COVID-19 vaccine. And a third of Americans were thinking about skipping the shot entirely, according to a poll from The Associated Press-NORC Center for Public Affairs Research.
Despite this, Facebook employees acknowledged they had "no idea" just how bad anti-vaccine sentiment was in the comments sections on Facebook posts. But company research in February found that as much as 60% of the comments on vaccine posts were anti-vaccine or vaccine reluctant.
Even worse, company employees admitted they didn't have a handle on catching those comments, or a policy in place to take them down.
"Our ability to detect (vaccine hesitancy) in comments is bad in English—and basically non-existent elsewhere," another internal memo posted on March 2 said.
Los Angeles resident Derek Beres, an author and fitness instructor, sees anti-vaccine content thrive in the comments every time he promotes immunizations on his accounts on Instagram, which is owned by Facebook. Last year, Beres began hosting a podcast after noticing conspiracy theories about COVID-19 and vaccines were swirling on the social media feeds of health and wellness influencers.
Earlier this year, when Beres posted a picture of himself receiving the COVID-19 shot, some on social media told him he would likely drop dead in six months' time.
"The comments section is a dumpster fire for so many people," Beres said.
Some Facebook employees suggested disabling all commenting on vaccine posts while the company worked on a solution.
"Very interested in your proposal to remove ALL in-line comments for vaccine posts as a stopgap solution until we can sufficiently detect vaccine hesitancy in comments to refine our removal," one Facebook employee wrote on March 2.
The suggestion went nowhere.
Instead, Facebook CEO Mark Zuckerberg announced on March 15 that the company would start labeling posts about vaccines that described them as safe.
The move allowed Facebook to continue to get high engagement—and ultimately profit—off anti-vaccine comments, said Ahmed of the Center for Countering Digital Hate.
"Facebook has taken decisions which have led to people receiving misinformation which caused them to die," Ahmed said. "At this point, there should be a murder investigation."
Mark Zuckerberg reportedly helped reinstate a video that falsely claims abortion is ‘never medically necessary,’ caving to complaints of Republican politicians
17 news outlets reported further findings from Facebook whistleblower Frances Haugen's leaked internal documents.
Zuckerberg helped reinstate an inaccurate anti-abortion video after Republican politicians protested its removal, records reviewed by the Financial Times reveal.
Employees repeatedly urged Facebook execs to stop bending the rules for politicians, the report says.
New revelations from Facebook whistleblower Frances Haugen's leaked internal documents show that employees repeatedly urged CEO Mark Zuckerberg to stop bending content moderation rules for politicians, The Financial Times reported.
Among The Facebook Papers is an internal memo that alleges Zuckerberg was directly involved in a 2019 decision to reinstate a video that falsely claimed abortions are "never medically necessary," according to documents reviewed by the FT.
The inaccurate post was originally removed by a Facebook moderator, leading to backlash from Republican figureheads, the report says.
Facebook's decision to put the post back up was one of several examples cited by employees who claim executives intervene in misinformation removal "when they see that they could harm powerful political actors," according to The Financial Times.
Facebook did not immediately respond to Insider's request for comment.
Facebook spokesperson Joe Osborne told The Financial Times that "at the heart of these stories is a premise which is false."
"Yes, we're a business and we make profit, but the idea that we do so at the expense of people's safety or wellbeing misunderstands where our own commercial interests lie," the statement continued. "The truth is we've invested $13 billion and have over 40,000 people to do one job: keep people safe on Facebook."
The anti-abortion video referenced in Haugen's latest leak is likely a speech made by anti-abortion campaigner Lila Rose, who falsely claimed "abortion is never medically necessary" on the platform in 2019. Three doctors working for Health Feedback said that Rose's claim was inaccurate, explaining that conditions such as placenta previa and HELLP syndrome can make an abortion necessary to prevent the mother's death.
Facebook initially attached the doctors' fact-check onto the video and restricted the page's distribution, prompting Rose to accuse it of censorship, Insider's Isobel Hamilton previously reported.
Four Republican senators, including Ted Cruz and Josh Hawley, wrote to Mark Zuckerberg in response, arguing that the doctors who wrote the fact-check were not impartial, therefore violating the code of conduct established by the International Fact-Checking Network.
"The science of medicine is not subjective, and a strongly held personal belief should never outweigh scientific evidence, override standards of medical care, or drive policy that puts a person's health and life at risk," The American College of Obstetricians and Gynecologists and Physicians for Reproductive Health said in a statement.
"As physicians, we are focused on protecting the health and lives of the patients for whom we provide care. Without question, abortion can be medically necessary."
Facebook Is Even Worse Than Anyone Imagined
A trove of internal documents reveals a company that is willfully spreading misinformation and hate in pursuit of profit and growth.
Speaking to Recode earlier this year, Facebook founder Mark Zuckerberg made his case for Facebook. “I think if you look at the grand arc here, what’s really happening is individuals are getting more power and more opportunity to create the lives and the jobs that they want,” Zuckerberg told Casey Newton. “And to connect with people they want. And to connect to the ideas that they want and to share the ideas that they want. And I just think that that will lead to a better world. It will be different from the world that we had before. I think it will be more diverse, I think more different ideas and models will be able to exist.”
But Zuckerberg suggested that a problem remained. By empowering individuals, Facebook was disrupting a hierarchy that had existed for generations: “My concern is that we’re too frequently telling the negative sides of it, from the perspective of the institutions that may be not on the winning side of these changes.” For the last several years, Facebook had been besieged by a narrative that its products were making the world an uglier and more divisive place. Here, Zuckerberg inverted the critique: The real victims of Facebook’s rise weren’t its users but a number of dusty institutions that were raging as their power was being redistributed to the people. In this version of events, Facebook wasn’t just empowering its users, it was liberating them.
Over the last few days, that pretty little picture has taken a serious hit, as several news organizations have begun reporting out revelations from internal Facebook documents provided to them by whistleblower (and crypto enthusiast) Frances Haugen, who worked at the company for two years before leaving in May. These documents illuminate a Facebook that is the opposite of Zuckerberg’s rose-tinted view—a company that knowingly provides a product that is used to spread misinformation and hate; that is used to facilitate hate speech, terrorism, and sex trafficking; and whose meager efforts to stop these things have often failed—as they did, most notably, in the lead-up to the January 6 insurrection.
These are the most damning internal leaks from Facebook to come to light yet—more hair-raising than the revelations contained within Sheera Frenkel and Cecilia Kang’s insider account, An Ugly Truth: Inside Facebook’s Battle for Domination; perhaps even more damning than the Cambridge Analytica scandal that rocked the social network three years ago. These new disclosures reveal that Facebook executives know exactly what their product is doing to its users—and by extension, to the world—and are aware of the inadequacy of their halting efforts to mitigate these society-damaging impacts. In almost every instance, the company privileged its own profits, growth, and—in particular—its efforts to boost its anemic popularity with young people over the health and well-being of its user base.
“Time and again, [researchers at Facebook] determined that people misused key features or that those features amplified toxic content, among other effects,” reported The New York Times’ Mike Isaac. “In an August 2019 internal memo, several researchers said it was Facebook’s ‘core product mechanics’—meaning the basics of how the product functioned—that had let misinformation and hate speech flourish on the site. ‘The mechanics of our platform are not neutral,’ they concluded.”
It’s hard to think of a more damning determination: Facebook’s product inevitably led to the spread of hate speech and misinformation. But this conclusion is inescapable when you look at other findings. Election misinformation continued to spread and proliferate rapidly in the aftermath of the 2020 election; one data scientist warned that 10 percent of content viewed in the wake of the election alleged widespread fraud. Facebook discovered its product would spin up recommendations of QAnon content to users who merely showed interest in conservative topics within a matter of days. “The body of research consistently found Facebook pushed some users into ‘rabbit holes,’ increasingly narrow echo chambers where violent conspiracy theories thrived,” NBC News reported. “People radicalized through these rabbit holes make up a small slice of total users, but at Facebook’s scale, that can mean millions of individuals.”
The documents also found that Facebook’s efforts to stop anti-vax misinformation from spreading were often wildly deficient and that the company was slow to understand just how woeful its response was—an especially shocking revelation given that the last five years have consistently and repeatedly demonstrated that the platform is overtaken by misinformation with an ease that suggests it was built to do just that.
The situation in the rest of the world, meanwhile, is worse than in America. A handful of countries—the United States being one—are given extensive content moderation attention. While these moderation efforts are often inadequate, they’re significantly better than what most of the world gets. Researchers have found that Facebook has been used to spread everything from hate speech to ethnic cleansing. Mark Zuckerberg, meanwhile, has intervened on behalf of authoritarian governments: Given the choice between helping Vietnam’s autocratic government censor posts and ceasing to do business in the country, he personally elected to go the former route. Again and again, you see Facebook making this choice, in a slew of different ways: The company always chooses profits and growth, even when that choice demonstrably sows discord, spreads misinformation or violent incitement, or makes the world a worse place.
Most striking of all is the pervasive sense that desperation drives many of these choices; that Facebook is always amid some existential threat even as growth is pushed to the exclusion of all other considerations. Facebook has lately struggled to attract younger users, and the company is doing everything it can to reverse that trend, a choice that predictably leads to devastating results that likely exacerbate the original problem. That is, I suppose, a silver lining—it suggests that Facebook may organically at some point cease being as important to the world as it is now. But that moment, if it comes, is a long, long way away—and if the company is truly in some sort of doom loop, there’s no telling what will govern its next decision or what harm the next quick fix or incautious action might unleash. For now, it is enough to note that the company is clearly doing great harm to the entire world and it is doing so knowingly.
Alex Shephard is a staff writer at The New Republic.
Zuckerberg Slams 'Coordinated Effort' to Badmouth Facebook
During an earnings call, Zuckerberg briefly mentioned introspection, but his remarks, forceful and angry at times, didn’t actually take responsibility for anything.
In prepared remarks during Facebook’s Q3 earnings call today, Zuckerberg blamed the recent avalanche of bad press for the social network on “a coordinated effort to selectively use leaked documents to paint a false picture of our company.”
Facebook’s CEO doubled down, running through a checklist of things the company has done around safety and security, projects that will come with a $5 billion price tag in 2021. “It makes a good soundbite to say that we don't solve these impossible tradeoffs because we're just focused on making money, but the reality is these questions are not primarily about our business, but about balancing difficult social values,” he said.
Zuckerberg briefly touched on introspection: “I worry about the incentives that we're creating for other companies to be as introspective as we have been,” he said. But his remarks—forceful and angry at times—didn’t really look inward or take responsibility for anything. Instead, the media, lawmakers, and society as a whole are to blame, he argued.
Polarization has been around in the US since “before I was born” in 1984, according to Zuckerberg. “If social media is not the main driver of these issues, then it probably can't fix them by itself either,” he said.
“I've repeatedly called for regulation to provide clarity because I don't think companies should be making so many of these decisions ourselves,” Zuckerberg said. But that’s a convenient fallback—glacial is Congress’ favorite speed, and an army of lobbyists will be at the ready should lawmakers make a move.
Whistleblower Frances Haugen, a former product manager at Facebook, is among those trying to get politicians to act. She testified before lawmakers in London today, shortly after several high-profile news outlets posted stories about documents from the social network, which they dubbed the Facebook Papers.
Zuckerberg said today that Facebook “can't change the underlying media dynamics,” and said his focus is on people, not the press. “The reason we've been able to succeed for almost two decades is because we keep evolving and building. Facebook started in a dorm room and grew into a global website. We invented the News Feed and a new kind of ads platform. We became a mobile-first experience. And then we grew a whole family of apps that serve billions of people."
And yet, the company has struggled to stop Groups that spread dangerous misinformation, in part because of a recommendation system that quickly veers off into questionable territory, Facebook’s own employees found. It’s all too easy for online chatter to spill over into real-world violence, in the US and abroad.
Polishing the Turd: Facebook So Toxic Zuckerberg Reportedly Might Change Its Name (8 Guesses For a Suitable Replacement)
Facebook is where social media basically began, but it appears an end is in sight – at least for its iconic name. The company – and its CEO Mark Zuckerberg – have come under intense scrutiny lately because of the rapid-fire spread of misinformation and hate speech on the platform. But rather than address those problems, Facebook and the social media apps it owns, including Instagram and WhatsApp, may become subsidiaries of another, totally new company.
The rebranding is a transparent attempt to wash the stink off the site, which has become a breeding ground for bad actors. And yet, they’re probably going to try to spin this as one step closer to Zuckerberg’s goal of “building the metaverse” (whatever that means).
Rumor has it the new name will be revealed at or before Facebook’s Connect conference on Oct. 28. Rather than wait and see what the marketing geniuses come up with, we’ve brainstormed eight of our own ideas, based on how people actually use the platform.
These are our guesses for a suitable name replacement for Facebook.
District of Columbia Attorney General Karl Racine on Wednesday announced the addition of Facebook CEO Mark Zuckerberg as a defendant to an existing lawsuit over the Cambridge Analytica scandal, where the data firm harvested information from as many as 87 million individuals without their knowledge.
In 2018, AG Racine brought suit against Facebook alleging misrepresentation of its policies around third-party data access and compromising user privacy with deceptive trade practices under the District of Columbia’s Consumer Protection Procedures Act (CPPA). Despite Facebook’s efforts to dismiss the case in 2019, a District judge allowed the civil suit to proceed.
After reviewing “hundreds of thousands of pages of documents produced in litigation and [completing] a wide range of depositions including former employees and whistleblowers,” AG Racine added Zuckerberg to the lawsuit on Wednesday. Within the CPPA there is a provision stating, “individuals are liable for actions of a company if they were aware of them at the time.” AG Racine did not specify to what extent Zuckerberg was aware of the scandal and only stated Zuckerberg was “personally involved” in Cambridge Analytica’s collection of user data.
AG Racine concluded his message by stating, “This lawsuit is about protecting the data of half of all District residents and tens of millions of people across the country. We’ve taken our obligation to investigate wrongdoing very seriously—and Facebook should take its responsibility to protect users just as seriously.”