
Op-Ed: Maybe polarizing social media was an even dumber idea than it looks

By Paul Wallis
November 20, 2024
DIGITAL JOURNAL


The hordes of people leaving X are a sort of statistical obituary to years of propaganda. X is in big trouble, mainly because of its policies and algorithms and a bizarre relationship with infuriated advertisers.

Like most of American media, X instantly demoted itself to half of its own market share with its politics. It also became a servant, like FOX, to one side only. That’s not working out too well.

You’ll also notice from media coverage that nobody’s questioning the self-decapitation and disembowelment of X. The other glaring problem is that somehow this situation is now seen as normal.

It isn’t normal. People and advertisers are voting with their dollars and clicks.

Disgruntled X users are now heading to Bluesky in vast numbers. Millions of people have basically abandoned X. Bluesky looks a bit like early Twitter. Seems quite OK as an environment. I haven’t seen a rabid lunatic yet.

The new and huge problem is the quality of information vs communication. This is now an abyss of opposing information.

Unless there’s some middle ground, and the medium isn’t hopelessly biased, social media has just machine-gunned itself in the foot. Politics is not the sole interest of the world. Other things happen, too, y’know.

That’s where audience loss is likely to be fatal. What if you don’t want to read about The Adventures of Donny and Elon in Disneyland?

What if you want a broad and useful mix of your own interests, instead?

The sole and whole purpose of social media is communication. Reducing your content range to such a narrow focus means you inevitably lose users.

Nor is the Chicken Little approach to information exactly popular. Nobody listens to raving lunatics if they don’t have to.

The possible exception to that theory is screen-fed America. Markets and media businesses take a long time to change course. This market doesn’t eat solid food anymore. The child psychology is pretty obvious. These sources will probably simply continue to produce pablum.

The total stagnation of American mass media was one thing. This social media situation is stagnation of real-time information as well. X has made itself useless to its users.

People obviously don’t like that. The move to Bluesky is self-defense. The social media market can blame itself for a newcomer just walking off with its customers.

Let’s talk “dumb”.

Billions of dollars of investment are now evaporating in a festering social media environment, and those billions aren’t coming back.

The World’s Richest Sudden Instant Fan Boy doesn’t have the excuse of being geriatric or illiterate. He should have enough metrics to see the cliff coming.

The big money markets in the US are all blue. The black holes are all red, with the possible exception of Texas. Ignore those big blue markets at your peril. Lose those markets and you’re doing hillbilly scale dollars. These much smaller markets can’t deliver the same value. Advertisers and marketers know that.

The rest of the world is also reacting very negatively to the toxicity of social media. The hubris and hype are all American-generated. The rest of the world can easily ignore most of this garbage.

What if reality gets involved in this mess? Economic risks and social disintegration are circling like buzzards over America in huge numbers with dollar signs on them. All types of media should be building financial fallout shelters about now. Markets can evaporate overnight.

No amount of fake news hysteria and self-congratulation can make an impression on that situation. You need “Make America Solvent Again” to do that, and it looks like that’s not happening for the next four years.

How would you read a commercial market that is basically suicidal? Would you focus on customer retention in the afterlife? You might have to do that.

Social media needs a functional market to exist at all. If the money’s pulling out, the message couldn’t be clearer.

__________________________________________________
Disclaimer

The opinions expressed in this Op-Ed are those of the author. They do not purport to reflect the opinions or views of the Digital Journal or its members.


Social media users probably won’t read beyond this headline, researchers say

A new study of 35 million news links circulated on Facebook reports that more than 75% of the time they were shared without the link being clicked and read.

Penn State
UNIVERSITY PARK, Pa. — Congratulations. Reading this far into the story is a feat not many will accomplish, especially if shared on Facebook, according to a team led by Penn State researchers. In an analysis of more than 35 million public posts containing links that were shared extensively on the social media platform between 2017 and 2020, the researchers found that around 75% of the shares were made without the posters clicking the link first. Among these no-click shares, political content from both ends of the spectrum was shared more often than politically neutral content.

The findings, which the researchers said suggest that social media users tend to merely read headlines and blurbs rather than fully engage with core content, appeared today (Nov. 19) in Nature Human Behaviour. While the data were limited to Facebook, the researchers said the findings could likely map to other social media platforms and help explain why misinformation can spread so quickly online.

“It was a big surprise to find out that more than 75% of the time, the links shared on Facebook were shared without the user clicking through first,” said corresponding author S. Shyam Sundar, Evan Pugh University Professor and the James P. Jimirro Professor of Media Effects at Penn State. “I had assumed that if someone shared something, they read and thought about it, that they’re supporting or even championing the content. You might expect that maybe a few people would occasionally share content without thinking it through, but for most shares to be like this? That was a surprising, very scary finding.”

Access to the Facebook data was granted via Social Science One, a research consortium hosted by Harvard University’s Institute for Quantitative Social Science focused on obtaining and sharing social and behavioral data responsibly and ethically. The data were provided in collaboration with Meta, Facebook’s parent company, and included user demographics and behaviors, such as a “political page affinity score.” This score was determined by external researchers identifying the pages users follow — like the accounts of media outlets and political figures. The researchers used the political page affinity score to assign users to one of five groups — very liberal, liberal, neutral, conservative and very conservative.

To determine the political content of shared links, the researchers in this study used machine learning, a form of artificial intelligence, to identify and classify political terms in the link content. They scored the content on a similar five-point political affinity scale, from very liberal to very conservative, based on how many times each affinity group shared the link.
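The share-weighted scoring described above can be illustrated with a short sketch. The study's actual implementation is not reproduced here; the function name, group labels, and numeric scale below are illustrative assumptions based on the five-point grouping the release describes.

```python
# Hypothetical sketch of scoring a link's political affinity as the
# share-weighted average of the affinity groups that shared it.
# All names and numbers are illustrative, not the study's actual code.

# Five-point scale, mirroring the user groups described in the study.
GROUP_SCORES = {
    "very_liberal": -2,
    "liberal": -1,
    "neutral": 0,
    "conservative": 1,
    "very_conservative": 2,
}

def content_affinity(shares_by_group: dict) -> float:
    """Average the group scores, weighted by how often each group shared."""
    total = sum(shares_by_group.values())
    if total == 0:
        return 0.0
    return sum(GROUP_SCORES[g] * n for g, n in shares_by_group.items()) / total

# Example: a link shared mostly by conservative-leaning users scores
# toward the conservative end of the scale.
score = content_affinity({
    "very_liberal": 10, "liberal": 40, "neutral": 50,
    "conservative": 300, "very_conservative": 100,
})
```

Under this toy scheme, a link shared evenly across all five groups would score near zero, while one shared almost exclusively by one end of the spectrum would approach that end of the scale.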

“We created this new variable of political affinity of content based on 35 million Facebook posts during election season across four years. This is a meaningful period to understand macro-level patterns behind social media news sharing,” said co-author Eugene Cho Snyder, assistant professor of humanities and social sciences at New Jersey Institute of Technology.

The team validated the political affinity of news domains, such as CNN or Fox, based on the media bias chart produced by AllSides, an independent company focused on helping people understand the biases of news content, and a ratings system developed by researchers at Northeastern University.

With these rating systems, the team manually sorted 8,000 links, first identifying them as political or non-political content. Then the researchers used this dataset to train an algorithm that assessed 35 million links shared more than 100 times on Facebook by users in the United States.
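The two-stage pipeline above — hand-label a small set, then train a model to label the full corpus — can be sketched with a toy, stdlib-only Naive-Bayes-style classifier. The study does not detail its actual model or features, so everything below (tokenization, smoothing, labels, seed examples) is an illustrative assumption, not the researchers' method.

```python
# Toy illustration of training on a small hand-labeled set and then
# classifying unseen text, loosely Naive-Bayes-style. This is a sketch,
# not the study's actual algorithm.
from collections import Counter
import math

def tokenize(text):
    """Naive whitespace tokenizer (a simplifying assumption)."""
    return text.lower().split()

def train(labeled):
    """labeled: list of (text, label) pairs. Returns per-label word counts."""
    counts = {"political": Counter(), "non_political": Counter()}
    for text, label in labeled:
        counts[label].update(tokenize(text))
    return counts

def classify(counts, text):
    """Pick the label whose word distribution best fits the text,
    using add-one smoothing (toy version)."""
    best, best_lp = None, -math.inf
    for label, c in counts.items():
        total = sum(c.values())
        lp = sum(math.log((c[w] + 1) / (total + 1)) for w in tokenize(text))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# A tiny hand-labeled seed set, standing in for the study's 8,000 links.
seed = [
    ("senate passes election reform bill", "political"),
    ("governor signs new voting law", "political"),
    ("local team wins championship game", "non_political"),
    ("new recipe for summer salad", "non_political"),
]
model = train(seed)
label = classify(model, "house votes on election bill")
```

The design choice here mirrors the release's description: human judgment is spent only on a small training set, and the cheap trained model scales the labeling to millions of links.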

“A pattern emerged that was confirmed at the level of individual links,” Snyder said. “The closer the political alignment of the content to the user — both liberal and conservative — the more it was shared without clicks. … They are simply forwarding things that seem on the surface to agree with their political ideology, not realizing that they may sometimes be sharing false information.”

The findings support the theory that many users superficially read news stories based just on headlines and blurbs, Sundar said, explaining that Meta also provided data from its third-party fact-checking service — which identified that 2,969 of the shared URLs linked to false content.

The researchers found that these links were shared more than 41 million times without being clicked. Of these no-click shares of false content, 76.94% came from conservative users and 14.25% from liberal users. The researchers explained that the vast majority — up to 82% — of the links to false information in the dataset originated from conservative news domains.

To cut down on sharing without clicking, Sundar said that social media platforms could introduce “friction” to slow the share, such as requiring people to acknowledge that they have read the full content prior to sharing.
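The "friction" idea can be sketched as a simple gate: the share goes through only if the user has opened the link, or explicitly acknowledges sharing it unread. This is a hypothetical illustration of the mechanism Sundar describes, not any real platform's API.

```python
# Hypothetical sketch of share "friction": block the share until the user
# has clicked the link or explicitly confirms sharing without reading.
# Function and parameter names are illustrative assumptions.
def allow_share(clicked_urls: set, url: str, acknowledged_unread: bool) -> bool:
    """Permit the share if the link was opened, or the user confirmed anyway."""
    return url in clicked_urls or acknowledged_unread

# A user who never opened the link is blocked until they acknowledge;
# a user who clicked through shares without extra friction.
blocked = allow_share(set(), "https://example.com/story", False)
allowed = allow_share({"https://example.com/story"}, "https://example.com/story", False)
```

Even this trivial check adds the pause the researchers suggest: the user must either read the content or consciously admit they have not.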

“Superficial processing of headlines and blurbs can be dangerous if false data are being shared and not investigated,” Sundar said, explaining that social media users may feel that content has already been vetted by those in their network sharing it, but this work shows that is unlikely. “If platforms implement a warning that the content might be false and make users acknowledge the danger in doing so, that might help people think before sharing.”

This wouldn’t stop intentional misinformation campaigns, Sundar said, and individuals still have a responsibility to vet the content they share.

“Disinformation or misinformation campaigns aim to sow the seeds of doubt or dissent in a democracy — the scope of these efforts came to light in the 2016 and 2020 elections,” Sundar said. “If people are sharing without clicking, they’re potentially playing into the disinformation and unwittingly contributing to these campaigns staged by hostile adversaries attempting to sow division and distrust.”

So, why do people share without clicking in the first place?

“The reason this happens may be because people are just bombarded with information and are not stopping to think through it,” Sundar said. “In such an environment, misinformation has more of a chance of going viral. Hopefully, people will learn from our study and become more media literate, digitally savvy and, ultimately, more aware of what they are sharing.”

Other collaborators on this paper include Junjun Yin and Guangqing Chi, Penn State; Mengqi Liao, University of Georgia; and Jinping Wang, University of Florida.

The Social Science Research Council, New York, supported this research.
