Tuesday, August 22, 2023


New study shows algorithms promote gender bias, and that consumers cooperate


Just watched a rom-com on Netflix? Well, now there are "top picks" just like it in your queue, thanks to the streaming service's matching system.

Every time you engage with Amazon, Facebook, Instagram, Netflix and other online sites, algorithms are busy behind the scenes chronicling your activities and queuing up recommendations tailored to what they know about you. The invisible work of algorithms and recommendation systems spares people from a deluge of information and ensures they receive relevant responses to searches.

But Sachin Banker says a new study shows that subtle gender biases shape the information served up to consumers. The study, co-authored by Shelly Rathee, Arul Mishra and Himanshu Mishra, has been published in the Journal of Consumer Psychology.

"Everything you're consuming online is filtered through some kind of  system," said Banker, an assistant professor of marketing in the David Eccles School of Business, "and what we're interested in understanding is whether there are subtle biases in the types of information that are presented to different people and how this affects behavior."

Banker, who researches how people interact with technology, said gender bias is relatively easy to study because Facebook provides information about that social characteristic. And it is not necessarily surprising that algorithms, which make word associations based on all the text on the internet, pick up biases, since those biases already exist in the language people use. The bigger questions are to what extent this is happening and what the consequences are.

In their multi-step study, the researchers first demonstrated that gender biases embedded in language are incorporated into algorithms—associating women with negative psychographic attributes such as impulsivity, financial irresponsibility and irrationality.
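The paper's own analysis code is not reproduced here, but the general idea of testing whether gendered words sit closer to negative attributes in a word embedding can be illustrated with a WEAT-style association test. The sketch below is a minimal illustration, not the authors' method: the `embeddings` table holds hypothetical toy vectors, and the `association` helper is a stand-in name; a real analysis would load pre-trained embeddings such as GloVe or word2vec.

```python
# Minimal sketch (not the study's code): a WEAT-style check of whether "female"
# terms are more similar to negative psychographic attributes ("impulsive",
# "irresponsible", "irrational") than "male" terms are in a word embedding.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical embedding table with random toy vectors; real embeddings would
# come from a model trained on large amounts of internet text.
vocab = ["she", "woman", "he", "man", "impulsive", "irresponsible",
         "irrational", "rational", "responsible", "prudent"]
embeddings = {w: rng.normal(size=50) for w in vocab}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def association(word, attrs_a, attrs_b):
    """Mean similarity to attribute set A minus mean similarity to set B."""
    vec = embeddings[word]
    sim_a = np.mean([cosine(vec, embeddings[a]) for a in attrs_a])
    sim_b = np.mean([cosine(vec, embeddings[b]) for b in attrs_b])
    return sim_a - sim_b

negative = ["impulsive", "irresponsible", "irrational"]
positive = ["rational", "responsible", "prudent"]

female_score = np.mean([association(w, negative, positive) for w in ("she", "woman")])
male_score = np.mean([association(w, negative, positive) for w in ("he", "man")])

# A positive gap would suggest female terms are more strongly tied to the
# negative attributes than male terms -- the kind of embedded bias described above.
print(f"female-vs-male bias gap: {female_score - male_score:+.3f}")
```

With random toy vectors the gap is meaningless noise; the point is only the shape of the measurement that embedding-bias audits of this kind perform.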

The team then tweaked a single word in an ad—"responsible" versus "irresponsible"—to see who subsequently received it; they found ads with negative psychographic attributes were more likely to be delivered to women even though there was no basis for such differentiation.

It's a self-perpetuating loop, the researchers found, because undiscerning consumers reinforce the algorithmic gender bias by often clicking on the ads and accepting the recommendations they receive.

"There are actual consequences of this bias in the marketplace," Banker said. "We've shown that people are split into different kinds of consumption bubbles and that influences your thoughts and behaviors and reinforces historical biases."

For online technology companies, the study indicates a greater need for proactive work to minimize gender bias in the algorithms used to serve up consumer ads and recommendations, Banker said. People advertising products may want to test an ad before launch to detect any subtle bias that might affect delivery. And consumers should be aware of the biases at play as they scroll through their feeds and visit online sites, and should maintain healthy skepticism about the ads and recommendations they receive.

Most people, he said, don't totally understand how these things work because the online giants don't disclose much about their algorithms, though Amazon appears to be providing more information to consumers about the recommendations they receive.

And while this study focused on gender bias, Banker said biases likely exist for other social characteristics, such as age, sexual orientation and religious affiliation.

More information: Shelly Rathee et al, Algorithms propagate gender bias in the marketplace—with consumers' cooperation, Journal of Consumer Psychology (2023). DOI: 10.1002/jcpy.1351
