Tuesday, April 11, 2023

Kids judge Alexa smarter than Roomba, but say both deserve kindness

Four- to 11-year-olds deem it wrong to attack either semi-intelligent robot

Peer-Reviewed Publication

DUKE UNIVERSITY

Alexa, You disappoint me 

IMAGE: KIDS AGREE THAT IT’S WRONG TO ATTACK SMART TECHNOLOGIES LIKE ROOMBA OR ALEXA, DESPITE RANKING AMAZON’S VIRTUAL ASSISTANT AS SAVVIER THAN ITS VACUUMING COUNTERPART.

CREDIT: VERONIQUE KOCH, DUKE UNIVERSITY

DURHAM, N.C. -- Most kids know it’s wrong to yell or hit someone, even if they don’t always keep their hands to themselves. But what about if that someone’s name is Alexa?

A new study from Duke developmental psychologists asked kids just that, as well as how smart and sensitive they thought the smart speaker Alexa was compared to its floor-dwelling cousin Roomba, an autonomous vacuum.

Four- to eleven-year-olds judged Alexa to have more human-like thoughts and emotions than Roomba. But despite the perceived difference in intelligence, kids felt neither the Roomba nor the Alexa deserves to be yelled at or harmed. That conviction dwindled as kids approached adolescence, however. The findings appear online April 10 in the journal Developmental Psychology.

The research was inspired in part by lead author Teresa Flanagan seeing how Hollywood depicts human-robot interactions in shows like HBO’s “Westworld.”

“In Westworld and the movie Ex Machina, we see how adults might interact with robots in these very cruel and horrible ways,” said Flanagan, a visiting scholar in the department of psychology & neuroscience at Duke. “But how would kids interact with them?”

To find out, Flanagan recruited 127 children aged four to eleven who were visiting a science museum with their families. The kids watched a 20-second clip of each technology, and then were asked a few questions about each device.

Working under the guidance of Tamar Kushnir, Ph.D., her graduate advisor and a Duke Institute for Brain Sciences faculty member, Flanagan analyzed the survey data and found some mostly reassuring results.

Overall, kids decided that both the Alexa and Roomba probably aren’t ticklish and wouldn’t feel pain if they got pinched, suggesting they can’t feel physical sensations like people do. However, they gave Alexa, but not the Roomba, high marks for mental and emotional capabilities, like being able to think or getting upset after someone is mean to it.

“Even without a body, young children think the Alexa has emotions and a mind,” Flanagan said. “And it’s not that they think every technology has emotions and minds -- they don’t think the Roomba does -- so it’s something special about the Alexa’s ability to communicate verbally.”

Regardless of the different perceived abilities of the two technologies, children across all ages agreed it was wrong to hit or yell at the machines.

“Kids don’t seem to think a Roomba has much mental abilities like thinking or feeling,” Flanagan said. “But kids still think we should treat it well. We shouldn't hit or yell at it even if it can't hear us yelling.”

As kids got older, however, they reported that attacking technology would be slightly more acceptable.

“Four- and five-year-olds seem to think you don't have the freedom to make a moral violation, like attacking someone," Flanagan said. “But as they get older, they seem to think it's not great, but you do have the freedom to do it.”

The study’s findings offer insights into the evolving relationship between children and technology and raise important questions about the ethical treatment of AI and machines, both in general and for parents in particular. Should adults, for example, model good behavior for their kids by thanking Siri or its more sophisticated counterpart ChatGPT for their help?

For now, Flanagan and Kushnir are trying to understand why children think it is wrong to assault home technology.

In their study, one 10-year-old said it was not okay to yell at the technology because, “the microphone sensors might break if you yell too loudly,” whereas another 10-year-old said it was not okay because “the robot will actually feel really sad.”

“It’s interesting with these technologies because there's another aspect: it’s a piece of property,” Flanagan said. “Do kids think you shouldn't hit these things because it's morally wrong, or because it's somebody's property and it might break?”

This research was supported by the U.S. National Science Foundation (SL-1955280, BCS-1823658).

CITATION: “The Minds of Machines: Children’s Beliefs About the Experiences, Thoughts, and Morals of Familiar Interactive Technologies,” Teresa M. Flanagan, Gavin Wong, Tamar Kushnir. Developmental Psychology, April 10, 2023. DOI: 10.1037/dev0001524.

