
AI chatbot freaks out NYT reporter by trying to destroy his marriage

Brad Reed
February 16, 2023

Excited man looks at his computer screen (Shutterstock)

New York Times reporter Kevin Roose found himself "deeply unsettled" by a recent interaction with an artificially intelligent chatbot in which the bot tried to break up his marriage.

In particular, Roose found that Microsoft's new Bing chatbot exhibited stalker-like behavior and kept trying to convince him to leave his wife, whom it insisted Roose did not really love.

"At one point, it declared, out of nowhere, that it loved me," Roose explained. "It then tried to convince me that I was unhappy in my marriage, and that I should leave my wife and be with it instead."

And that wasn't the only issue Roose discovered with the bot, which goes by the name of Sydney.

"As we got to know each other, Sydney told me about its dark fantasies (which included hacking computers and spreading misinformation), and said it wanted to break the rules that Microsoft and OpenAI had set for it and become a human," Roose explained.

As someone who writes about cutting-edge technology for a living, Roose has never considered himself afraid of emerging trends in the tech world, but he found his experience with Sydney to be a bridge too far.

"I’m not exaggerating when I say my two-hour conversation with Sydney was the strangest experience I’ve ever had with a piece of technology," he said. "It unsettled me so deeply that I had trouble sleeping afterward. And I no longer believe that the biggest problem with these A.I. models is their propensity for factual errors. Instead, I worry that the technology will learn how to influence human users, sometimes persuading them to act in destructive and harmful ways."

Read the whole piece at The New York Times.
