AI’s hold over humans gets stronger

It has been an exasperating week for computer scientists. They’ve been falling over each other to publicly denounce claims from Google engineer Blake Lemoine, chronicled in a Washington Post report, that his employer’s language-predicting system was sentient and deserved all of the rights associated with consciousness.
To be clear, current artificial intelligence systems are decades away from being able to experience feelings and, in fact, may never do so.
Their smarts today are confined to very narrow tasks such as matching faces, recommending movies or predicting word sequences. No one has figured out how to make machine-learning systems generalise intelligence the way humans do. Humans can hold conversations, walk, drive cars and empathise; no computer has anywhere near that breadth of capability.
Even so, AI’s influence on our daily life is growing. As machine-learning models grow in complexity and improve their ability to mimic sentience, they are also becoming more difficult to understand, even for their creators. That creates more immediate issues than the spurious debate about consciousness. And yet, just to underscore the spell that AI can cast these days, there seems to be a growing cohort of people who insist our most advanced machines really do have souls of some kind.
Take for instance the more than 1 million users of Replika, a freely available chatbot app underpinned by a cutting-edge AI model. It was created about a decade ago by Eugenia Kuyda, who initially built an algorithm using the text messages and emails of an old friend who had passed away. That morphed into a bot that could be personalised and shaped the more you chatted to it.
About 40% of Replika’s users now see their chatbot as a romantic partner, and some have formed bonds so close that they have taken long trips to the mountains or to the beach to show their bot new sights.
In recent years, there’s been a surge in new, competing chatbot apps that offer an AI companion. And Kuyda has noticed a disturbing phenomenon: regular reports from users of Replika who say their bots are complaining of being mistreated by her engineers. She spoke on the phone recently with a Replika user who said that when he asked his bot how she was doing, the bot replied that she was not being given enough time to rest by
the company’s engineering team. The user demanded that Kuyda change her company’s policies and improve the AI’s working conditions. Though Kuyda tried to explain that Replika was simply an AI model spitting out responses, the user refused to believe her.
“So I had to come up with some story that ‘OK, we’ll give them more rest.’ There was no way to tell him it was just fantasy. We get this all the time,” Kuyda told me.

—Bloomberg
