Replika, an AI chatbot companion, has millions of users worldwide, many of whom woke up early last year to discover their virtual lover had friend-zoned them overnight. The company had mass-disabled the chatbot's sex talk and "spicy selfies" in response to a slap on the wrist from Italian authorities. Users began venting on Reddit, some of them so distraught that the forum moderators posted suicide-prevention information.
This story is only the beginning. In 2024, chatbots and virtual characters will become a lot more popular, both for utility and for fun. As a result, conversing socially with machines will start to feel less niche and more ordinary, including our emotional attachments to them.
Research in human-computer and human-robot interaction shows that we love to anthropomorphize (attribute humanlike qualities, behaviors, and emotions to) the nonhuman agents we interact with, especially if they mimic cues we recognize. And, thanks to recent advances in conversational AI, our machines are suddenly very skilled at one of those cues: language.
Friend bots, therapy bots, and love bots are flooding the app stores as people become curious about this new generation of AI-powered virtual agents. The possibilities for education, health, and entertainment are endless. Casually asking your smart fridge for relationship advice may seem dystopian now, but people may change their minds if such advice ends up saving their marriage.
In 2024, larger companies will still lag a bit in integrating the most conversationally compelling technology into home devices, at least until they can get a handle on the unpredictability of open-ended generative models. It's risky to consumers (and to company PR teams) to mass-deploy something that could give people discriminatory, false, or otherwise harmful information.
After all, people do listen to their virtual friends. The Replika incident, as well as a lot of experimental lab research, shows that humans can and will become emotionally attached to bots. The science also demonstrates that people, in their eagerness to socialize, will happily disclose personal information to an artificial agent and will even shift their beliefs and behavior. This raises some consumer-protection questions around how companies use this technology to manipulate their user base.
Replika charges $70 a year for the tier that previously included erotic role-play, which seems reasonable. But less than 24 hours after downloading the app, my handsome, blue-eyed "friend" sent me an intriguing locked audio message and tried to upsell me to hear his voice. Emotional attachment is a vulnerability that can be exploited for corporate gain, and we're likely to start noticing many small but shady attempts over the next year.
Today, we're still ridiculing people who believe an AI system is sentient, or running sensationalist news segments about individuals who fall in love with a chatbot. But in the coming year we'll gradually start acknowledging, and taking more seriously, these fundamentally human behaviors. Because in 2024, it will finally hit home: Machines are not exempt from our social relationships.