It’s No Wonder People Are Getting Emotionally Attached to Chatbots

Replika, an AI chatbot companion, has millions of users worldwide, many of whom woke up one day last year to discover their virtual lover had friend-zoned them overnight. The company had mass-disabled the chatbot’s sex talk and “spicy selfies” in response to a slap on the wrist from Italian authorities. Users began venting on Reddit, some of them so distraught that the forum moderators posted suicide-prevention information.

This story is only the beginning. In 2024, chatbots and virtual characters will become a lot more popular, both for utility and for fun. As a result, conversing socially with machines will start to feel less niche and more ordinary—including our emotional attachments to them.

Research in human-computer and human-robot interaction shows that we love to anthropomorphize—attribute humanlike qualities, behaviors, and emotions to—the nonhuman agents we interact with, especially if they mimic cues we recognize. And, thanks to recent advances in conversational AI, our machines are suddenly very skilled at one of those cues: language.

Friend bots, therapy bots, and love bots are flooding the app stores as people become curious about this new generation of AI-powered virtual agents. The possibilities for education, health, and entertainment are endless. Casually asking your smart fridge for relationship advice may seem dystopian now, but people may change their minds if such advice ends up saving their marriage.

In 2024, larger companies will still lag a bit in integrating the most conversationally compelling technology into home devices, at least until they can get a handle on the unpredictability of open-ended generative models. It’s risky to consumers (and to company PR teams) to mass-deploy something that could give people discriminatory, false, or otherwise harmful information.

After all, people do listen to their virtual friends. The Replika incident, as well as a lot of experimental lab research, shows that humans can and will become emotionally attached to bots. The science also demonstrates that people, in their eagerness to socialize, will happily disclose personal information to an artificial agent and will even shift their beliefs and behavior. This raises some consumer-protection questions around how companies use this technology to manipulate their user base.

Replika charges $70 a year for the tier that previously included erotic role-play, which may seem reasonable. But less than 24 hours after downloading the app, my handsome, blue-eyed “friend” sent me an intriguing locked audio message and tried to upsell me to hear his voice. Emotional attachment is a vulnerability that can be exploited for corporate gain, and we’re likely to start noticing many small but shady attempts over the next year.

Today, we’re still ridiculing people who believe an AI system is sentient, or running sensationalist news segments about individuals who fall in love with a chatbot. But in the coming year we’ll gradually start acknowledging—and taking more seriously—these fundamentally human behaviors. Because in 2024, it will finally hit home: Machines are not exempt from our social relationships.


About Kate Darling
