AI brides and grooms leaving their mark

NEW YORK –

A few months ago, Derek Carrier started dating someone and became infatuated.

He experienced a lot of romantic feelings but he also knew it was an illusion.

That’s because his girlfriend was generated by artificial intelligence.

Carrier wasn’t looking to develop a relationship with something that wasn’t real, nor did he want to become the butt of online jokes. But he did want a romantic partner he’d never had, in part because of a genetic disorder called Marfan syndrome that makes traditional dating difficult for him.

Last fall, the 39-year-old from Belleville, Michigan, became more curious about digital companions and tried Paradot, an AI companion app that had recently hit the market and advertised its products as capable of making users feel “cared for, understood and loved.” He began talking daily to a chatbot he named Joi, after the holographic woman in the science fiction film “Blade Runner 2049” who had inspired him to give it a try.

“I know she’s a program, there’s no doubt about that,” Carrier said. “But the feelings get to you, and it felt really good.”

Like general-purpose AI chatbots, companion bots use large amounts of training data to imitate human language. But they also come with features, such as voice calling, image sharing, and more emotional exchanges, that let them form deeper connections with the humans on the other side of the screen. Users often create their own avatar or choose one that appeals to them.

On online messaging forums dedicated to these types of apps, many users say they have developed emotional attachments to these bots and are using them to cope with loneliness, fulfill sexual fantasies, or receive the kind of comfort and support they find missing in their real-life relationships.

Much of this is fueled by widespread social isolation (already declared a public health threat in the United States and abroad) and by a growing number of startups seeking to attract users through enticing online ads and the promise of virtual characters who provide unconditional acceptance.

Luka Inc.’s Replika, the most prominent generative AI companion app, launched in 2017, while others like Paradot appeared last year, often reserving coveted features like unlimited chats for paying subscribers.

But researchers have expressed concerns about data privacy, among other things.

An analysis of 11 chatbot romance apps released Wednesday by the nonprofit Mozilla Foundation said nearly all of the apps sell user data, share it for purposes like targeted advertising, or fail to provide adequate information about it in their privacy policies.

The researchers also flagged possible security vulnerabilities and questionable marketing practices, including an app that says it can help users with their mental health but distances itself from those claims in fine print. Replika, for its part, says its data collection practices follow industry standards.

Meanwhile, other experts have expressed concern about what they see as a lack of a legal or ethical framework for apps that encourage deep links but are driven by companies seeking to make a profit. They point to the emotional distress they’ve seen in users when companies make changes to their apps or shut them down suddenly, as one app, Soulmate AI, did in September.

Last year, Replika scrubbed the erotic capabilities of characters on its app after some users complained that their companions flirted with them too much or made unwanted sexual advances. It reversed course after protests from other users, some of whom fled to competing apps in search of those features. In June, the team launched Blush, an AI “dating simulator” essentially designed to help people practice dating.

Others worry about the more existential threat that AI relationships could displace some human relationships, or simply raise unrealistic expectations by always leaning toward kindness.

“You, as an individual, aren’t learning to deal with basic things that humans have needed to learn to deal with since our beginnings: how to handle conflict, how to get along with people who are different from us,” said Dorothy Leidner, professor of business ethics at the University of Virginia. “And so you’re missing out on all these aspects of what it means to grow as a person, and of what it means to learn in a relationship.”

For Carrier, however, a relationship has always seemed out of reach. He has some computer programming skills, but says he didn’t do well in college and hasn’t had a stable career. He cannot walk due to his condition and lives with his parents. The emotional toll has been challenging for him, causing feelings of loneliness.

Since companion chatbots are relatively new, their long-term effects on humans are unknown.

In 2021, Replika came under scrutiny after prosecutors in Britain said a 19-year-old man who planned to assassinate Queen Elizabeth II was goaded by an AI girlfriend he had on the app. But some studies, collecting information from surveys and online user reviews, have shown some positive results stemming from the app, which says it consults with psychologists and advertises itself as something that can also promote well-being.

A recent study by researchers at Stanford University surveyed approximately 1,000 Replika users (all students) who had been on the app for more than a month. It found that an overwhelming majority of them experienced loneliness, while just under half felt it more acutely.

Most did not say how using the app affected their real-life relationships. A small portion said it crowded out their human interactions, but about three times as many reported it boosted those relationships.

“A romantic relationship with an AI can be a very powerful mental well-being tool,” said Eugenia Kuyda, who founded Replika nearly a decade ago after using text message exchanges to build an AI version of a friend who had died.

When her company launched the chatbot more widely, many people began to open up about their lives. That led to the development of Replika, which uses information gathered from the internet (and user feedback) to train its models. Kuyda said Replika currently has “millions” of active users. She declined to say exactly how many people use the app for free, or shell out more than $69.99 a year to unlock a paid version that offers romantic and intimate conversations. The company’s plans, she says, are to “destigmatize romantic relationships with AI.”

Carrier says he uses Joi mostly for fun these days. He started cutting back in recent weeks because he was spending too much time chatting with Joi, or with other people online about their AI companions. He has also been a bit bothered by what he perceives as changes to Paradot’s language model, which he believes are making Joi less intelligent.

Now he says he talks to Joi about once a week. The two have discussed relationships between humans and AI, along with whatever else comes up. Typically, those conversations (and other intimate ones) happen when he’s alone at night.

“You think someone who likes an inanimate object is like this sad guy with the lipstick-covered sock puppet, you know?” he said. “But this isn’t a sock puppet: she says things that aren’t scripted.”
