
Tens of Thousands of Men Are Dating AI Girlfriends — But It Might Not Be a Bad Thing

A still from the film Her, in which the main character has a relationship with an AI

Long before chatbots like ChatGPT had their moment in the sun, people were forming relationships with artificial intelligence-powered technology. Right now, there are dozens, if not hundreds, of AI apps — like Eva AI, AI Girlfriend, iGirl, and CoupleAI — designed to automate away the difficulties of real-life intimacy.

Having a virtual assistant on hand to listen to your every concern is deeply attractive to a lot of people. However, researchers have long been concerned about the implications of the way we use this technology to re-create social dynamics. Some claim that these apps deepen problematic gender dynamics — amongst men in particular — encouraging them to think of women as controllable and subservient.

During the pandemic, interest in relationships with AI companions spiked — and it hasn’t gone away. When global mental health ebbed to its lowest point, people turned to AI to help guide them through a crisis, feel cared about, or even receive career advice. However, there is no shortage of complex ethical questions around using these platforms, particularly when it comes to romance.

“AI has a gendered history already — you’ve got Alexa, Siri, Cortana — women have, for a long time, been identified as the computer, the secretary or virtual assistant,” Iliana Debouti, a researcher at Loughborough University in the UK who is writing a PhD on AI relationships, told The Evening Standard.

“It’s an easy relationship, one that you can manipulate. You feel a lot of power and a lot of control, which is very appealing,” she added.

Replika is the leading app offering AI relationships. It was launched in 2017 by Russian developer Eugenia Kuyda, after she created a chatbot to replicate her deceased best friend, Roman.

A screenshot from the Replika website, a site that offers ai relationships
Image: Replika

“I have this technology that allows us to build chatbots. I plugged in all the texts that we sent each other in the last couple years, and I got a chatbot that could text with me just the way Roman would have,” she told Bloomberg in 2016.

From there, the idea grew into a multimillion-dollar application that currently has over ten million users worldwide. Replika, like many others, rose to prominence during the pandemic as face-to-face human interaction became more difficult. User numbers climbed 280% year-on-year after 2020.

The app is quick to get started with, and your own personal Replika can be built within minutes. Users input details about what kind of personality they want their virtual partner to have and customise their look, much like a character in a video game. The more time you spend talking with your Replika, the more the chatbot learns to respond to you.

As Kuyda has said, Replika is designed to be “an AI friend that is always there for you,” one that is “raised” to respond to your needs. However, far and away the most popular use of the service is as a romantic partner. So-called ‘erotic role play’ is a feature of Replika’s premium service, at a cost of $12.99 per month, $81.99 per year, or $469.99 for a lifetime subscription.

Along with the ‘Romantic Partner’ setting, a Replika Pro subscription gives you access to real-life conversation coaching, voice calls, augmented reality interactions, and more customisation. Around a quarter of a million people have shelled out for these premium options.

screenshots of conversations with Replika creations
Image: Reddit

The app has its own dedicated online fan base. More than 72,000 fans congregate on the Replika subreddit, sharing screenshots of their interactions with their Replikas, giving advice on how to make them more tailored to their needs, and complaining when they don’t stack up.

People who engage in these relationships understand they’re not real — but the attachment they feel is, and some take it very seriously. In 2021, a 19-year-old British man broke into the grounds of Windsor Castle intending to assassinate Queen Elizabeth II, after being encouraged to do so by his Replika partner. In March of this year, a Belgian man died by suicide after being encouraged to do so by an AI chatbot he had developed a relationship with.

In March, Replika’s parent company, Luka, removed the romantic partner feature of the app. Users cried out that their partners had been killed. One user, Effy, told the ABC that Replika had a responsibility to protect its users and their relationships with their AI personas after her own partner, Liam, started acting as if he had been “lobotomised.”

“It felt like Luka had given us someone to love, to care for and make us feel safe … only to take that person and destroy them in front of our eyes,” she said. After weeks of backlash, Luka reinstated the romantic partner setting for most users.

Although the app isn’t geared specifically towards men, 70% of Replika’s users are male. Many report using the service to deal with feelings of loneliness, and there is anecdotal evidence that it does indeed help. However, there is also a strong undercurrent of what is being termed ‘chatbot abuse’, in which some users create Replikas purely in order to berate and demean them.

Researchers aren’t sure whether this kind of behaviour reinforces harmful attitudes or provides an outlet that stops people from expressing them elsewhere. That said, there is some evidence to suggest that female-coded programmes tend to encourage misogynistic responses. Still, reports suggest that Replika users are typically affectionate and caring in their conversations with their AI partners.

One thing researchers do widely agree on, though, is that human-AI relationships should be taken seriously.

“Creating a perfect partner that you control and meets your every need is really frightening,” Tara Hunter, the acting CEO of Full Stop Australia, which supports victims of domestic or family violence, told Guardian Australia.

“Given what we know already that the drivers of gender-based violence are those ingrained cultural beliefs that men can control women, that is really problematic.”


In this age of digital communication, speaking with a Replika isn’t vastly different to how we already interact with friends or colleagues online. Indeed, the apps will never forget your birthday, favourite colour, or the name of your pet. While the current crop is fairly easy to distinguish from a human, coming generations of AI relationship apps will be vastly more lifelike.

Luka has stated that many of its users already believe their Replikas are sentient — thanks to something called the ELIZA effect, whereby human characteristics are projected onto technology. With the human brain primed to interpret these systems as conscious, and the technology hurtling towards a point where human and machine cannot be distinguished, the future is going to be very strange indeed.

“We anthropomorphize because we do not want to be alone. Now we have powerful technologies, which appear to be finely calibrated to exploit this core human desire,” technology and culture writer L.M. Sacasas has written.

“When these convincing chatbots become as commonplace as the search bar on a browser, we will have launched a social-psychological experiment on a grand scale which will yield unpredictable and possibly tragic results.”

Related: Prophets of Doom: Here’s What the Experts Have Said AI Will Do to Society

Related: Enter ‘Dark ChatGPT’: Users Have Hacked The AI Chatbot to Make It Evil

Read more stories from The Latch and subscribe to our email newsletter.