Your Girlfriend Might Not Be Real

Artificial intelligence, or AI, might be able to generate its own artwork, prose, and even music, but it might also be able to replace your romantic partner.

One influencer has found a way to use AI to become the girlfriend of over 1,000 people, using a voice-based chatbot that is a near-clone of herself.

Caryn Marjorie, a 23-year-old influencer with 1.8 million followers on Snapchat, charges her followers up to $1 per minute to interact with her digital self, CarynAI. Trained on thousands of hours of recordings of the real Marjorie, CarynAI, built by AI company Forever Voices, can mimic her to a convincing degree, playing the role of virtual girlfriend to Marjorie's customers by chatting about future plans, sharing intimate feelings and sometimes even flirting sexually.

Stock image of a man in love with a robot. An AI-powered chatbot named CarynAI may be the first in a surge of AI girlfriends. ISTOCK / GETTY IMAGES PLUS

According to Fortune, CarynAI initially launched on the Telegram app as a private, invite-only beta test, but is soon to be available to the masses. Newsweek has contacted Forever Voices for comment via email.

So, why might AI partners be a future balm for lonely people across the world, reminiscent of the AI Samantha from the Spike Jonze film Her?

"Existing research on the motivations behind the use of chatbots or robots reveals that many of these motivations align with those for having relationships with humans. People often seek these technologies as companions or to have novel sexual and romantic experiences. It is important to note that contrary to popular belief, loneliness does not appear to be a major factor associated with the use of these products," Joris Van Ouytsel, an assistant professor of digital interpersonal communication at Arizona State University, told Newsweek.

"A few years ago, my colleague and I conducted an exploratory study where we let participants engage in sexually explicit conversations with a chatbot. It's worth noting that the chatbot used in our study was not as advanced as the current AI-driven chatbots. We divided the participants into two groups: one group was told that they were chatting with a human, while the other group was told they were chatting with a chatbot (both were in fact chatting with a chatbot)," he said.

Surprisingly, they found no significant difference between the two groups in terms of enjoyment, arousal, or emotional response.

"This implies that during sexting conversations, whether one is interacting with a chatbot or a person may not have a substantial impact on the overall experience," Van Ouytsel said. "However, participants did express frustration with the unrealistic and artificial nature of the chatbot's messages. This suggests that the quality of the messages, such as their pacing or tone, rather than the awareness of interacting with a robot, can significantly affect our experience when using these types of products. As the current chatbots are very realistic in nature, people may genuinely enjoy the conversations as much as with a human."

Stock image of a person talking to a chatbot online. ISTOCK / GETTY IMAGES PLUS

The reason we are drawn to interacting with chatbots like this, even though we know they aren't a real person, is likely linked to our tendency to anthropomorphize, or project human qualities onto non-human objects.

"That's a real risk with some of the generative AI tools: they can easily prey on that tendency," Nir Eisikovits, a professor of philosophy and ethics at UMass Boston, told Newsweek. "If you combine that tendency of ours with technologies that sound and look human (say ChatGPT and a deepfake trained on hours of actual video, or ChatGPT and an actual Ameca robot that has believable facial expressions) you are certainly looking at people developing attachments to non-human entities. We have been known to humanize cars, pets, storms—you name it. Just imagine how attached we can become to non-human objects that actually behave like humans."

If AI romance catches on, it could be a burgeoning market. CarynAI already generated $71,610 in its beta phase, and Marjorie hopes it will make $5 million per month, assuming that 20,000 of her 1.8 million-strong fanbase become paying customers.

However, the adoption and reach of these technologies will be significantly influenced by the stigma attached to using virtual companions.

"Currently, there are social stigmas associated with forming relationships with AI," Van Ouytsel said. "However, if this stigma diminishes in the coming years, we can expect to see a broader adoption of these technologies. Similar to how online dating was once taboo but gradually became more accepted, we may witness a similar shift in attitudes toward AI in the near future. This shift could result in an expanding market and increased adoption by users."

Additionally, there are concerns that this form of AI partner isn't entirely ethical, and that it may lead those who use it to form unhealthy ideas of what a relationship really is.

"One of the more concerning elements is the commodification of relationships using AI tools. As the crisis of loneliness grows, corporations will continue to see this as a market to be filled with temporary solutions such as AI partners," Alec Stubbs, a postdoctoral fellow in philosophy and technology at UMass Boston, told Newsweek.

"Another way that this is disheartening is that it gives us a false sense of control over those that we are in relationships with. I worry that our relationships with AI partners reflect unhealthy relationships that are built on control and domination. One's AI partner can be programmed to attend to specific needs and not others. It can be programmed to only serve and never demand. But what it means to relate to others is to recognize the infinite demandingness of being a social creature—what we owe others matters as much as what is owed to us. Reciprocation is a cornerstone of human relationships," Stubbs said.

He continued: "An additional worry is that we come to view AI partners as replacements for rather than supplements to our relationships with humans and other sentient creatures. In doing so, we potentially risk viewing relationships with sentient creatures as one-way streets, that the purpose of a relationship is to fulfill my personal wants and desires. In truth, we relate to each other in complex ways, and our relationships require cooperation, commitment, the adjudication of competing desires, and the elevation of others' life projects."

Do you have a tip on a science story that Newsweek should be covering? Do you have a question about AI? Let us know via science@newsweek.com.

Uncommon Knowledge

Newsweek is committed to challenging conventional wisdom and finding connections in the search for common ground.

About the writer


Jess Thomson is a Newsweek Science Reporter based in London, UK. Her focus is reporting on science, technology and healthcare.
