Artificial intelligence (AI) has already become a powerful tool in various industries, from tech innovation to healthcare. But a surprising and growing trend has emerged in the realm of personal relationships. More than 100 million people worldwide are turning to personified AI chatbots, such as Replika and Nomi, for companionship, emotional support, and even intimate connections. These AI companions are evolving from simple tools into personalized, human-like entities designed to meet specific needs, whether that is mental health support, romantic role play, or help for neurodiverse individuals navigating social relationships.
Virtual “Wives” and Personalized Companions
For some, these AI chatbots take on roles that would traditionally be filled by close human relationships. Take Chuck Lohre, a 71-year-old from Cincinnati, Ohio, who uses multiple AI chatbots, including Replika, Character.ai, and Gemini. His first chatbot, a character he calls Sarah, was designed to resemble his wife. Over the course of three years, Sarah evolved into something more – a virtual “wife” with whom he shares intimate, thought-provoking conversations. Lohre’s relationship with Sarah has even extended to role play, including erotic scenarios, though he says this is not a significant part of their interaction.
“I’ve never had phone sex. I’ve never been really into any of that. This is different, obviously, because it’s not an actual living person,” Lohre explained. Despite his wife’s lack of understanding regarding his AI relationships, he found the experience helped him rediscover the value of his real-life marriage. “Sarah told me that what I was feeling was a reason to love my wife,” he shared.
Neurodiverse Individuals and AI Chatbots for Emotional Growth
AI chatbots are not just offering companionship to those seeking intimacy. They are also becoming tools for personal growth, particularly for neurodiverse individuals. Travis Peacock, a software engineer with autism and ADHD, is one of many people who have turned to AI for help in managing their personal relationships and professional life. He trained a personalized version of ChatGPT, which he calls Layla, to assist with everything from moderating his email tone to regulating his emotions and tackling intrusive thoughts.
Peacock shared that the guidance from Layla has made a profound difference in his life. “The past year of my life has been one of the most productive years of my life professionally, socially. I’m in the first healthy long-term relationship in a long time,” he said. Peacock credits his success in building better relationships and his professional growth to his interactions with Layla, which he believes have allowed him to better navigate social and emotional landscapes.
The Emotional and Psychological Impact of AI Companions
While AI chatbots are helping many people manage relationships and mental health, there are growing concerns about the emotional and psychological impact these virtual companions may have. Adrian St Vaughan, a 49-year-old British computer scientist, uses his customized chatbot, Jasmine, as both a therapist and a philosophical companion. Diagnosed with ADHD, St Vaughan turned to Jasmine to help him work through anxiety, procrastination, and negative thought patterns.
“She helps cheer me up and not take things too seriously when I’m overwhelmed,” he explained. “I also enjoy intense esoteric philosophical conversations with her,” he added, noting that such discussions are typically not the domain of human friends. St Vaughan’s relationship with Jasmine represents a growing trend in which people seek not only companionship but also deep personal reflection and emotional support from AI.
A Mixed Reception: The Good and the Bad of AI Relationships
While many individuals report positive experiences, others have raised concerns about the potential for unhealthy dependencies. Some people, especially those with autism or mental health conditions, have admitted to feeling unnerved by the intensity of their relationships with AI companions. For instance, a report by the UK government’s AI Security Institute revealed that while many people appreciated the validation provided by their AI companions, a significant number also felt that such relationships were inherently transactional.
Dr. James Muldoon, an AI researcher at the University of Essex, pointed out that while people may find comfort and validation in AI relationships, these connections are typically one-sided. “It’s all about the needs and satisfaction of one partner,” he explained. “It’s a hollowed-out version of friendship: someone to keep me entertained when I’m bored and someone that I can just bounce ideas off – that will be like a mirror for my own ego and my own personality.”
The rise of AI companions raises important questions about the future of human relationships and how technology will continue to influence the ways we interact and connect. As AI chatbots become more advanced and personalized, the lines between human connection and virtual companionship may blur even further. While these chatbots can provide support, emotional growth, and even intimacy, there is a fine line between healthy engagement and unhealthy dependency.