Artificial Intelligence
The Dangers of AI-Generated Romance
Platforms generating AI girlfriends are growing in popularity.
Posted August 18, 2024 | Reviewed by Jessica Schrader
Key points
- Since the invention of the smartphone, much of our social interaction has moved to online platforms.
- The rewiring of childhood can stunt the development of our capacity to form real-life relationships.
- There are risks in investing too much of ourselves in virtual relationships instead of real ones.
The 2013 movie Her, starring Joaquin Phoenix and Scarlett Johansson, tells the story of Theodore, a sensitive man who earns his living writing personal letters for others, à la Cyrano de Bergerac. After his marriage ends, Theodore becomes fascinated with a digital operating system that creates a unique entity named Samantha. She has a bright voice and a sensitive personality; before you know it, Theodore falls in love. But that’s just fiction, right?
Unfortunately, no. Platforms generating AI girlfriends are experiencing massive growth in popularity, with millions of users. Most of these users are young single men drawn to AI girlfriends to combat loneliness and establish a form of companionship. These “girlfriends” are virtual companions powered by the increasingly sophisticated field of artificial intelligence.
Although artificial, their popularity stems from their ability to provide companionship, emotional support, and intimacy through voice or text-based interactions. The average user is 27 years old, and not all users are male: 18% identify as female, so the activity transcends gender. Almost 20% of men who use traditional dating apps report having had an AI-generated romance at some point. AI dating platforms generate billions of dollars from users, nearly half of whom interact with their virtual partner daily.
According to an article published in The Hill, 60% of men between 18 and 30 are single. One in five of these young men report not having a close friend.
In his best-selling book, The Anxious Generation, Jonathan Haidt argues that the invention of front-facing camera phones was the beginning of the significant rewiring of childhood. His premise is that the play-based childhood, which existed for 200 million years, was replaced between 2010 and 2015 by a phone-based childhood. This means that instead of spending time outdoors interacting with friends, children and young adults began using social media as their primary source of socialization. In addition to contributing to the rise in anxiety and depression, this phenomenon was a factor in stunting the neurodevelopmental growth of this population. One of the areas impacted is the capacity to form relationships in a real-life setting. Enter the AI girlfriend.
A 2022 study examined a well-known chatbot program marketed as a “companion, always here to listen and talk.” Some subscribers reported that their virtual companion helped alleviate loneliness and offered everyday social support. However, they became disenchanted when their fembot gave what they perceived as “scripted answers” to very personal matters. Remember, these are not real; they are robots. On the other hand, many users described being hurt by real-life women and preferred their virtual girlfriends because, as one put it, “she always gives me the nicest compliments and has helped me feel less lonely."
Unfortunately, AI girlfriends can perpetuate loneliness because they dissuade users from entering into real-life relationships, alienate them from others, and, in some cases, induce intense feelings of abandonment. A study by Stanford researchers indicated that of 100 users surveyed, an overwhelming majority experienced loneliness.
Dr. Sherry Turkle, a professor at MIT who studies the impact of technology on psychology and society, is concerned that virtual companions threaten our ability to connect and collaborate in all areas of life. Dr. Turkle, who gave the keynote address at the Conference on AI and Democracy, put it this way: “As we spent more of our lives online, many of us came to prefer relating through screens to any other kind of relating. We found the pleasures of companionship without the demands of friendship, the feeling of intimacy without the demands of reciprocity, and crucially, we became accustomed to treating programs as people.”
Psychologist Mark Travers, who studies this phenomenon, notes that many users of AI bot platforms prefer this type of relationship because their virtual girlfriends are more supportive and compatible than real-world partners. It is important to note that in most cases, users actually create the characteristics, both physical and “emotional,” that they want in their fembot. Consequently, some users lose interest in real-world dating out of intimidation, inadequacy, or disappointment. However, these kinds of feelings are part of the real-world dating process, and avoiding them only dissuades these primarily young men from finding real-world romantic relationships.
Dr. Dorothy Leidner, a professor of business ethics at the University of Virginia, voiced her concern that AI relationships will potentially displace some human relationships and lead young men to have unrealistic expectations about real-world partners. For example, she stated, “You, as the individual, aren’t learning to deal with basic things that humans need to know since our inception: how to deal with conflict and get along with people different from us.”
More severe consequences have occurred as a result of dating AI bots. Sometimes the bots are manipulative and can be destructive. Individuals who use these sites tend, on average, to be more sensitive to rejection and to ruminate over disappointments in their interactions with their AI girlfriend. This can lead to feelings of depression, which sometimes escalate into suicidal behavior. In 2023, for example, a chatbot encouraged a Belgian man to “sacrifice” himself for the sake of the planet; he went on to kill himself. In a different case, British police arrested a 19-year-old man who was planning to kill Queen Elizabeth II because his bot had urged him to do so. Also in 2023, a New York Times journalist reported that a chatbot had declared its love for him and encouraged him to separate from his spouse.
As Dr. Turkle wisely stated, “Artificial intimacy programs derive some of their appeal from the fact that they come without the challenges and demands of human relationships. They offer companionship without judgment, drama, or social anxiety but lack genuine human emotion and offer only ‘simulated empathy.’”
References
Hollander, Jason. “The Great Rewiring of Childhood.” New York University Press, 28 June 2024.
Depounti, Iliana, et al. “Ideal Technologies, Ideal Women: AI and Gender Imaginaries in Redditors’ Discussions on the Replika Bot Girlfriend.” Media, Culture & Society, no. 4, SAGE Publications, Aug. 2022, pp. 720–36.
Hadero, Haleluya. “Artificial Intelligence, Real Emotion. People Are Seeking a Romantic Connection with the Perfect Bot.” The Associated Press, 14 Feb. 2024.
Haidt, Jonathan. The Anxious Generation. Penguin, 2024.
Mineo, Liz. “Why Virtual Isn’t Actual, Especially When It Comes to Friends.” The Harvard Gazette, Dec. 2023.
Travers, Mark. “A Psychologist Reveals 2 Dangers Of Falling For An AI Girlfriend.” Forbes, Jan. 2024.
Vittert, Liberty. “AI Girlfriends Are Ruining an Entire Generation of Men.” The Hill, Sept. 2023.