
Spending Too Much Time With AI Could Worsen Social Skills

New research suggests that emotional dependence on AI has potential downsides.

Key points

  • Too much time with AI chatbots could worsen social skills and interactions with people.
  • Human-AI relationships are often one-sided, focused on the human's needs rather than mutual emotional engagement.
  • While AI chatbots can reduce loneliness, time spent with AI should be balanced with socializing with people.
Source: Ron Lach / Pexels

AI social chatbots can help reduce loneliness, especially for people who would otherwise have little access to social support. In one recent survey, three percent of students who used Replika, a social chatbot, reported that Replika helped them stop suicidal thoughts. However, new research suggests that developing too much emotional dependence on AI social chatbots could have a dark side—potentially worsening your social skills with people.

The new study of 496 Replika users found that higher satisfaction and more emotional interaction between users and their chatbot were linked to worse real-life interpersonal communication. Many users turned to Replika for entertainment, for social purposes, and to have their own emotional needs met. Regular interaction with AI social chatbots had the greatest impact on users' emotions, with less impact on their cognition and behavior.

Emotional Experiences With AI May Hinder Real-World Engagement

Users of AI chatbots, even social ones, are not expected to grasp how an AI entity feels, and the relationship is often one-sided, centered on the needs of the human user. Researchers found that Replika users focused primarily on satisfying their own emotional needs rather than on mutual emotional engagement. This dynamic is reinforced by the design of AI companions, which are built to foster connection by addressing the user's emotional needs.

Despite the tendency to anthropomorphize AI, users are aware that AI lacks sentience and feelings. This lack of reciprocal emotional engagement matters because it does not mirror real-life human interactions, which require mutual emotional involvement. Exclusive emotional dependence on AI could harm users' relationships and interactions with other people, as users may not develop the ability to attend to the complex feelings of another person.

In contrast, negotiating needs, dealing with conflict, and understanding the emotional state of another person (a capacity known as mentalization) are essential social skills for navigating human relationships. Building relationships in real life requires two-way emotional engagement, including disruption and repair. Becoming emotionally dependent on today's social AI agents does not mirror building a two-way emotional relationship with another human.

Even if an AI agent were designed to mimic complex human emotional needs and responses, most human users know that these actions are mimicry, not feelings the AI actually has. Furthermore, the convenience of engaging with AI, which has no time boundaries or emotional needs of its own, could lead users to choose time with AI over the effort of scheduling and navigating more complex human relationships. If users are getting their emotional needs partially or fully met through AI companions, their motivation and incentive to reach out to other humans may decrease.

Emotional Dependence on AI Could Change How We Interact With Humans

Human-AI relationships can serve a meaningful purpose, but becoming exclusively emotionally dependent on them could have concerning effects on human interactions.

Media dependency, a concept first defined by researchers Melvin DeFleur and Sandra Ball-Rokeach in their 1976 paper "A Dependency Model of Mass-Media Effects," refers to the interdependence of media, audience, and society. The two types of media dependence are habitual dependence and spiritual dependence. An example of habitual media dependence is compulsive smartphone use. Spiritual dependence refers to the anxiety and emptiness people experience without their phones, a phenomenon known as nomophobia, or "no phone phobia."

Similarly, our growing reliance on AI chatbots and agents could create emotional dependence with psychological consequences.

The optimal amount of time with AI chatbots and agents remains an open area of research. A key factor is whether people are balancing time with AI with real-life human socialization.

The answer is likely nuanced. While AI-powered chatbots and social agents may reduce loneliness, especially for those with little access to other social outlets, connection with other humans is not fungible. It is crucial to balance AI interactions with time spent with people. Relying too heavily on AI can erode essential social skills, even though AI can offer valuable support when used in moderation.

Marlynn Wei, MD, PLLC © Copyright 2024

References

Maples B, Cerit M, Vishwanath A, Pea R. Loneliness and suicide mitigation for students using GPT3-enabled chatbots. Npj Ment Health Res. 2024 Jan 22;3(1):4. doi: 10.1038/s44184-023-00047-6. PMID: 38609517; PMCID: PMC10955814.

Yuan Z, Cheng X, Duan Y. Impact of media dependence: how emotional interactions between users and chat robots affect human socialization? Front Psychol. 2024 Aug 16;15:1388860. doi: 10.3389/fpsyg.2024.1388860. PMID: 39220396; PMCID: PMC11362029.
