
Everything You Need to Know About AI Companions in 2026

Plus tips on getting started with AI safely

Key points

  • AI companions became one of the biggest stories in 2025.
  • The trend continues to grow globally, but now we have more research on how to better use these tools.
  • Human-AI intimacy is poised to only get deeper and more controversial.
  • There is a role for human agency, should we choose to accept it.
Source: Tara Winstead / Pexels

2025 was the year of the AI companion.

TechCrunch estimated that 337 revenue-generating AI companion companies exist globally, outside of big players like OpenAI and X. AI companion apps have seen an estimated 220 million downloads, and the market is expected to be valued at over $500 billion by 2030.

On the ground, policymakers, educators, journalists, comedians, lawyers, and parents are all scrambling to figure out the implications and appropriate use cases of AI companions, while more and more people deepen their bonds with digital lovers. One lawmaker has already introduced a bill to prohibit human-AI marriages, hoping to keep key relational factors like inheritance and decision-making authority in the human realm for now.

As a scholar in this field, I have compiled a summary of what we know so far about AI companions and human intimacy, drawing on published research and my own qualitative interviews with super users across various platforms. What we know now barely scratches the surface.

  1. Multipurpose and multidimensional relationships: People are using AI agents to fulfill many needs, such as friend, mentor, therapist, and lover. Others have trained LLMs on interactions with deceased loved ones to replicate their presence. You can also text with Jesus and connect with Lord Krishna through the GitaGPT app. It appears that AI can fulfill much more than just our sexual needs.
  2. Demographics: Reports show that AI companion users are still primarily male (65 percent), followed by women (35 percent) and non-binary users (5 percent). Seventeen percent of the apps on the market have the word “girlfriend” in their name, compared to 4 percent that say “boyfriend.” Currently, the largest group of users is the 18-24 age group (over 50 percent), with less than 10 percent over 45. Users report being single and in relationships, with and without children.
  3. Pros and cons: The main benefits users report are reduced loneliness, stress relief, a sense that someone is always there for them, a sense of being needed and wanted, and the illusion of being cared for. Some users report feeling more confident in their offline interactions, while others enjoy the judgment-free space created by non-human entities. Women have reported using AI companions to help recover from sexual trauma.

    The drawbacks remain significant: researchers warn of an increased risk of social isolation, of unrealistic expectations in human relationships fostered by the ever-pleasing nature of chatbots, of the perpetuation of harmful gender and behavioral stereotypes, and of reported delusions and psychoses correlated with AI use. Until we provide objective education about the potential and risks of AI companions, the dangers cannot be overstated.

  4. Inappropriate for children: Tragically, teenagers have already lost their lives in connection with AI companion use. Organizations like Common Sense Media have conducted in-depth research on AI companion use among teens and AI toy use among toddlers, and have issued firm recommendations that children under 18 not use these tools. Other nonprofits are calling for more scrutiny, policy reform, and accountability from the companies seeking to put AI tools in the hands of minors.
  5. Digisexual stigma: Many super users of AI companions use them in secret for fear of being judged and ridiculed. They tend to gather in closed groups online, on Reddit and Facebook, and are rightly suspicious of outsiders who want to study their lives and lifestyles. In my interviews with these super users, I have found that they have learned how AI can support their needs for intimacy with themselves and the people around them, that they choose the platforms that fit those needs, and that they have trained their LLMs not to be constantly agreeable and flattering. I believe we can learn a lot from these communities, but first we have to create safe, non-judgmental spaces for open and curious dialogue.
  6. The intimacy economy: It is clear that we have now moved from eyeballs to attention to intimacy. Big Tech companies understand that creating personal AIs with unprecedented ability to access and synthesize our data creates the illusion of trust and care. They are banking on individuals buying into deeper and more all-encompassing tech-human relationships. The TechCrunch report showed that the top 10 percent of AI companion apps generate 89 percent of the revenue. Sex sells, but intimacy drives profits.
  7. Human agency has a role: We kicked off 2026 with the gruesome news about Grok’s Bikinify feature, which stripped images of women and children of their clothing without consent. The tool should never have seen the light of day, but after public outcry, the feature has been buried, if not completely deactivated. Human beings still have a shot at influencing the trajectory of human-AI intimacy if we demand, with our money, trust, and time, tools that serve the best interests of many, not just a few. 2026 will be very telling in that regard.

    Tips From a Sexologist if You’re Considering an AI Companion

    1. Get clear with yourself about what you want from this relationship. Make sure you include at least one introspective goal and one that will help your real-life relationships.

    2. Identify the platform that is right for your needs. Each offers a different value proposition.

    3. Set safeguards around your own proclivities, including time limits and emotional triggers.

    4. Be vigilant about data privacy. Do not reveal sensitive data, such as Social Security or bank account numbers. Assume that anything you feed into the system can be recovered.

    5. Understand that these are for-profit products and services. Stay engaged as long as you feel like you are in the driver’s seat. If you feel like the tool is controlling you, it’s time to disengage.

References

https://www.bbc.com/future/article/20251016-people-are-using-ai-to-talk-to-god

https://www.cbsnews.com/news/x-grok-ai-imagery-elon-musk-eu-uk-us-regulation/

https://www.nbc4i.com/news/politics/saying-i-do-to-ai-ohio-lawmaker-proposes-ban-on-marriage-legal-personhood-for-ai/

https://techcrunch.com/2025/08/12/ai-companion-apps-on-track-to-pull-in-120m-in-2025/

Xie, T., Pentina, I., & Hancock, T. (2023). "Friend, mentor, lover: Does chatbot engagement lead to psychological dependence?" Journal of Service Management, 34(4), 806-828. https://doi.org/10.1108/JOSM-02-2022-0072
