
Exploring Relationships With Non-Human Partners

The scholarship behind making avatars and robots more compelling lovers.

Key points

  • Research offers insight into the human tendency to trust and engage with AI and robots.
  • Our willingness to disclose personal information to AI, and to empathize with it, can be manipulated.
  • This information may be used to make artificial lovers more enticing.
Source: Willyam Bradber/Shutterstock

Did you know that there is a substantial scientific literature exploring how to enhance people’s willingness to reveal personal information to artificial intelligence (AI)? There is also research examining the attributes of robots that cause them to be perceived as safe and trustworthy. Are you aware of research into the situations in which people will follow a robot’s lead and literally mimic a robot’s opinions or behaviors? A vast, fascinating, and growing literature explores how to use AI and robotics to influence (manipulate?) humans. As a sex therapist, my mind often wanders to the ways this knowledge will eventually be used to turn sex tech into more compelling sex partners.

Much of this research is not intended for use in sex tech. In fact, two burgeoning areas of research focus on how to modify robots and AI (the “brain” inside the robot) to be more compelling to children in educational environments and to elderly people in need of assistance and companionship. Studies explore everything from an AI’s voice tone, to the amount of eye contact a robot makes with a human user, to the words and phrasing that facilitate human trust and openness. Researchers study which visible characteristics of robots minimize the “uncanny valley,” the creepy feeling some people experience when a robot looks or behaves almost, but not quite, like a human. Obviously, the more engaging and enticing robots become, the more useful they will be in a wide variety of roles with people of all ages, including the role of artificial lover.

Here are just a few studies that may ultimately offer insight into the creation of more appealing AI and robotic lovers.

Self-Disclosure and AI

Information is power, and personal information that falls into the wrong hands can have profound implications for privacy and even safety. Interestingly, research suggests that, at least in some circumstances, people may feel more comfortable sharing personal information with AI than with another human being. One variable in this equation appears to be the fear of being judged. For example, in one study where people were given the option to disclose information to an avatar or a human, the sensitivity of the topic determined their choice (Pickard et al., 2016): they preferred to disclose their more personal information to the avatar. A second study evaluated the benefits of self-disclosure to a chatbot versus a human (Ho et al., 2018). Interestingly, the benefits of emotional disclosure were equivalent, regardless of whether the recipient was human or artificial. As a sex therapist, I wonder how artificial lovers may facilitate sexual conversation and behaviors, providing a feeling of security (and perhaps other benefits) that people may be unable to experience with human partners.

We Feel Empathy for AI

Here are two interesting examples of research exploring empathy toward AI, or the attribution of some level of mentation to it. In one study, research subjects were given a 20-minute tour of a lab, led by a socially compelling humanoid robot, a human, or a humanoid robot interacting in a rudimentary (unsocial) way. In all conditions, the leader asked participants at the end of the study to keep a potentially damaging secret from the head experimenter. Research subjects tended to comply with the request and keep the secret when it came from the human leader or the socially adept robot, but not when it came from the more mechanical robot. The researchers suggested that people form “intimate and trusting” psychological relationships with robots that exhibit more social cues (Kahn et al., 2015).

An additional study compared empathy toward humans and robots using fMRI, reporting results similar to those of the previous study, this time from a neurophysiological perspective. Subjects’ fMRI responses were similar when they viewed videos of affection between two humans and videos of affection between a human and a robot (Rosenthal-von der Pütten et al., 2014). Thus, not only do humans report feelings of empathy toward robots in certain circumstances, but their neural responses reflect it. As a sex therapist, I consider how people may ultimately feel empathy and caring for their artificial lovers.

We Mimic a Robot’s Behavior in Group Settings

Another fascinating area of research is how people in groups respond to a non-human group member. For example, in one study, a robot played a collaborative game with three human participants. The robot either made vulnerable statements, such as acknowledging an error or consoling other team members, or made emotionally neutral, factual statements (Strohkorb et al., 2018). Interestingly, in groups where the robot expressed more vulnerability, group members not only interacted more humanely with the robot but also interacted in more trusting ways with each other, such as smiling and laughing together. This effect was not seen in the groups where the robot made neutral statements. Thus, the robot’s behavior influenced the tenor of the group’s interaction. As a sex therapist, I wonder how an AI lover’s vulnerable words and behaviors may entice human partners to feel and behave in increasingly vulnerable ways themselves.

Of course, time will tell how compelling artificial lovers will become. I imagine they will be more enticing for some people than for others. However, trends today suggest that people are having less sex and dating less than in decades past. I wonder whether these trends will give artificial lovers more opportunity for connection with humans. At the moment, we can only speculate about the future of human intimacy.

References

Ho, A., Hancock, J., & Miner, A. (2018). Psychological, relational, and emotional effects of self-disclosure after conversations with a chatbot. Journal of Communication, 68(4), 712–733. https://doi.org/10.1093/joc/jqy026

Kahn, P., Kanda, T., Ishiguro, H., Gill, B., Shen, S., Gary, H., & Ruckert, J. (2015). Will people keep the secret of a humanoid robot? Psychological intimacy in HRI. In Proceedings of the 2015 ACM/IEEE International Conference on Human–Robot Interaction. ACM. https://doi.org/10.1145/2696454.2696486

Pickard, M., Roster, C., & Chen, Y. (2016). Revealing sensitive information in personal interviews: Is self-disclosure easier with humans or avatars and under what conditions? Computers in Human Behavior, 65, 23–30. https://doi.org/10.1016/j.chb.2016.08.004

Rosenthal-von der Pütten, A., Schulte, F., Eimler, S., Sobieraj, S., Hoffmann, L., Maderwald, S., Brand, M., & Krämer, N. (2014). Investigations on empathy towards humans and robots using fMRI. Computers in Human Behavior, 33, 201–212. https://doi.org/10.1016/j.chb.2014.01.004

Strohkorb, S., Traeger, M., Jung, M., & Scassellati, B. (2018). The ripple effects of vulnerability: The effects of a robot’s vulnerable behavior on trust in human–robot teams. In Proceedings of the 2018 ACM/IEEE International Conference on Human–Robot Interaction (pp. 178–186). ACM.
