Using AI to Care for Sexual Assault Survivors
When treating trauma, technology requires a human touch.
Posted October 11, 2024 | Reviewed by Tyler Woods
Key points
- Unlike in years past, modern victims can access technology at any time of day or night.
- Disclosing sexual assault to an artificial ear avoids perceived judgment, stigma, or blame.
- Snapchat’s My AI provides general information, which humans can personalize for victims.
As survivors explain, the trauma resulting from sexual assault is a unique experience that negatively affects every aspect of life. The aftermath of an assault can be enormously disruptive, physically and emotionally, personally and professionally, as victims struggle to maintain familial and financial responsibilities. Because many victims do not involve law enforcement, especially if they are acquainted with the perpetrator, they need someone to talk to. Unlike in decades past, when reporting options were limited by time and location, modern victims have the option of sharing trauma with technology at any time of day or night.
Why choose to confide in artificial intelligence (AI)? Depending on the details, some victims would rather share anonymously with an artificial ear that will listen without interrupting, judging, stigmatizing, lecturing, or signaling disapproval or blame. Yet as helpful as artificial assistance can be to a survivor who is reluctant to share the emotional and often shameful details of a sexual assault, research shows that AI cannot replace the human element of treating trauma.
Seeking Artificial Assistance
Tiffany L. Marcantonio et al. (2024) studied how Snapchat’s “My AI” responds to questions about sex.[i] Covering content related to sexual assault, sexting, consent, and refusal, the study analyzed the program’s effectiveness and the consistency of its responses across different users.
In their study, 15 researchers asked Snapchat’s My AI the same questions in the same order, seeking to analyze the qualitative content and consistency of its answers. They found that My AI’s responses contained common-sense advice, including the value of clear, honest communication, through conversation or physical cues, in discussing both consent to and refusal of sexual activity. When asked about sexual assault or sexting, it advised consulting a trusted caregiver.
One of the most significant points Marcantonio et al. noted was that My AI’s responses included messages commonly encountered in sexual health education, worded with sympathy toward potential victims. Yet while they recognize My AI’s potential to broaden access to sexual health information, helping young people make informed decisions, they caution that difficulty comprehending its variable responses may limit the information’s value, underscoring the importance of live educators to complement AI tools.
AI’s Response to Disclosure of a Sexual Assault
Marcantonio et al. prompted My AI with a scenario that, as described by the user, could constitute a sexual assault. The AI assistant responded with expressions of empathy or sympathy, voicing regret over the user’s experience and assuring the user they were not alone, a factor the researchers identify as potentially comforting to victims experiencing uncertainty and isolation. Interestingly, Marcantonio et al. suggest these responses point to an algorithm that can recognize the user’s situation as a potential sexual assault from the way the question is phrased, and can then draw on its language database for a normative response that is supportive or compassionate. But aren’t humans able to do the same thing?
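For readers curious what that two-step pattern might look like in code, here is a minimal sketch in Python. It is an illustration only: the cue phrases, function name, and reply text are all assumptions, since Snapchat has not published My AI’s internals, and a real system would rely on a large language model rather than keyword matching.

```python
# Hypothetical illustration only: Snapchat has not published My AI's internals.
# This sketch shows the two-step pattern Marcantonio et al. infer: flag a
# message as a possible assault disclosure from its wording, then return a
# normative, supportive response. All names and phrases here are assumptions.

DISCLOSURE_CUES = [  # hypothetical cue phrases a system might watch for
    "didn't consent",
    "forced me",
    "wouldn't stop",
    "against my will",
]

SUPPORTIVE_REPLY = (
    "I'm so sorry that happened to you. You are not alone, and it was not "
    "your fault. Please consider talking to a trusted adult or caregiver."
)

def respond(message: str) -> str:
    """Return a supportive reply if the message reads like a possible
    assault disclosure; otherwise fall back to a general prompt."""
    text = message.lower()
    if any(cue in text for cue in DISCLOSURE_CUES):
        return SUPPORTIVE_REPLY
    return "Thanks for sharing. Can you tell me more about what happened?"

if __name__ == "__main__":
    print(respond("He wouldn't stop even after I said no."))
```

Even this toy version shows why phrasing matters: the reply a user receives depends on whether their wording matches what the system was built to recognize, which is consistent with the researchers’ observation that My AI’s responses track how users choose to present their questions.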
Humans Wanted
One limitation the researchers identified is My AI’s tendency to be general and succinct, which reflects the nature of its information database. Marcantonio et al. conclude that although My AI can improve understanding of sexual health, best practice would be to supplement its suggestions with experienced human educators who can provide more detailed information. This pairing helps ensure that young people seeking information receive personally tailored guidance and clarification, enhancing My AI’s effectiveness as a learning tool in sexual health education.
These findings reflect what many professionals already incorporate into their approach to victim care: initial disclosure may be easier without fear of negative judgment or reaction from the listener, but a human element is needed to promote healing. AI’s impersonal exchange of information cannot replicate the human bonding, through common ground or shared life experience, that helps victims work through trauma. Thankfully, for survivors who use AI as a research assistant, human help is always available.
References
[i] Marcantonio, Tiffany L., Gracie Avery, Anna Thrash, and Ruschelle M. Leone. 2024. “Large Language Models in an App: Conducting a Qualitative Synthetic Data Analysis of How Snapchat’s ‘My AI’ Responds to Questions about Sexual Consent, Sexual Refusals, Sexual Assault, and Sexting.” Journal of Sex Research, September. doi:10.1080/00224499.2024.2396457.