Promoting Hope, Preventing Suicide

Research and advice on preventing teen and adult suicide

Siri Helps Prevent Suicide?

How Apple turned around an app

Last year, Apple came under fire for something Siri - source of driving directions, restaurant recommendations, and general voice-directed commands - couldn’t do. Siri couldn’t figure out suicide prevention.

To illustrate exactly how wrong Siri could be, mental health blogger Summer Beretsky made a video of herself talking to Siri, trying desperately to get connected to a suicide prevention resource. Siri couldn’t find a suicide prevention center or hotline and didn’t know how to respond when Summer said, “I’m having a mental health emergency.”

Now, as stated so aptly on The Huffington Post, Siri is taking a new approach to suicide: prevention.

Siri has been trained to recognize statements that suggest someone may be thinking about suicide and to refer that person to the National Suicide Prevention Lifeline (1-800-273-8255). She even offers to make the call.

When I asked friends what they thought about this update to Siri, some raised an issue that’s come up in the popular press since the announcement.

Is it really Siri’s role to provide this kind of referral? Can a gadget understand the nuances of human communication?

This point was echoed when I reached out to Summer Beretsky. She mentioned that some who viewed her video said that Siri, as a virtual being, isn’t the place to turn for help.

Truthfully, I can’t imagine one would really say the words “I’m suicidal” to an iPad. But I could imagine quite a few scenarios that might lead someone to ask Siri for help finding counseling, whether for themselves, a friend, or a family member.

It’s hard not to acknowledge how frequently we turn to technology-mediated sources of information for help:

  • “Dr. Google,” the first stop for many who have a question about any aspect of well-being, from “What do I do for a bee sting?” to “Am I depressed?”
  • Online forums, populated behind the screen by real people, but often real people without formal qualifications for giving advice
  • The obvious: our Facebook friends - people we (usually) know, though our interactions on Facebook may be quite different from those we’d have in real life

So, if you’d Google it, or post a question to an online forum, or crowdsource through Facebook, is it just a shade of difference to ask a virtual assistant for help finding a mental health resource?

Will Siri be good at understanding the nuances of human communication? I think we already know she’s not. Siri misinterprets often - just as humans do in the course of communicating with each other. Being able to program an app to give a helpful response most of the time is, in some ways, better than the natural, unprogrammed responses of humans.

It’s uncomfortable to acknowledge that we now use technology in ways that, before, we might have used each other. I certainly believe the experience of asking a living, breathing human for help won’t go the way of the phone book. But I think it’s a good thing that suicide prevention has become mainstream enough that Apple, as a technology giant and pace-setter, is no longer shying away from having a stake in it.

Copyright 2013, Elana Premack Sandler. All rights reserved.

Elana Premack Sandler, L.C.S.W., M.P.H., is a public health social worker specializing in violence and injury prevention and adolescent health promotion.
