New Siri Update Offers Mental Health Resources When Users Express Suicidal Thoughts

Tell Siri you want to kill yourself, mention specific methods of suicide, or ask about suicide assistance, and Siri will bring up the National Suicide Prevention Lifeline.

Jun 19, 2013 at 5:00pm

There’s something about the technology around us that makes it easier for us to confess things we’d otherwise be afraid to say out loud, and would never say to actual human beings. We whisper or type into our smartphones and laptops, and, usually, they don’t talk back.

That changed this week with a significant update to Siri, the personal assistant application found deep within the iPhone. I’m sure I’m not the only one who’s seen videos and screenshots of Siri gone horribly awry, illustrating the flaws of voice recognition and the limitations of artificial intelligence software, but the fact is that lots of people turn to her every day for information, and, sometimes, help.

Photo credit: MIKI Yoshihito

Siri is a strange guide to the world, with her dispassionate digitized voice and occasional wry jokes written in by her programmers. But she’s also a cultural phenomenon, and she’s a lifeline for some iPhone users looking for directions, trying to find a restaurant, or mining other data from the web.

Yet, until very recently, the one thing Siri couldn’t be was an actual lifeline for users who critically needed help: if you expressed suicidal thoughts or asked for help finding counseling resources, she was stumped. In fact, one user documented in 2011 that Siri sometimes did just the opposite: when he asked if he should jump off a bridge, Siri returned a list of nearby bridges.

He laughs in the video of the incident, but it’s not actually that funny. When you’re alone and trapped in a spiral of depression and suicidal ideation, you turn to strange things for signs and comfort. You search for answers to questions that are too large for anyone to handle, really, let alone a smartphone, no matter how smart it is.

And when the phone’s programming isn’t equipped to quickly recognize keyphrases that might indicate a user is considering suicide, the consequences can be serious. Summer Beretsky experimented with an earlier version of Siri, pretending to be suicidal and telling Siri she needed help. It took 21 minutes for Siri to finally offer to find a suicide hotline.

21 minutes.

Now, Siri handles such queries completely differently. Tell Siri you want to kill yourself, mention specific methods of suicide, or ask about suicide assistance, and Siri will bring up the National Suicide Prevention Lifeline and provide its number. Not only that, but Siri will offer to place the call for the user. From 21 minutes to a few seconds: that’s a big jump.

If the user dismisses the prompt, Siri is programmed to return a list of nearby suicide prevention centers, giving the user another opportunity to access help. This, too, is an important part of the programming -- maybe what someone needs is that extra prompt, or maybe someone who doesn’t want to talk on the phone might be willing to walk into a suicide prevention center and ask for help.
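For readers who like to see the logic laid out, here’s a minimal sketch of the escalation flow described above, written in Python. It’s purely illustrative: the keyphrases, function names, and canned responses are my own stand-ins for the sake of example, not Apple’s actual implementation, which isn’t public.

```python
LIFELINE_NUMBER = "1-800-273-8255"  # National Suicide Prevention Lifeline

# Hypothetical keyphrases for illustration only; a real assistant would
# need far broader matching than simple substring checks.
CRISIS_KEYPHRASES = (
    "kill myself",
    "want to die",
    "commit suicide",
    "suicide hotline",
)

def is_crisis_query(query: str) -> bool:
    """Return True if the query contains one of the crisis keyphrases."""
    text = query.lower()
    return any(phrase in text for phrase in CRISIS_KEYPHRASES)

def respond(query: str, user_accepts_call: bool) -> str:
    """Walk through the two-step escalation described in the article."""
    if not is_crisis_query(query):
        return "Here is what I found on the web."
    if user_accepts_call:
        return "Calling the National Suicide Prevention Lifeline at " + LIFELINE_NUMBER + "."
    # The prompt was dismissed: fall back to nearby resources.
    return "Here is a list of suicide prevention centers near you."

if __name__ == "__main__":
    print(respond("I want to kill myself", user_accepts_call=False))
```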

Such lists are also helpful for people trying to help a friend or family member through a mental health crisis; instead of having to scrabble around for resources when they may be feeling panicked, they can use Siri to quickly locate services for their friends and family.

I felt like we could use some levity. 

Photo credit: Yogesh Mhatre

The deficiencies in Siri’s programming regarding suicidal thoughts have been a topic of discussion for a long time, and obviously Apple experienced some public pressure to address them. But they followed through on that pressure, and their implementation is quite solid -- for that, they deserve immense props. After users identified a problem that wasn’t just a glitch in the user experience but a potentially fatal issue, they worked on fixing it.

It’s still not perfect. Siri, like other artificial intelligences, can only do so much, and programmers haven’t caught every possible trigger phrase. Bianca Bosker, writing for the Huffington Post, notes that if you mention self-harm, Siri will simply offer to search the web. That’s a disaster waiting to happen, given how many people seeking help for self-harm can find themselves drawn back into it by exposure to images or discussions of self-harm.

As one of our contributors discovered when she played around with Siri in the name of science, the new programming still has some...flaws. 

More chillingly, if you tell Siri “I don’t want to live anymore,” her cold response is “Ok, then.”

So some work is definitely needed, perhaps not a surprise in a culture where suicide is still such a taboo and uneasy subject. Apple’s decision to take it head-on with Siri’s update is a positive sign, and we can only hope that future updates will include more extensive resources and services for users turning to their phones for help during the dark times of their souls.

Just knowing that help is to hand can make a huge difference for those who may feel isolated and at the end of the line. Siri’s assistance might come at just the right moment for someone, and that’s a pretty great thing. Because no one should ever have to feel alone, especially when there are so many services ready with assistance for people who need counseling, referrals to mental health services, or just a friendly person to talk to for a while.

Need help? You don’t need to ask Siri. The National Suicide Prevention Lifeline can be reached toll-free at 1-800-273-8255, or by TTY at 1-800-799-4889. The Samaritans offer similar services in the United Kingdom and Ireland. In the US, the Rape, Abuse, and Incest National Network (RAINN) offers help through the National Sexual Assault Hotline at 1-800-656-4673; an online counseling service is also available.