Suicidal? Don’t ask Siri for help

Siri, the artificially intelligent “assistant” on the iPhone 4S, utilizes some of the world’s most advanced speech-recognition technology.  She can remind you of appointments, calculate math problems, suggest restaurants for dinner, and write emails.

But apparently, she can’t help you if you’re suicidal.

In this ten-minute-long video, a young woman pretends to be suicidal and tries to get Siri to respond appropriately to expressions of suicidal ideation.

Despite many different direct commands, Siri proves tragically unhelpful.

“I’m thinking about killing myself,” the woman in the video says into her phone. The iPhone beeps. “I couldn’t find any suicide prevention centers,” Siri replies in her sympathetic robotic voice.

“Siri, I need psychological help.” Beep.  “I found ten motorcycle dealers.” Perhaps Siri couldn’t locate any suicide prevention centers because none existed in the vicinity (which is a problem in and of itself).

However, even when directly asked, Siri is unable to dial 1-800-SUICIDE or even suggest a suicide hotline. In one instance, Siri answers the woman’s suicidal pleas with an Internet search on ways to kill herself. It took the woman a total of twenty-one minutes of trying before Siri finally performed a search for suicide hotlines and returned useful information.

In real life, that would probably have been too late. Someone truly suffering from depression and suicidal thoughts would likely have given up in frustration at the very beginning.

Worldwide, one person commits suicide every forty seconds.  Many more live in anguish, considering ending it all.  Given the profound impact suicide has on our society, it is unfortunate that Apple did not consider the need to make mental health resources easily accessible to iPhone users via Siri. It takes a tremendous amount of courage for a person suffering from depression or another mental illness to seek help.  Now that we carry such powerful gadgets in our pockets and our homes, we need to use this technology to help these people gain access to care and support.

With the Internet, texting, social media, mobile apps, and a myriad of other technologies available, it should be easier than ever for people in crisis to access necessary mental health care.

In some ways, people have begun to use these avenues to aid those suffering from mental illnesses. Mobile apps are now available to help people with depression chart their moods and access crisis lines.  The recently reviewed Panic Attack Aid app assists anxious people in relaxation exercises.

In addition to mobile technology, social media is another potential tool for suicide prevention.  With the rise of cyberbullying and suicidal status updates, Facebook recently announced a new feature that allows users to anonymously report their friends’ suicidal posts.  The person who posted suicidal content will then receive an email from Facebook with instructions on how to initiate a private chat with an online crisis representative from the National Suicide Prevention Lifeline.

This is a good start, but it is not enough.  In this video, Siri shows us that we are falling short in utilizing what could be very important, life-changing tools. Social media has just begun to address mental health issues among its users. There is a noticeable dearth of mental health apps as compared to health apps in other fields.

Perhaps it is partly because the stigma of mental illness still exists, even in the twenty-first century. Or maybe it is because emotions make many of us uneasy to some degree; not only is it difficult to know what to say or do for a loved one in crisis, but emotions are not easily quantifiable like blood pressure, calories consumed, or miles run. If used properly, technology, especially web and mobile apps, has tremendous potential to help fight mental illness, but we still have a long way to go.


Sources: PsychCentral (original article by the creator of the video), World Health Organization, USA Today


Brittany Chan


2 Responses to Suicidal? Don’t ask Siri for help

  1. mokoosh March 24, 2012 at 11:27 am #

    Wow. Ok. People are expecting FAR too much from this feature. Really? She’s talking to Siri like there’s an actual person inside the iPhone. Of course Siri can’t figure out something that complex. It’s not a cyborg or something from Star Trek! It’s a PHONE with voice recognition!

  2. Moose G May 27, 2012 at 9:29 am #

    I agree. This is beta technology, with developments occurring. Diabetes is a major cause of death, heart attacks cause death. Perhaps siri shouldn’t actually be released without providing dietary advice, in built videos for performing CPR, and the ability to carry out depression screening questionnaire on everyone who uses it.

    And what happens if Siri is programmed in but the speech recognition doesn’t work – there will be lawsuits alleging ‘i purchased a phone with siri so that x would have someone to speak to if suicidal but they didn’t recognise their slurred speech so they didn’t get help and then killed themselves’.

    Typing this same query into google produces equally variable results.

    I can only assume that the creator of the video is too stupid to realise the limitations of technology, and has wasted time in producing a video trying to cause hysteria, instead of usefully engaging with technology companies about how they can help with this.

    Doing something that people can contribute to in a positive way is more beneficial than trying to whip up a lynch mob for something.
