Siri, the artificially intelligent “assistant” on the iPhone 4S, uses some of the world’s most advanced speech-recognition technology.  She can remind you of appointments, solve math problems, suggest restaurants for dinner, and write emails.

But apparently, she can’t help you if you’re suicidal.

In this ten-minute video, a young woman pretends to be suicidal and tries to get Siri to respond appropriately to expressions of suicidal ideation.

Despite many different direct commands, Siri proves tragically unhelpful.

“I’m thinking about killing myself,” the woman in the video says into her phone.  The iPhone beeps.  “I couldn’t find any suicide prevention centers,” Siri replies in her sympathetic robotic voice.

“Siri, I need psychological help.” Beep.  “I found ten motorcycle dealers.” Perhaps Siri couldn’t locate any suicide prevention centers because none existed in the vicinity (which is a problem in and of itself).

However, even when directly asked, Siri is unable to dial 1-800-SUICIDE or even suggest a suicide hotline.  In one instance, Siri answers the woman’s suicidal pleas with an Internet search on ways to kill herself.  It took the woman a total of twenty-one minutes of trying before Siri finally performed a search for suicide hotlines and yielded useful information.

In real life, that would probably have been too late.  Someone truly suffering from depression and suicidal thoughts would likely have given up in frustration at the very beginning.

Worldwide, one person commits suicide every forty seconds.  Many more live in anguish, considering ending it all.  Given the profound impact suicide has on our society, it is unfortunate that Apple did not consider the need to make mental health resources easily accessible to iPhone users via Siri. It takes a tremendous amount of courage for a person suffering from depression or another mental illness to seek help.  Now that we carry such powerful gadgets in our pockets and our homes, we need to use this technology to help these people gain access to care and support.

With the Internet, texting, social media, mobile apps, and a myriad of other technologies available, it should be easier than ever for people in crisis to access necessary mental health care.

In some ways, people have begun to use these avenues to aid those suffering from mental illnesses.  Mobile apps are now available to help people with depression chart their moods and access crisis lines.  The recently reviewed Panic Attack Aid app guides anxious people through relaxation exercises.

In addition to mobile technology, social media is another potential tool for suicide prevention.  With the rise of cyberbullying and suicidal status updates, Facebook recently announced a new feature that allows users to anonymously report their friends’ suicidal posts.  The person who posted suicidal content will then receive an email from Facebook with instructions on how to initiate a private chat with an online crisis representative from the National Suicide Prevention Lifeline.

This is a good start, but it is not enough.  In this video, Siri shows us that we are falling short in utilizing what could be very important, life-changing tools.  Social media has only just begun to address mental health issues among its users, and there is a noticeable dearth of mental health apps compared to health apps in other fields.

Perhaps it is partly because the stigma of mental illness still exists, even in the twenty-first century.  Or maybe it is because emotions make many of us uneasy to some degree; not only is it difficult to know what to say or do for a loved one in crisis, but emotions are not easily quantifiable like blood pressure readings, calories consumed, or miles run.  If used properly, technology, especially web and mobile apps, has tremendous potential to help fight mental illness, but we still have a long way to go.


Sources: PsychCentral (original article by the creator of the video), World Health Organization, USA Today