There’s an alarming increase in suicide among children ages 10 to 14 – up more than 150 percent since 1981. Right now, more teens die from suicide than from cancer, heart disease, AIDS, birth defects, stroke, pneumonia, influenza and chronic lung disease combined. To combat the crisis, parents, schools and communities have developed intervention programs and trained adults to recognize the warning signs. Now those programs are getting help from an unlikely source – artificial intelligence.
Artificial intelligence (AI), sometimes called machine intelligence, refers to processes handled by a computer that simulate human functions. These processes include learning rules, reasoning and self-correction, and they can be used for speech recognition (like Siri or Alexa), automation (completing repetitive tasks) and detecting patterns in data. For suicide prevention, AI can go where few parents or educators can – into the devices and apps that kids and teens use. Here’s how:
Searches and scans
Google piloted a program that identifies key phrases related to suicide methods and returns the number for the National Suicide Prevention Lifeline. But the system has flaws. Google can edit only its search results, not webpages, so users looking for information about how to kill themselves could still find it through linked pages without using the search engine at all. And some phrases and slang can be ambiguous cries for help, a nuance a computer might not understand.
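To make the idea concrete, here is a minimal sketch of how keyword-based flagging works. This is not Google's actual system; the phrase list and banner text are invented for illustration, and a real system would cover far more language.

```python
# Illustrative sketch only: a toy version of keyword-based crisis
# detection, NOT Google's actual system. The phrase list and banner
# text are hypothetical placeholders.
from typing import Optional

CRISIS_PHRASES = {"suicide", "kill myself", "end my life"}

HELP_BANNER = "Help is available: call the National Suicide Prevention Lifeline."

def crisis_banner(query: str) -> Optional[str]:
    """Return a help banner if the search query contains a crisis phrase."""
    q = query.lower()
    if any(phrase in q for phrase in CRISIS_PHRASES):
        return HELP_BANNER
    return None
```

Note that the function sees only the search query itself, which illustrates the flaw described above: content reached through linked pages never passes through this check.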
On social media, major platforms have also taken steps to scan content for self-harming language or images. Facebook, for example, uses AI to quickly flag live videos for police if the system detects self-harm or violence.
Improving intervention outreach
For those reaching out for help, AI can assist with risk assessment. A Google grant to the Trevor Project, a nonprofit that offers crisis counseling to LGBTQ teenagers via phone, texting and instant messaging, will fund machine learning to assess suicide risk from the response to the question that starts every Trevor interaction: “What’s going on?”
Trevor’s de-escalation rate is 90 percent, but sometimes those reaching out have to wait for a counselor to become available. The AI assessment could flag those who need intervention immediately.
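The triage idea can be sketched in a few lines of code. This is only an illustration, not the Trevor Project's model: in practice a trained classifier would produce the risk score, and the keyword-based stand-in here is hypothetical.

```python
# Illustrative sketch only: a toy triage queue, NOT the Trevor
# Project's actual system. A real deployment would use a trained
# classifier; the keyword score below is a hypothetical stand-in.
import heapq

HIGH_RISK_WORDS = {"pills", "plan", "tonight", "goodbye"}

def risk_score(first_message: str) -> int:
    """Crude stand-in for a learned risk model."""
    words = set(first_message.lower().split())
    return len(words & HIGH_RISK_WORDS)

class TriageQueue:
    """Serve the highest-scored (most at-risk) contact first."""
    def __init__(self):
        self._heap = []
        self._count = 0  # tie-breaker preserves arrival order

    def add(self, message: str):
        # heapq is a min-heap, so negate the score for max-first order
        heapq.heappush(self._heap, (-risk_score(message), self._count, message))
        self._count += 1

    def next_contact(self) -> str:
        return heapq.heappop(self._heap)[2]
```

With this structure, a message scored as high risk jumps ahead of earlier arrivals, while equally scored contacts are still served in the order they reached out.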
Eventually, Trevor leaders want an AI system that will also predict what resources youths need, to get them the right kind of help faster. And they are hoping the data from their calls will lead to new insights about the factors that influence suicide risk.
Screening for suicide
Data scientists from Vanderbilt University developed an algorithm that uses hospital-admissions data and some standardized questions to screen patients, even if they haven’t mentioned feeling depressed or having suicidal thoughts. The algorithm can flag those at risk, allowing healthcare providers to broach the subject and offer support. Currently, most people in the US are assessed for suicide risk only when they actively seek psychiatric help or exhibit clear-cut symptoms.
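The shape of such a screen can be sketched as follows. This is not the Vanderbilt algorithm, which is a statistical model trained on real hospital records; the field names, weights and threshold here are all hypothetical.

```python
# Illustrative sketch only: a toy screening rule, NOT Vanderbilt's
# actual algorithm, which is a statistical model trained on real
# hospital records. Field names, weights and threshold are hypothetical.

def screening_score(record: dict) -> float:
    """Combine a few admission-record features into a rough risk score."""
    score = 0.0
    score += 2.0 * record.get("prior_self_harm_admissions", 0)
    score += 1.0 if record.get("recent_medication_change") else 0.0
    score += 0.5 * record.get("er_visits_past_year", 0)
    return score

def flag_for_followup(record: dict, threshold: float = 2.0) -> bool:
    """Flag the patient so a provider can broach the subject in person."""
    return screening_score(record) >= threshold
```

The point of the design is that the flag is raised from routine data already on file, so the conversation can happen even when the patient never raises the topic themselves.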
You don’t need AI to help someone you know or love who is struggling. The Jason Foundation, regionally headquartered at Dominion Hospital, works to prevent youth and teen suicide. If you’re concerned about yourself or someone else, call Dominion for a free, confidential and immediate assessment, 24/7, at (703) 538‑2872 or visit our website.