
A.I. can now detect suicidal thought patterns with 91% accuracy

Though the sample size was small, the results are compelling.

Currently, there’s no objective way to tell whether someone is contemplating suicide. Even for the most seasoned professional, it comes down to experience and judgment. If therapists suspect, they simply ask the patient whether they are having suicidal thoughts.

The problem is, many people hide it well. Among those who have died by suicide, 80% denied having such thoughts at their last visit with a mental health professional. Because of this, researchers at Carnegie Mellon University wondered whether they could detect suicidal ideation objectively, by identifying the patterns of brain activity that coincide with it.

The need is great. About 44,000 Americans die by suicide each year, and it’s the second leading cause of death among young adults. An objective way to detect such thoughts could help us develop more effective intervention methods.

Using a machine learning algorithm and a functional MRI (fMRI) scanner, Carnegie Mellon University researchers believe they’ve isolated the brain signature of suicidal thoughts. The scientists found they could predict who was thinking about suicide with 91% accuracy. They could even distinguish those who had previously attempted suicide from those who had only thought about it. The results of the study were published in the journal Nature Human Behaviour.

It’s important to note that the sample size was very small; a larger-scale study must be conducted before these findings can be considered firm. Still, they’re compelling.

Using fMRI and machine learning, researchers were able to identify a neural signature for suicidal thoughts. Credit: Getty Images.

Scientists led by Dr. Marcel Just have, over time, collected a number of neural signatures for different thoughts and emotions. They can now tell what a person is feeling, or what kind of social interaction they’re thinking about, because each has its own particular pattern.

In this latest study, Just and colleagues wondered whether these thought patterns would be altered in a person ruminating over or contemplating suicide. They recruited 34 volunteers and placed each in an fMRI scanner. Seventeen were selected for their history of suicidal thoughts; the other 17 served as a control group.

Participants lay in the scanner for 30 minutes while life- and death-related words were projected on a screen inside. These included: death, trouble, cruelty, good, praise, and carefree. The negative words were the focus, because the researchers thought they might elicit a neural pattern associated with suicidal thoughts. Each word appeared by itself for three seconds while researchers recorded the associated brain activity.

Next, all the fMRI scans were fed into a computer. A machine learning program examined the data and learned to pick out the differences between typical brain patterns and those of people with suicidal thoughts. After training, the program became good at distinguishing the two groups: as suspected, those with suicidal thoughts registered different readings when death-related words came up. The brain areas involved included the left superior medial frontal area and the medial frontal/anterior cingulate, regions associated with thinking about one’s self.
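The classification step can be sketched in miniature. The code below is a hypothetical illustration, not the study’s actual pipeline: it generates synthetic per-participant “activation” vectors (the real features came from fMRI), trains a Gaussian Naive Bayes classifier on them, and scores it with leave-one-out cross-validation, a natural choice for a 34-person sample.

```python
# Hypothetical sketch of group classification with synthetic data.
# Assumptions: 6 made-up activation features per participant; the
# real study's features, preprocessing, and classifier may differ.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
n_per_group, n_features = 17, 6  # 17 ideators + 17 controls, as in the study

# Simulate mean activation per concept word: the ideator group responds
# differently on the first three ("death-related") features.
controls = rng.normal(0.0, 1.0, size=(n_per_group, n_features))
ideators = rng.normal(0.0, 1.0, size=(n_per_group, n_features))
ideators[:, :3] += 1.5  # shifted response to negative concepts

X = np.vstack([controls, ideators])
y = np.array([0] * n_per_group + [1] * n_per_group)

# Leave-one-out cross-validation: with only 34 participants, each model
# is trained on 33 of them and tested on the single held-out person.
scores = cross_val_score(GaussianNB(), X, y, cv=LeaveOneOut())
print(f"LOOCV accuracy: {scores.mean():.2f}")
```

Leave-one-out scoring matters here: with so few participants, reporting accuracy on the training data itself would wildly overstate how well the signature generalizes to a new person.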

Credit: Nature Human Behaviour.

Dr. Just told New Atlas, “People with suicidal thoughts experience different emotions when they think about some of the test concepts. For example, the concept of ‘death’ evoked more shame and more sadness in the group that thought about suicide. This extra bit of understanding may suggest an avenue to treatment that attempts to change the emotional response to certain concepts.” The study was even able to distinguish, with 94% accuracy, the nine suicidal ideators who had made a suicide attempt in the past from the eight who had not.

Even so, a couple of things have to be worked out before more interest and research dollars move in this direction. Besides the small sample size, it was known beforehand which volunteers were suicidal. That was important for training the computer to recognize the brain signature of suicidal thoughts, but can the results be repeated with people whose minds are less open to probing? There is great stigma surrounding suicide, so what would be clinically useful is the ability to detect suicidal ideation when the subject isn’t forthcoming or won’t admit to it.


We haven’t entered the age of the mind-reading machine just yet, according to Dr. Just. “If somebody didn’t want others to know what they are thinking, they can certainly block that method,” he said. They could simply refuse to cooperate. “I don’t think we have a way to get at people’s thoughts against their will,” he said. Another problem is that the equipment is very expensive, costing millions of dollars.

“It would be nice to see if we could possibly do this using EEG,” Just said. “It would be enormously cheaper. More widely used.” Of course, technology tends to get cheaper over time. Still, critics wonder whether the technique will ever be clinically useful. Perhaps in the future, brain readings, along with medical records, genomic data, lifestyle data, and more, could be fed into a supercomputer to calculate a person’s risk of all kinds of physical and mental health maladies, including suicide.
