The US National Institute of Mental Health is outfitting a network of hospital emergency departments across the country with “a personalized, computer-based suicide risk screening tool” for teenagers. “We plan to refine algorithms capable of predicting which youth are most likely to attempt suicide,” stated lead developer Cheryl King of the University of Michigan in an NIMH press release.
In phase one of the project, over 6,000 adolescents will be screened and then followed up, and the resulting information “will be used to develop a computerized adaptive screen (CAS) for predicting suicide attempts that adjusts its line of questioning depending on responses to previous questions,” stated the press release.
The rates of sensitivity, specificity, and predictive value of the CAS will then be compared with another standardized youth suicide screening tool. Next, a second phase “will validate the CAS and associated risk stratification algorithm, determining the measure’s ability to predict suicide attempts in a new sample.” This will produce “an easy-to-administer screening instrument that classifies youth as high, moderate or low-risk, enabling efficient triaging of resources and identification of modifiable risk factors for treatment.”
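As a rough illustration of what a “computerized adaptive screen” with risk stratification might look like, here is a minimal Python sketch. The questions, weights, and cut-offs are all invented for illustration; the press release does not describe the project’s actual items or algorithm.

```python
# Hypothetical sketch of an adaptive screen: each answer determines which
# question (if any) comes next, and a running score is bucketed into
# low / moderate / high risk. All items, weights, and thresholds below
# are invented; they are NOT the NIMH project's actual algorithm.

QUESTIONS = {
    # item id: (prompt, weight, next item if answered "yes")
    "sleep":    ("Trouble sleeping most nights?",     1, "mood"),
    "mood":     ("Felt down or hopeless recently?",   2, "ideation"),
    "ideation": ("Any thoughts of harming yourself?", 4, None),
}

def adaptive_screen(answers, start="sleep"):
    """Walk the question chain; a 'no' answer ends that line of questioning."""
    score, item = 0, start
    while item is not None:
        _prompt, weight, next_item = QUESTIONS[item]
        if not answers.get(item, False):
            break                      # adapt: stop asking further questions
        score += weight
        item = next_item
    if score >= 5:
        return "high"
    if score >= 2:
        return "moderate"
    return "low"

print(adaptive_screen({"sleep": True, "mood": True, "ideation": True}))  # high
print(adaptive_screen({"sleep": True, "mood": False}))                   # low
```

The “adaptive” part is simply that later, more sensitive questions are only reached if earlier answers warrant them; the risk-stratification step is the final score bucketing.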
The press release stated that this computer-based algorithmic psychological assessment approach towards teens in emergency rooms will be “a tool that can save lives.”
ER Screen Will ID Troubled Teens – Aimed to Help Front-line Clinicians Save Lives (US National Institute of Mental Health Science Update, September 23, 2014)
MIA Editor’s Note: An MIA investigative report available here found that the highest rated mental health and suicide screening tests typically incorrectly identify hundreds of children in every thousand as being at high risk for mental disorders and suicide.
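The false-positive problem the editor’s note describes follows from simple base-rate arithmetic: when the outcome being screened for is rare, even a reasonably accurate test flags far more unaffected people than affected ones. The sketch below uses hypothetical figures (1% prevalence, 80% sensitivity, 70% specificity) purely to illustrate the effect.

```python
# Illustrative base-rate arithmetic for a screening test.
# The prevalence, sensitivity, and specificity figures are hypothetical,
# chosen only to show how a rare outcome yields many false positives.

def screen(population, prevalence, sensitivity, specificity):
    """Return (true_positives, false_positives) for a screened population."""
    at_risk = population * prevalence
    not_at_risk = population - at_risk
    true_positives = at_risk * sensitivity
    false_positives = not_at_risk * (1 - specificity)
    return true_positives, false_positives

# Suppose 1% of 1,000 screened teens will actually attempt suicide,
# and the tool has 80% sensitivity and 70% specificity.
tp, fp = screen(1000, 0.01, 0.80, 0.70)
print(tp, fp)          # roughly 8 true positives vs. ~297 false positives
ppv = tp / (tp + fp)   # positive predictive value, about 0.03
```

Under these assumed numbers, a positive result is wrong more than 95% of the time, which is consistent with the note’s claim that such tests can mislabel hundreds of children per thousand.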
Damn. If a person is diverse and complex enough to want to commit suicide, this gives them the authorization to sell drugs to remove that diversity, in an attempt to make them brain dead so they cannot go through with suicide, or even focus enough to conjure the feelings or thoughts required. And it’s all for pure profit, at the person’s expense of money and health! 🙁
I know that suicidal feelings are a sign of insight and complex emotions, and thus of health. Removing these things removes the person’s health. They did it to me using Zyprexa and Paxil, even when I was only tearful and feeling like my life was over after landing in the bullies’ care at the Oregon State Hospital.
http://www.obamasweapon.com/
Report comment
I hate these idiotic screening tools, which help no one but make people’s lives even more difficult. Interestingly, they don’t say exactly what they will do if someone is “high risk.” Lock them up for their own good? Put them under 24-hour surveillance? Drug them? Give them an old-school lobotomy? The whole idea is based on wrong premises.
Report comment
“This will produce an easy-to-administer screening instrument that classifies youth as high, moderate, and low-risk, enabling triaging of resources and identification of modifiable risk factors for treatment.” Anything to avoid actually listening to and compassionately helping a distressed youth, and to make sure they become paying customers for life.
Just an FYI: the kids taking the ADHD drugs and antidepressants – and all drugs with black box warnings stating they cause “mania, suicides, and violence” – and other illegal mind-altering drugs, will likely be at a higher risk of suicide, because they’re on mind-altering drugs. However, the adverse effects of drugs are not any of the “life long, incurable, genetic,” yet scientifically “lacking in validity,” DSM “mental illnesses.” And these youth will require a peaceful and safe place to detox from the drugs, not more drugs.
When will the medical industry stop harming our children for profit? It’s sick.
Report comment
Will the algorithm be treated as proprietary information, meaning that the resulting “screening tool” is protected from independent tests of reliability and validity? This is the kind of product that the company tied to the chair of the DSM-5 was developing. Will this lead to inpatient commitment decisions based upon the results of the “screening”?
Report comment
Probably. Whatever happened to the wisdom that medicine is an art, not a science? Oh, right: pharmaceutical-industry-biased “evidence based medicine” took over.
How long will it take the medical community to realize that pharmaceutical-industry-biased “evidence based medicine” isn’t working? It’s just “managing” real, imaginary, and iatrogenic illnesses for profit.
Report comment
It’s the same with education based on BS standardised testing. It puts the “normal” (dull, unimaginative, cowardly, unoriginal…) on a pedestal, and it punishes everyone who dares to think outside the box or do or think anything that someone else has not already done or thought a thousand times before. It’s the dumbing down of the whole society.
Report comment
SE,
Even if members of the medical community realize that drug driven EBM is not working, they won’t speak out due to fears of ruining their career. In other words, it will be business as usual.
I know I sound like a broken record, saying the same thing whenever I read nonsense like this article, but once again, I am speechless.
Report comment
I have an even more effective screening tool that “adjusts its line of questioning depending on responses to previous questions” – it’s called TALKING TO AND LISTENING TO THE FREAKIN’ KID! I know this is a radical concept in today’s highly technological world, but I am promoting the idea that a human being may be better able to connect with a young person in distress than a computer terminal. On the other hand, given what many so-called “mental health professionals” provide today, maybe their odds are better with the computer…
—- Steve
Report comment
Steve,
You really are delusional if you think talking works. All sarcasm aside, this reminds me of an experience from when I worked at a special ed school several years ago and had a crisis with my one-on-one student. I was frantically trying to resolve it when I uttered the magical words, “I am sorry.” Problem solved.
Report comment
The National Institute of Mental Health should look at suicide completion rates of those exposed to the mental health system and those who have no history with it. Perhaps they already know and are unwilling to publish the details for adverse marketing reasons.
Report comment
Exactly. I actually predicted these numbers a long time ago, and I’m pretty sure mental health treatment causes suicide.
Report comment
MIA just reported on research about the very thing you just asked. The results of the study were that the more someone was exposed to psychiatric “treatment,” the more likely they were to try to kill themselves. The largest effect was when the person had spent time on a psychiatric ward in the past year. Those people were FORTY-FOUR TIMES more likely to kill themselves! And yet the profession keeps right along pushing for more “treatment” for despondent people.
Report comment
And once they get sent to the psych “hospital,” their chances of successfully killing themselves increase dramatically after release. Interesting statistic: your chances of committing suicide go up after being released from the psych “hospital.” What does that say about the wonderful “treatment” that the system metes out to people?
Report comment
Algorithms will predict suicides… and then what exactly will prevent them?
Magic pills?
I thought we already tried that.
Well, some kind of magic, I’m sure.
Forget about good relationships.
Kids need more computer algorithms, more magic.
Duane
Report comment
With 1 in 10 people in the US on “antidepressants,” suicide rates are increasing among adults in many different age groups:
http://www.nytimes.com/2013/05/03/health/suicide-rate-rises-sharply-in-us.html?_r=0
Teen suicide is also on the rise:
http://www.foxnews.com/health/2012/06/08/cdc-teen-suicide-attempts-on-rise/
We’ve tried algorithms here in Texas (at least for medications), TMAP was an absolute failure, and a crime.
Drugs have failed, especially with youth.
All this high-tech b.s. has failed, and so what do we do?
We keep trying the same old stuff… again and again, hoping for different results.
Sounds pretty insane, if you ask me.
Duane
Report comment
Oops, I almost forgot:
Where are all the ‘saved lives’?!
They’re hard to find amongst all the bodies!
Duane
Report comment
Here’s one possible outcome of such smart screenings:
http://www.mentalhealthtoday.co.uk/vulnerability-to-radicalisation-linked-to-depression-study-says.aspx
…and from that – hey, our algorithm says this kid is depressed -> potentially dangerous to self and others -> drug them and lock them up.
Report comment