In 2018, California's state government began rolling out a new "mental health" initiative. The tech companies of Silicon Valley were creating smartphone apps that could prompt users to seek mental health care, and the state wanted to provide support. After all, researchers claim that more than half of Americans with mental health problems don't receive treatment, and one reason for that might be that treatment is expensive or unavailable in certain regions.
Of the thousands of mental health apps in existence today, the state selected two. The first, 7 Cups, is made by a company called 7 Cups of Tea. It focuses on connecting mental health service users, in text-based chat sessions, with what it calls "listeners": volunteers who are trained in "active listening." But, according to The New York Times, the company has been plagued with issues, including listeners having inappropriate conversations with their clients and investigations into alleged financial misconduct.
The other company partnering with the state of California is Mindstrong Health. Its app (rebranded as Mindstrong on March 17, 2020; previously known simply as Health) is available on the Google Play Store and the Apple App Store. However, you can only use the app if you have been given an access code by one of the health insurance companies Mindstrong has partnered with. The company won't say which insurers it works with; it's by invitation only.
Like 7 Cups, Mindstrong helps connect users with text-based counseling. However, the company has also developed an algorithm that it claims can detect changes in users' moods from day to day based on the way people use their phones: the way they swipe and tap, and the way they type and delete text. In fact, the founders of the company even suggest that their algorithm will be able to predict mental health problems before they occur. As one article about the app claims, "The smartphone app […] can tell you're depressed before you know it yourself."
All of this raises profound questions: Is this a technology to be embraced? Or feared?
A Brave New World?
As 7 Cups, Mindstrong, and other mental health apps are rolled out, the "public health" premise under which they're marketed is a familiar one: Psychiatric disorders are regularly underdiagnosed and undertreated, and artificial intelligence (AI) technologies can help get people diagnosed and into helpful treatment. The American Psychiatric Association and other mental health organizations have regularly sounded that theme in the past 20 years, and now tech companies are stepping forward with a high-tech solution.
It is the Mindstrong app that most raises the spectre of Brave New World, Aldous Huxley's classic dystopian novel of eugenics and psychiatric surveillance. AI, the app seems to promise, can peer into the window of an individual's mind much better than a psychiatrist, or any other trained mental health professional, can. As we use our smartphones and computers, our typing rhythms, swiping habits, typing errors, and so forth are all data points that can be compiled into a mental health portrait of the user, one that the creators of Mindstrong claim can successfully diagnose "depression, anxiety, and other psychiatric disorders."
And that diagnosis can then provide a gateway into treatment, with the first stop an online chat.
This AI technology was originally called the "digital fingerprint." It was developed to track hackers without having to catch them in the act; each person has a unique print. But while the developers are promoting the app as a public health initiative, the Mindstrong website and app store entries describe an AI that would be snooping on you at all times, ostensibly coming to know you better than you know yourself.
And ultimately doing so for commercial purposes that will expand the psychiatric enterprise.
While that is the spooky future that might await us if the Mindstrong app proves reliable, for the moment there are questions being raised about its "diagnostic" effectiveness and the merits of the online chats. Could the algorithm's untested flags lead to overdiagnosis and overmedication? What happens if the app flags normal shifts in mood (say, feeling frustrated at work or feeling grief after a loss) as a mental illness? And what happens if you reach out for support but find only poorly trained online chat technicians?
Mindstrong did not respond to our emails requesting an interview. When we called their customer service line, we were told on multiple occasions that the customer support operators were not authorized to give information to the press, and that they could not transfer us to someone authorized to do so. They also stated that they were unable to provide contact information for any other members of the Mindstrong Health team.
Mindstrong Health: Beginnings
Mindstrong Health began with Paul Dagum, a Stanford doctor and researcher who also holds advanced degrees in theoretical physics and theoretical computer science. Dagum is perhaps best known in research circles for his 1990s work on dynamic Bayesian networks, a statistical method for modeling data that changes over time. He also owns numerous patents for artificial intelligence technology: algorithms designed to assess large amounts of data and generate predictions.
Before Mindstrong, Dagum created algorithms for surveillance. Some of his previous work was used in the field of "brand protection": algorithms designed to help corporations prevent digital "piracy" of movies, music, and software. He learned to create a digital fingerprint that could identify and track hackers based solely on the way they type. The idea was to identify and monitor people based on how they used their smartphones and computers rather than on what content they accessed. This means that people wouldn't have to be caught committing a crime to be tracked: hackers could be identified by the way they used their computers for innocent, everyday purposes. And the intention was to surveil the entire population, every single person, at all times.
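Keystroke dynamics of this kind are well documented in the security literature: the timing of key presses alone can distinguish one typist from another. As a rough illustration only (Dagum's actual algorithm is unpublished, and the event schema and feature set below are hypothetical), a minimal sketch might look like this:

```python
from math import sqrt
from statistics import mean

def timing_features(events):
    """Extract simple keystroke-dynamics features from one typing session.

    `events` is a list of (key_id, press_time, release_time) tuples. The
    schema and feature set are illustrative assumptions, not Dagum's
    unpublished algorithm.
    """
    dwell = [release - press for _, press, release in events]   # key hold times
    flight = [events[i + 1][1] - events[i][2]                   # gap between one key's release
              for i in range(len(events) - 1)]                  # and the next key's press
    return [mean(dwell), mean(flight), max(dwell), max(flight)]

def matches_profile(features, profile, threshold=0.05):
    """Crude identity check: Euclidean distance to a stored per-user profile."""
    return sqrt(sum((f - p) ** 2 for f, p in zip(features, profile))) < threshold
```

A real system would use far richer features and a trained classifier, but the point stands: identity can leak from timing metadata alone, with no access to what is being typed.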
Dagum soon began looking for other uses for his digital surveillance algorithms, and healthcare seemed like the obvious next step. He partnered with Richard Klausner, a former director of the National Cancer Institute and former executive director for global health at the Bill and Melinda Gates Foundation. More importantly, Klausner knew how to start a healthcare business that would make money: he had been a founder and director of the pharmaceutical company Juno Therapeutics, which pulled in millions of dollars in investor funding before being acquired by Celgene for $9 billion.
But the last piece of the puzzle didn't fall into place until Dagum met Tom Insel.
A former director of the National Institute of Mental Health (NIMH), Insel had become frustrated with the inability of the psychiatric establishment to make major improvements in care. After leaving the NIMH, Insel shifted to entrepreneurship, hoping that Silicon Valley could succeed where academia and the pharmaceutical industry had failed.
Insel, a psychiatric researcher, gained fame in the 1980s and 1990s for conducting animal experiments that supported the hypothesis that there were biological underpinnings to behavior. In his experiments on prairie voles, he identified oxytocin as one of the primary hormones responsible for the rodents' strange, monogamous mating behavior.
In 2002, Insel became the director of the NIMH. He was the second-longest-serving director of the organization, running it for 13 years. By the end of his tenure there, he became known for speaking out against the Diagnostic and Statistical Manual of Mental Disorders (DSM), the so-called "Bible" of psychiatry. In 2013, he rejected the DSM as a basis for research, citing its failure to find an objective laboratory test for any diagnosis:
The weakness is its lack of validity. Unlike our definitions of ischemic heart disease, lymphoma, or AIDS, the DSM diagnoses are based on a consensus about clusters of clinical symptoms, not any objective laboratory measure. In the rest of medicine, this would be equivalent to creating diagnostic systems based on the nature of chest pain or the quality of fever. Indeed, symptom-based diagnosis, once common in other areas of medicine, has been largely replaced in the past half century as we have understood that symptoms alone rarely indicate the best choice of treatment. Patients with mental disorders deserve better.
Although he still held to a primarily biological understanding of mental health problems, Insel suggested that the DSM did a poor job of accurately capturing those biological categories. Instead, he said, the DSM categorized entirely by symptoms, and even then, its categories were wide nets that ensnared many different diagnostic fish. For instance, the long list of symptoms classed under Major Depressive Disorder (of which only a few are required for diagnosis) means that any two people with the diagnosis might have very different symptoms.
Insel's reaction to the validity problems in the DSM, though, was to double down on the notion that mental health problems are biological. According to Insel, we simply haven't figured out the biology yet. His replacement for the DSM was the RDoC (Research Domain Criteria), a set of assumptions that he wanted to guide research. Chief among these assumptions was the idea of searching for biological differences first, and then trying to create mental diagnoses based on the biology.
It was an interesting shift, and Insel says it led to "cool papers by cool scientists," but nothing that actually improved mental health care. In an interview with Wired, Insel describes investing billions of dollars in neuroscience research, only to come up empty-handed:
I spent 13 years at NIMH really pushing on the neuroscience and genetics of mental disorders, and when I look back on that I realize that while I think I succeeded at getting lots of really cool papers published by cool scientists at fairly large costs (I think $20 billion), I don't think we moved the needle in reducing suicide, reducing hospitalizations, improving recovery for the tens of millions of people who have mental illness. I hold myself accountable for that.
But Insel remained undaunted. In 2015, just after leaving the NIMH, he began working with Verily, a life sciences tech company owned by Google's parent company Alphabet, Inc. After a short stint there, he left to co-found Mindstrong Health with Dagum.
(This year, California Governor Gavin Newsom appointed Insel to be a "special adviser" on the state's mental health system, despite the fact that Insel remains president of Mindstrong, a tech company looking to expand its reach into that very system. According to STAT, Insel will recuse himself from any conversations about Mindstrong.)
In his Wired interview, Insel offered his justification for moving into the tech sector: "If biomarkers can't diagnose mental health issues, maybe a 'digital phenotype' can."
The Digital Phenotype
"Digital phenotype" is a neologism that borrows a term from genetics. In genetics, phenotype refers to the observable traits that are the expression of the genotype (genetic code) in the environment. So, while the genotype refers to someone's DNA, their traits are also influenced by the environment to create the phenotype. Figuratively, though, the word phenotype is sometimes used to mean an identifiable trait, like a fingerprint.
Dagum initially called his work a "digital fingerprint" when it was intended to be used to track hackers, perhaps because of the association fingerprinting has with law enforcement. Now that the algorithm is being used for healthcare, there's a new euphemism for it: a "digital biomarker," insinuating that the digital information assessed by the app bears some relationship to a person's biology.
So how does this app actually work?
First, the app installs a special keyboard on your phone. That way, the app can record information about the way you type at all times (whether you have the app open or not). The app tracks this data, and Dagum's algorithm correlates it with "key domains of brain function including Cognitive Control, Working Memory, Processing Speed, Verbal Fluency, and Emotional Valence."
It's unclear exactly what data is being assessed by the algorithm. The Mindstrong website uses words like "taps, scrolls, swipes, and texts," but the founders have implied that other metrics (frequency of texting, whether or not you reply to texts, length of replies, and more) are also collected. One thing they are careful to say is that the app does not record any content (what you're saying, which websites you're visiting, who you're talking to) or any location data.
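To make the claimed "metadata, not content" model concrete, here is a minimal sketch, assuming (as the company's statements suggest, though its code is closed) that only event types and timestamps are kept; the class and its event schema are my own illustrative inventions:

```python
import time

class MetadataLogger:
    """Records when and how the user interacts, never what they type.

    An illustrative sketch of the data-collection model Mindstrong describes;
    this is not the company's actual (closed-source) implementation.
    """
    def __init__(self):
        self.events = []

    def on_key(self, is_delete: bool):
        # Log the timestamp and whether the key was a deletion,
        # but deliberately not the character itself.
        self.events.append(("delete" if is_delete else "key", time.time()))

    def on_gesture(self, kind: str):
        # kind is "tap", "scroll", or "swipe"; again, no content is stored.
        self.events.append((kind, time.time()))
```

Even under this model, the keystroke-dynamics literature discussed above suggests such timing streams can be identifying on their own, which is why privacy questions persist even when content is excluded.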
Is it possible that the rhythm of your taps on your phone can be used to diagnose mental health problems? And is it possible that these activities can reveal whether you're experiencing slight shifts in mood from day to day?
According to the Mindstrong website, "Believe it or not, how you use your smartphone has been scientifically validated as a way to measure things like how you're doing and your mood."
The Mindstrong website offers a single published article that they claim demonstrates the success of the algorithm. That article is a brief communication in the new journal npj Digital Medicine. It was written by Paul Dagum and describes a proof-of-concept study he conducted on his algorithm.
In Dagum's study, 27 "healthy volunteers" were recruited on social media. They had no mental health diagnoses and no cognitive problems. Dagum recorded their texting behavior over the course of a week, then tested his algorithm's ability to predict their scores on cognitive tests. He concluded that it had done so with some accuracy.
It's important to note that these scores were only associated with cognitive functioning, not any mental health measures, and that this small group of participants consisted entirely of people considered healthy. "These results should be considered preliminary until replicated in an independent sample," he wrote.
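The shape of such a proof-of-concept analysis is easy to sketch. Assuming one feature vector of weekly typing metrics per participant and one cognitive score to predict (the paper's actual features and models are richer than this, and the data below is random filler), a cross-validated correlation is the standard check:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Hypothetical stand-ins: one row of weekly typing features per participant
# and one cognitive-test score each; n = 27 mirrors the study's sample size.
rng = np.random.default_rng(0)
X = rng.normal(size=(27, 10))   # typing features (random filler, not real data)
y = rng.normal(size=27)         # cognitive scores (random filler, not real data)

# Leave-one-out cross-validation keeps every prediction out-of-sample,
# which matters with a sample this small.
predictions = cross_val_predict(Ridge(alpha=1.0), X, y, cv=LeaveOneOut())
print("out-of-sample correlation:", np.corrcoef(predictions, y)[0, 1])
```

With n = 27 and no mental health measures at all, even a strong out-of-sample correlation in a study like this would say nothing about diagnosing depression, which is why the "preliminary until replicated" caveat matters.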
Online Therapy
Although that is the extent of the published research on the app, it's presented on the app, and referred to on the website, as a "scientifically validated" measure of mood and mental health functioning in people with "serious mental illness" like bipolar disorder and borderline personality disorder. Given the nature of that one study, the claim is clearly an exaggerated one.
A second question, though, is whether the app will in fact provide useful help if it does detect a change in "mental health functioning."
The Mindstrong website states that its app connects users with psychiatrists and "credible therapists." Users can video chat with psychiatrists about their medications, although it's unclear how often that happens. The bulk of the treatment offered through the app involves the "credible therapists." "They're all thoroughly vetted for skills like empathetic listening and crisis management skills," the website states.
However, the "credible therapists" are not in fact required to be licensed to practice, or to have expertise in any evidence-based treatments. They're more like 7 Cups' online listeners, providing 20-minute text therapy sessions, which the company claims are "evidence-based."
"Our approaches include Cognitive Behavioral Therapy (CBT), Coping and Emotion Regulation, Psychoeducation, Crisis Management, Reflective Listening, and Empathy," the website states.
While CBT delivered in person by a licensed therapist can be described as evidence-based, there is no good research showing that text-based CBT delivered in 20-minute sessions by unlicensed therapists is helpful. As for the rest of the listed approaches, there are no studies supporting text-based "listening" and "empathy" as stand-alone treatments for any mental health condition.
So, the AI diagnosis leads to "treatment" that, it's fair to say, is of a questionable sort.
Evaluating Mental Health Apps
John Torous, psychiatrist and clinical informaticist, is the director of the digital psychiatry division at Beth Israel Deaconess Medical Center (BIDMC). He's also the head of the American Psychiatric Association (APA) work group on the evaluation of smartphone apps. He took the lead on developing a system to evaluate mental health apps, which is available at the APA website.
Torous is not affiliated with Mindstrong Health.
The system Torous developed for evaluating apps involves numerous steps. It includes asking questions about who developed the app, what their business model is, whether the app costs money or runs advertisements, what the privacy policy is, how the app collects and stores data, what the app claims to do, whether there is any published research showing that the app works, and how easy the app is to use.
"A lot of the digital mental health offerings directed to consumers are not evidence-based," Torous said in an interview with Mad in America. "They're often not effective, and they're often not safe, in that the business models take people's personal health data, market it, sell it, or who knows what they're actually going to do with that data."
The APA's evaluation system, however, does not provide guidance on what to do with the answers to all of those questions. The website simply states that deciding which app to use is a "personal decision" and that the goal is to "make APA members aware of very important information that should be considered when picking an app." The system doesn't actually score or rank apps, or provide a way to identify apps that lack an evidence basis and may pose any number of risks.
Instead, it serves as a reminder of what questions to ask in order to thoroughly vet an app before use.
mindLamp
Torous himself is currently engaged in research on another, similar mental health app, funded through a donor at BIDMC.
Although the apps share similar names (Torous' app is officially known as mindLamp, although internally it's often referred to as just LAMP), when MIA spoke with Torous, he highlighted the contrasts between his app and Mindstrong.
For instance, Mindstrong is already connected with health insurance companies, and the company behind it claims to be providing healthcare services, despite the lack of any published studies on its effectiveness.
In contrast, Torous stated that LAMP was still in the early stages of being tested. Ongoing studies are intended to demonstrate whether the app works or not. Torous said that itās important to verify the effectiveness of an app before making claims that it is effective, and he believes his studies can help to provide that information.
However, like Mindstrong, LAMP is already being used as part of clinical care (although Torous verified that the algorithm itself is not being used yet). The app tracks the data it collects, which is then shared with the user and his or her clinician (and potentially with family members if consent is given).
The website for LAMP states that the Massachusetts Psychosis Network for Early Treatment (MAPNET) uses the app "at their sites across Massachusetts." The site also notes that 15 research institutions around the world are using the app for various research studies.
Open-Source
According to Torous, LAMP's code is open-source: free and available online. In contrast to Mindstrong's secrecy, Torous said that the LAMP program is intended to pilot software that others can use freely and test to see if it can actually achieve better outcomes.
"Building a tool is no longer a huge technical challenge," Torous said. "It's hard to regulate what other people may be doing with the software."
These apps, like Mindstrong, already exist, and they're already being used, promoted, and monetized. But when their code is secret, it is impossible for clinicians and users to determine whether the apps are safe or effective. Torous wants to provide transparency to the public about how the apps work and whether there's evidence for their effectiveness.
LAMP and Mindstrong: Both Different and Alike
LAMP's processes work a little differently from Mindstrong's. Insel and Dagum's app downloads a keyboard onto the user's phone and records details about how the person types on that keyboard.
Torous said that this could seem invasive, and noted that LAMP does not download an interface onto the phone in this way, nor does it record typing information.
However, both apps record GPS information, call and text logs, and accelerometer information (a proxy for exercise and sleep habits). Torous also discussed his app's use of such metrics as whether the screen is on or off (another proxy for sleep habits), and answers to surveys and cognitive "games." LAMP also records things like how often a person leaves the house (if at all), how far they travel, and other such details. But like Mindstrong, the LAMP app doesn't record the places to which the user travels.
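These proxies are simple to compute. As a hedged sketch (LAMP's open-source code will differ; the event format here is an assumption), a sleep estimate can be derived from nothing more than the longest screen-off gap in a day's events:

```python
def sleep_proxy_hours(screen_events):
    """Estimate sleep as the longest screen-off interval in one day's events.

    screen_events: time-sorted list of (unix_timestamp, screen_is_on) pairs.
    This is an illustrative proxy of the kind Torous describes, not LAMP's
    actual implementation.
    """
    longest, off_since = 0.0, None
    for timestamp, screen_is_on in screen_events:
        if not screen_is_on:
            if off_since is None:            # screen just went dark
                off_since = timestamp
        elif off_since is not None:          # screen came back on
            longest = max(longest, timestamp - off_since)
            off_since = None
    return longest / 3600.0                  # seconds to hours
```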
Both Mindstrong and LAMP have as their goal the identification and prediction of mental health changes, and both link the resulting "digital fingerprint" to a "clinical" intervention.
Torous' current research is a study on whether the app can predict relapse in psychosis. The participants are people with a diagnosis of schizophrenia or schizoaffective disorder, and they will use the app for a year. The goal is to develop a unique baseline of functioning for each individual in the study so that when a person's behavior changes, the app will identify it.
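That "unique baseline" idea maps onto textbook per-person anomaly detection. A minimal sketch, assuming one daily mobility value per user (the real models under study are an open research question, and the threshold here is my own illustrative choice), might flag any day far outside the person's own history:

```python
from statistics import mean, stdev

def flag_change(history, today, k=3.0):
    """Flag 'today' if it deviates more than k standard deviations from this
    one user's own baseline.

    history: that user's past daily values (e.g., hours spent out of the
    house). The two-week minimum and k = 3 threshold are illustrative,
    not parameters from the LAMP study.
    """
    if len(history) < 14:                    # wait for a baseline to accumulate
        return False
    mu, sigma = mean(history), stdev(history)
    return sigma > 0 and abs(today - mu) > k * sigma
```

Note what such a flag cannot resolve on its own: a three-sigma jump in time out of the house could mean emerging mania or simply a new friendship, which is exactly the interpretive gap Torous describes next.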
But what happens then? For Insel and Dagum's app Mindstrong, an alert is sent to the Mindstrong care team, which will then reach out to the user. Torous' app records the data about that change, which then becomes a point of discussion between the clinician and patient.
Torous noted that there are a lot of reasons a person's behavior could change, not all of which imply a "relapse in psychosis." For instance, if a person suddenly begins leaving the house more often and traveling to many places, that could be a sign of manic behavior, or it could mean that the person made a friend, or is simply trying to be more social.
With this information in hand, the clinician will show the patient the data (leaving the house more often) and ask what it means. "Until we know how well this works, when it's right, when it's wrong, it's really hard to deploy clinically," Torous said. "If someone has a relapse, we'll ask them, 'Here's the signal we saw, does this make sense?'"
"We're not going to automatically send an ambulance to their house," Torous said. "Someone on the care team could call in; it could be a prompt for a family member to check in on someone" if there's consent.
But, he admitted, another app using his free, open-source algorithms could easily escalate to sending an ambulance in the interests of care. "I think that was proposed in California at some point," he said. "Facebook, apparently, may be doing this already."
LAMP for College Students
Torous is also planning a study to test whether LAMP can be used for college students. He said this research is currently in the focus group stage, asking 100 college students about their needs and concerns. He hopes to create an adapted version of the app to deploy on college campuses.
The college environment can be a microcosm of the world, bringing to light the potential benefits of mental health surveillance, Torous said. For instance, the app might be able to identify if students were becoming socially withdrawn, which could signify depression.
But this, of course, raises obvious privacy concerns. Who gets the information that the student is becoming socially withdrawn? The college counseling center? The campus administration? Administrators may already be incentivized to force students to take a leave of absence if there is a mental health concern, which can leave students unable to return to school, saddled with debt, and sometimes even with a mark on their permanent record.
Other questions abound. Are students going to be forced to attend counseling? Involuntary counseling is not considered an effective intervention. How does the app distinguish between what could be a teenager's first real romantic breakup, for example, versus an emergency of severe depression? Is there a difference?
And how does the app tell the difference between "manic" behavior and college students trying out new ways of being, like going to parties for the first time? What happens if the student uses drugs or alcohol? How does the app record that data, and who gets to see it?
Torous acknowledged these concerns. "Mental health in general only works when there's trust and transparency," he said. "We do have to understand what are limitations of this data, when this is useful, when it is not."
Nevertheless, he said he hopes there is a way to adapt the app for college use that will be acceptable to the students and administration.
Medication Feedback
Much as the app might be used to monitor behavior, it could also be used to monitor response to, and compliance with, drug treatment. "Imagine starting a medication," Torous said. "This is a way we can keep a very close eye on people as they're starting a new treatment."
The app might be able to chart a change in behavior that would inform patients that their antidepressant is "working," even though they may not believe it is and haven't noticed a difference in their own mood. "Maybe they're still feeling a little bit down or depressed, but they're kind of sleeping better, moving around better; it can be a good way to start a discussion," Torous said.
At the same time, doctors might use the app data to check their own biases. For instance, if a patient says the medication is not working, or has intolerable side effects, and the app supports that conclusion, then the doctor may be more willing to believe the patient's own account of his or her reaction to the drug.
Both of these possible uses reflect the same doctor-patient relationship that bedevils psychiatry today: The patient's own self-assessment is not seen as fully reliable, and the doctor can use the data from the app either to convince the patient that his or her insight is mistaken or to agree that the patient's insight appears correct. Either way, the presence of the app reinforces the notion that patients are not reliable witnesses to their own responses to psychiatric drugs.
A Brave New World Is Already Here
"In the clinic we don't use the algorithm yet," said Torous. However, LAMP itself is already being used by clinicians at BIDMC and at MAPNET, collecting data and monitoring patients, without any evidence that it improves outcomes.
"We haven't seen definitive results that say we need to adopt these technologies in routine care today, in any field," Torous said.
As for Mindstrong, it's unclear if their algorithm is actively being used yet, but their app has been rolled out with health insurance backing in multiple states (it's unclear exactly which insurance companies and states are utilizing the app, because Mindstrong did not respond to our requests for comment).
"Our outcomes look promising: lowering the inpatient readmission rate, ER admission rate, mental health costs and physical costs," the Mindstrong website states. "We'll publish our care model findings in 2020."
"I haven't seen any published papers about mental health patients," Torous said, when asked about the research base for Mindstrong. "I haven't seen robust clinical evidence."
Meanwhile, early user feedback on mental health apps, at least from users of 7 Cups, has not been very positive. The online "listeners" have been criticized on Reddit for being robotic, unhelpful, pitying rather than empathetic, and overeager to report people for suicidality, among other issues. Some users complain of feeling worse, and even of being retraumatized, after using the service.
However, the mental health apps are now here. And they herald a future of greater mental health surveillance, with AI tracking your phone use and GPS data at all times, promising to provide a "digital fingerprint" of users, first to diagnose and then to constantly monitor the activities of those diagnosed with depression, bipolar disorder, psychotic disorders, and borderline personality disorder.
Leave your house and stay out too late, and perhaps a "manic" button will ring. Stay in bed too long and don't go out, and perhaps a "depression" button will ring. And do you think your drug treatment isn't working and is causing harmful side effects? Your mental health app will help your prescriber decide whether you know what you are talking about. But don't worry: you'll also be mandated to check in with your unlicensed, poorly trained "listener" (even though there's no evidence that will actually help you feel better).
****
MIA Reports are supported, in part, by a grant from the Open Society Foundations
Yep and don’t be surprised as part of their stepped up on-going brain washing they seek to take down your site Robert. It’s started happening else where.
Sorry I have to say something about a recent Peter Breggin interview, he is partly seeking push blame onto China re the opiod epidemic I don’t doubt that is part of it, but…..
American Arthur M. Sackler was a psychiatrist. The Sackler familys Purdue Pharma’s OxyContin is directly implicated in the opiod epidemic. Ofcourse psychiatrists know how to exploit, market and profit.
Look at who the Sackler family fund: The cream of the art world – they have an enormous amount of money from the mass destruction of others – save for MoMA New York… oh wait MoMA are funded by Rockefeller – who have also funded eugenics. So shame on the whole art world.
https://en.wikipedia.org/wiki/Sackler_family
Report comment
Massachusetts Attorney General Maura Healey sets it out:
https://www.mass.gov/files/documents/2019/01/31/Massachusetts%20AGO%20Amended%20Complaint%202019-01-31.pdf
Oh and the two other brothers Mortimer and Raymond were psychiatrists as well.
Report comment
Undercover of this virus very many more elderly will be bumped off and I’ve nodoubt this will include ‘mental patients’.
https://www.youtube.com/watch?v=4Zw9C73TtMo
Report comment
Yes, and we will be waiting behind the curtain š Opportunities galore in such times for those who have the stomach for it.
I assume our Minister for Health is a little disappointed that his Euthanasia Act hasn’t been fully implemented at this point in time. And I wouldn’t suggest he was expecting this virus situation to arise and present the opportunity to reduce the stress on the aged care facilities. But it does show the rather weak ‘protections’ provided in that legislation, particularly when your anasognosia can override your ability to ‘volunteer’ to be assisted to die.
Oh well, there will be other flu seasons. Turn, turn, turn.
Report comment
Painting was once a way of selling secrets, it would seem. Now, it is well known that art historians have devices to see under the layers of paint. And what were the secrets to be sold? Whatever they were, they were secrets worth enough, to some companies, to develop devices to see under layers of paint. Surely the devices were not developed for the sake of art history.
Report comment
Perhaps someone can create an app, that illuminates what is going on.
I think in light of this, we will see a huge spike in suicides, and although psychiatry and politicians are not clever enough to create plans of getting rid of it’s minions, this will accomplish.
There is a drawback as it will affect everyone. And that was the “mental illness” all along. Peasants first, rulers next. No one gets a free pass.
Report comment
Monitoring if your foot twitches while you sleep will become standard, AI and movement sensors are perfect for that.
Report comment
OUTRAGEOUS! ENTIRELY OFF ITS ROCKER! DEPLORABLE!
LET ME WARN YOU!
THERE IS NO INTELLIGENT TECHNOLOGY CAPABLE OF SUCH A THING!
WHERE HAS EVERYBODY’S INTEGRITY GONE?
(Insel, prior NIMH! What have you done!?)
Report comment
Yes exactly, where is the integrity????
There is a group in the UK called Big Brother Watch
pushing for a total ban there of facial recognition tech, and also trying to address other aspects of surveillance culture. I think we need to work with those people and all people who are critical of psychiatry or anti-psychiatry NEED to get up to speed on surveillance culture if they aren’t already learning about it.
Good doc to start with, Surveillance Capitalism:
https://www.youtube.com/watch?v=hIXhnWUmMvw
The overall privacy violations we are suffering are far worse than many people probably even understand.
Report comment
Aritifical intelligence.
Sure. Where did they find this AI? On another planet? Who judged it to be intelligent? Someone with “intelligence” I suppose.
I thought psychologically and philosophically we had not decided on WHAT intelligence or genius is.
Obviously the idea is absurd.
Lol, but the not insightful swallow this, hook line and sinker.
Their real intelligence will come to light when they “create” so many peasants, that there will be an uprising, because they simply have no resources to make a lot of peasants, and feed/house them. The economy will suffer, the countries will become weakened.
The countries that will be most successful are those who don’t go out of their way to”creating” internal problems.
I will send the article to my kids.
Report comment
This trend terrifies and enrages me. It’s also a bit absurd, assuming people aren’t even able to know their own moods without a tech assist from essentially an outsider.
This article in today’s New York Times tells of a related phenomenon, government use of cell phone and other data to track people’s movements during the coronavirus crisis: https://www.nytimes.com/2020/03/23/technology/coronavirus-surveillance-tracking-privacy.html.
The story notes, “[R]atcheting up surveillance to combat the pandemic now could permanently open the doors to more invasive forms of snooping later.”
Report comment
Yes Miranda, but stuff like this arises because there are too many who partially
believe “psychiatry goes too far”.
This is the outcome of “sort of supporting psychiatry and sort of supporting clients”
There are times when dismantling is needed.
Report comment
I agree with you. We need a massive deprogramming campaign, like they have for people who have been in a cult. Maybe the (few) therapists who are critical of psychiatry and want to be real allies could create a deprogramming campaign they could run with pro-psychiatry “clients”. Basically make it the opposite of what psychiatry does. Maybe have some deprogramming commercials running in the waiting room.
And if it’s family therapy, at the beginning of each therapy session family have to swear on the Bible or some other text of personal significance to them that they will not say you have a “mental illness”.
Report comment
Thanks much for this article. Its so appalling that the Medical Model / Silicon Valley mentality can drive such large-scale unhelpful services. It seems that a big part of it is driven by the desire to minimize labor costs and automate things to the point that the user is treated like a widget that is processed by the app rather than a person who can be an active participant in their own journey.
We have been providing a peer support app for over ten years. Our motto is “REAL HUMANS. REAL HELP. REAL TIME.” Most of our services have been delivered via live, small video-based groups (8 to 12 participants) facilitated by Certified Peer Specialists with very positive participant ratings and in a study we did a few years ago, strong participant-reported outcomes. In our new platform we are beginning to allow users to track mood and other information that we hope will be useful feedback to the user and allow us to eventually offer relevant encouragement and development tips. We just don’t understand how app developers can take the user and the human touch out of the equation. To learn more about what we do you can go to http://www.peersupportsolutions.com.
Report comment