The impact of digital technologies on those with mental health treatment histories is rarely addressed by sweeping reports and recommendations that focus on the impacts of technology on society. In a paper submitted to the Australian Human Rights Commission about promoting fair and equitable deployments of Artificial Intelligence, Piers Gooding addressed this gap. His report showed how infringements on privacy through data collection pose risks to people with disabilities and mental health service-users.
Digital technologies and artificial intelligence-enabled changes to the provision of mental health services are ubiquitous today, facilitating ‘supported decision-making’ in healthcare services, peer networking, face-to-face support, and crisis support. They are often instrumental in monitoring abuses in care provision too. Yet, with the rise of mental health apps and even AI-controlled brain implants undergoing testing, it has also become apparent that consumers of mental health services have become guinea pigs for testing invasive technologies that yield personal, highly sensitive data.
Gooding aimed to make the Commission aware of how classic digital rights issues, like freedom of expression, privacy, and data protection, apply to those with mental health and psychosocial issues, and how threats to these populations illustrate the dangers of making egregious assumptions and taking information out of context.
He argues that, because norms governing the appropriate flow of information in society are yet to be established in any context (mental health or otherwise), abuses of mental health data should be examined for what they tell us about technologically-prompted human rights abuses across society rather than be understood as exceptional.
He highlighted instances when mental health data was used for non-care-related reasons, leading to the violation of traditional patient protections. One example was the mistaken release of hundreds of students’ personal digital records at a high school in Melbourne, which included information about their ‘mental health conditions, medications, and learning and behavioral difficulties.’
Beyond privacy infringements, the explicit repurposing of data for goals such as preventing gun violence in U.S. high schools threatens to normalize the collection and digitization of student mental health data for distribution through a statewide database.
“Advisors to the Trump Administration are reportedly promoting experimentation to determine ‘whether technology, including phones and smartwatches, can be used to detect when mentally ill people are about to turn violent.’”
This is one of many cases when data-sharing technologies are used by criminal justice agencies to circulate mental health-related data for predictive and preventative purposes.
“In 2017, the Office of the Privacy Commissioner of Canada, for example, found that the Toronto Police Service had released mental health and suicide data which led to Canadians with a documented history of suicide attempts or mental health hospitalizations being refused entry at the US border.”
He also cited the use of GPS technology to track forensic psychiatric patients; AI-based suicide alerts enabled by Facebook’s pattern-recognition software, which operates completely independently of the healthcare system and its ethics; electronic monitoring of social service provision, such as home visits; and psychiatric drugs with inbuilt sensors that track medication compliance. Gooding underlines the need to weigh the benefits to individuals who consent to treatments with leading-edge technologies against these technologies’ human rights implications for society.
In some cited instances, data collected in mental health contexts were used for extraneous purposes. In others, data collected outside of mental health contexts are being used or tested for use in making judgments about mental health and behavioral dispositions more broadly.
This report reflects how mental health data can be used to discriminate against former and present users of mental health services in ways that violate prohibitions on disability-based discrimination under international human rights law. It emphasizes how surveillance is becoming a condition of service provision in many countries, and how, when taken out of context and shared between agencies, personal mental health information becomes weaponized (e.g., for policing, prediction, or denial of rights), even when its collection was premised on a non-infringing, perhaps beneficial, purpose.
Gooding, P. (2020). On Disability Discrimination, Mental Health, and Algorithmic Accountability: Submission to the Australian Human Rights Commission. (Link)
I would disagree with you, Emaline. Our collective mental health, as well as our individual mental health, is being stressed by the notion that we can model everything with and through the computer. Tragically, lives are lost in rushing the craftsmanship, the qualities of a more interesting space being shorted out. So, even in the strength of the arguments advanced, which are on point within this domain of language, there is nothing artificial about LIFE that can be experienced. One of the fascinating designed opera houses occurred in Australia because an engineer was able to turn himself inside/out. When an individual turns their thinking inside/out, the risk of becoming stuck (as Robert Pirsig described in Zen and the Art of Motorcycle Maintenance) becomes real. He had an interesting description of the analog prior to the digital.
For the record, in traveling to Waterloo to pursue my MA, the letters from the psychiatrist had to be cleared through Canadian Customs, just as the US has practiced. The issue is about healing over time as well as addressing the social injustice imposed upon us. Looking back, Boltzmann, who wrote the equation for disorder, would visit California and then return home to his native country. He committed suicide.
How do you trace out, step by step, what follows the mind when it is in an altered state of operation? The mind is a verb, with even a spiritual aspect, perhaps articulated in knowing from the mystic's perspective.
There is nothing artificial about the experiences we have endured. The practice of law, on the other hand, seemingly is ignoring the legal challenge that is coming due! What attorney(s) have the stamina to atone for what often has been murder, directly and indirectly, or that has given rise to poverty for us, singularly and collectively?
Perhaps the zombies will come, after all.