Epistemic injustice, the dismissal of patients’ lived experiences as credible knowledge, is a common issue across the medical field. A new article published in Medicine, Health Care and Philosophy raises concerns about epistemic injustice as it relates to the use of digital phenotyping in mental health care.
Digital phenotyping could compound existing problems of epistemic injustice in the field, which in turn may lead to ethical issues such as involuntary commitment or treatment. Given that the use of digital data in healthcare is not going away, the authors conclude by offering suggestions for more ethical implementation, such as privileging lived experience and treating digital phenotyping as a tool to aid in understanding patients’ experiences rather than as an objective authority.
The authors, led by Stephanie Slack of the School of Philosophical, Historical, and International Studies of Monash University in Australia, write:
“While most clinicians would presumably not ignore patient testimony altogether, the perception that digital phenotyping offers an objective and hence more reliable view on whether a person is experiencing mental illness may offer a level of legitimacy in at least some instances to ignoring patient testimony and the patient’s interpretation of their own experience. In healthcare systems that are constantly striving for lower cost and more efficient healthcare, we should be alert to this risk. We should, therefore, be cautious about introducing digital phenotyping as a mechanism that has the potential to result in misdiagnosis, substandard care, or an unjustified detention or treatment order.”
In the mental health field, digital phenotyping has been proposed as an aid to the detection and diagnosis of mental health issues. While there are many different definitions of digital phenotyping across the literature, Slack and Barclay define it as:
“. . . the process of continuously collecting and analyzing digital data derived from human interaction with digital products to make assessments or inferences about illness.”
Digital phenotyping can include active interaction with technology, such as responding to prompts for data, but it also includes passive data collection: monitoring GPS location, interactions with screens, email use, and so on. For example, one study used a digital phenotyping device to monitor heart rate variability, electrodermal activity, and GPS movement in individuals diagnosed with schizophrenia in an attempt to monitor and predict relapse.
At this time, there are few clinically validated tools for digital detection and diagnosis. A major obstacle for researchers is creating tools that go beyond simplistic inferences from digital data. Assuming that low battery levels on a person’s phone are a sign of depression, for instance, misses the many other reasons a person’s phone might run low.
To avoid drawing inferences from a single point of data, studies investigating digital phenotyping therefore pull from multiple data sources, such as self-reported surveys, GPS data, and call/text logs, to assess things like sociability and screen time.
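To make that multi-signal approach concrete, here is a minimal sketch in Python of how one day’s passive streams (GPS fixes, call logs) and an active self-report might be reduced to a few daily features. The record types, field names, and features are entirely hypothetical, chosen for illustration rather than taken from any study cited here.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt
from typing import List


# Hypothetical record types: real digital-phenotyping pipelines define their
# own schemas; these exist only to illustrate combining several data streams.

@dataclass
class GpsFix:
    timestamp: float   # seconds since midnight
    latitude: float    # degrees
    longitude: float   # degrees


@dataclass
class CallLogEntry:
    timestamp: float
    duration_seconds: float
    outgoing: bool


@dataclass
class DailyFeatures:
    distance_travelled_km: float   # passive: mobility proxy
    total_call_minutes: float      # passive: sociability proxy
    survey_mood_score: float       # active: self-reported rating


def haversine_km(a: GpsFix, b: GpsFix) -> float:
    """Approximate great-circle distance between two GPS fixes, in km."""
    lat1, lon1, lat2, lon2 = map(
        radians, (a.latitude, a.longitude, b.latitude, b.longitude)
    )
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(h))


def summarise_day(gps: List[GpsFix], calls: List[CallLogEntry],
                  survey_mood_score: float) -> DailyFeatures:
    """Collapse one day's raw streams into a few interpretable features.

    No single stream is treated as decisive: the point of combining them is
    to avoid drawing inferences from one data point in isolation.
    """
    distance = sum(haversine_km(a, b) for a, b in zip(gps, gps[1:]))
    call_minutes = sum(c.duration_seconds for c in calls) / 60.0
    return DailyFeatures(distance, call_minutes, survey_mood_score)
```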
Although the sample sizes are small, recent studies have shown promise in using digital phenotyping to predict relapse in schizophrenia. Detection of anomalies in the data, such as participants visiting different locations than usual or changes in the number and length of their phone calls, was shown to have 89% sensitivity and 75% specificity for predicting relapse. Interestingly, however, anomaly rates were much higher in the passively collected mobility and sociability data than in the active data, which consisted of self-reported surveys, providing a clear example of how digital data may come to be favored over subjective experience.
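For readers unfamiliar with those two metrics: sensitivity is the share of actual relapse periods that the anomaly detector flagged, and specificity is the share of stable periods it correctly left unflagged. Below is a small Python sketch of the arithmetic, using made-up confusion-matrix counts chosen only so that the output matches the 89% / 75% figures quoted above; the study’s actual counts are not reproduced here.

```python
def sensitivity_specificity(true_positives: int, false_negatives: int,
                            true_negatives: int, false_positives: int):
    """Compute sensitivity and specificity from confusion-matrix counts.

    Sensitivity = TP / (TP + FN): share of real relapse periods flagged.
    Specificity = TN / (TN + FP): share of stable periods left unflagged.
    """
    sensitivity = true_positives / (true_positives + false_negatives)
    specificity = true_negatives / (true_negatives + false_positives)
    return sensitivity, specificity


# Made-up counts chosen only to reproduce the percentages quoted above:
# a detector that catches 8 of 9 relapse periods and raises false alarms
# in 5 of 20 stable periods.
sens, spec = sensitivity_specificity(true_positives=8, false_negatives=1,
                                     true_negatives=15, false_positives=5)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
# prints: sensitivity = 89%, specificity = 75%
```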
Moreover, despite these encouraging results, the study’s researchers acknowledge that smartphones and digital technology do not fully capture a person’s context and can therefore miss key pieces of information. Much like the low-battery example above, they describe how a phone left on a table could be misinterpreted as inactivity or sleep, which again highlights the problems with taking digital data at face value.
While advocates of digital phenotyping have argued that it allows for a more personalized and objective approach to psychiatric diagnosis and symptom monitoring, others have raised concerns that it contributes to the further dismissal of first-person narratives. Elsewhere, others have critiqued digital mental health technology as erasing the lived experience of psychosis.
Slack and Barclay describe epistemic injustice as it relates to the mental health field:
“In clinical practice, several empirical studies have demonstrated that psychiatric patients’ testimony is frequently dismissed by healthcare professionals in care settings, particularly in relation to patients’ medication preferences. Psychiatric patients are often not viewed as credible knowers about their own subjective experience of mental illness, or of the things that are beneficial or harmful to them such as medication and its side effects.”
They highlight different forms of epistemic injustice, such as testimonial injustice, which consists of discrediting the person due to the social group they belong to, particularly those belonging to marginalized social groups. Individuals with disabilities are often targets of testimonial injustice, as fixation on the disability can lead clinicians to question patients’ self-reports and fail to consider other explanations. As an example, an individual with mobility issues who has a rash that they believe is due to an allergic reaction may be told by their physician that it is because of their arm rubbing against their wheelchair.
Epistemic overconfidence is another form of epistemic injustice and arises from excess credibility given to clinicians’ knowledge. Placing too much faith in their own perceived knowledge can lead to an intellectual arrogance that results in some clinicians failing to think critically about diagnosis, consider alternative explanations, or seek a second opinion.
Hermeneutic injustice arises from structural prejudice, driven by inequities in social power, that leads to a lack of hermeneutic resources, shared tools such as narratives and concepts that help individuals make sense of and communicate their experiences. As an illustration, sexual harassment existed long before the term ‘sexual harassment’ was coined. Cisgender women were largely its targets, and for years they had no lens through which to understand or explain their experiences, until they gained more prominence and power in the workforce and began to develop language around sexual harassment.
Hermeneutic injustice also prevents individuals from marginalized groups, such as those with psychiatric diagnoses, from making contributions to shared epistemic resources, as they are not perceived as reliable sources of knowledge. The authors caution that due to this unfair perception, individuals with psychiatric histories are particularly at risk for experiencing epistemic injustice in relation to digital phenotyping. If patients disagree with or would like to challenge conclusions made as a result of digital data, it is likely that their voices will not be heard, as patients with psychiatric histories are often labeled as lacking insight or “resistant” when they disagree with clinicians’ perspectives.
Slack and Barclay describe an example of this type of testimonial injustice:
“Paul Crichton and colleagues provide an example from Crichton’s experience as a medical student where a young man in an inpatient unit in Munich claimed to be the relative of the then Soviet leader. His testimony was dismissed by the treating psychiatrist as a delusion and evidence of psychosis; it turned out that the patient was, in fact, related to the Soviet leader.”
Additionally, “objective,” scientific data tends to be privileged over lived experience. As such, the authors predict that clinicians will likely overstate the reliability of digital data and treat it as more “objective” than self-reports, allowing it to overshadow patients’ own accounts, especially when the two contradict each other.
This has already been shown to lead to dangerous territory in the medical field. Patients’ testimony has been shown to carry less weight in physicians’ decisions about whether to prescribe medication than information obtained from automated prescription drug monitoring programs, potentially leaving patients without access to needed medication. In the context of psychiatry, favoring digital data over first-person experience could result in harmful outcomes like misdiagnosis, insufficient care, and involuntary commitment or treatment.
Digital phenotyping and favoring seemingly objective data over lived experience could also further worsen hermeneutic injustice. In fact, we are already seeing this trend as people with first-hand experience are typically not included in the development of digital technologies for mental health monitoring and diagnosis. A recent review found that only 4 out of 132 research papers investigating these technologies included people with lived experience in any meaningful way.
The authors conclude by offering recommendations for how digital phenotyping in psychiatry can be utilized ethically and in a way that does not lead to further epistemic injustice. Emphasizing lived experience is a key place to start, as people with first-person understandings of mental health issues offer invaluable knowledge of what it is like to struggle with them, knowledge that no “objective” data could come close to capturing.
The authors point to peer-support groups like Hearing Voices, which offer understandings of the lived experience of psychosis that cannot be captured by those who have not had those experiences: for example, that voices are not just “symptoms” but may communicate something deeper, such as past trauma, or may reflect a gift, a spiritual experience, or an expression of emotional distress.
Listening to and believing people with lived experience allows digital phenotyping to be used as a tool to aid in understanding and treating mental health issues, rather than as an authority that should not be questioned. In addition, individuals with first-person experience need to be included in the development of these tools, as they have expertise that is key to truly understanding the day-to-day experience of psychiatric issues. They can offer insights into symptoms or areas that might be important to monitor, as well as how they would want monitoring to occur if it does.
Taking lived experience seriously and involving individuals with this experience in the development of digital tools is vital to ensuring that digital phenotyping is implemented in psychiatry in an ethical manner, one that respects and privileges the personhood of the people for whom these tools are developed.
Epistemic injustice in the field of psychiatry has been recognized and discussed by others, who have pointed to its coercive nature. While epistemic injustice has been found to be an issue for those with psychiatric histories as a whole, some articles have highlighted the impact it has on particular groups, such as voice-hearers, as well as young people and those experiencing delusions.
A growing push for the inclusion of lived experience in knowledge production, mental health research, and treatment is evident across the literature, and while progress has been made, there is still much work to be done. A recent article pointed to how psychologists with lived experience endure mental health-related stigma, and as emphasized in Slack and Barclay’s article, we are still not including people with first-hand knowledge in the development of tools for understanding and treating mental health issues.
In order to avoid additional marginalization and oppression of vulnerable populations, it is critical that the field of psychiatry make further efforts to integrate people with lived experience as valuable sources of knowledge, especially as we tread further into the territory of digitizing diagnosis and treatment.
****
Slack, S.K., & Barclay, L. (2023). First-person disavowals of digital phenotyping and epistemic injustice in psychiatry. Medicine, Health Care and Philosophy. https://doi.org/10.1007/s11019-023-10174-8
As described, digital phenotyping sounds like surveillance of a prejudicial and very biased kind, with very high stakes for patient safety.
It smells dystopian, totalitarian, oppressive, and very prone to harm.
I find it difficult to wrap my mind around how anyone can give informed consent for diagnosis or treatment based on “digital phenotyping,” let alone for research.