Mobile Apps for Mental Health Lack Transparency in Data Sharing

Research illustrates privacy concerns with how mental health applications collect and share users’ data.

A new study, published in JAMA Network Open, reviews the data management practices of popular mobile mental health applications marketed to help with symptoms of depression and smoking cessation. The researchers found that most of these apps are not adequately transparent about how they manage the data they collect on their users. Most commonly, this means sharing data with third parties like Facebook and Google without properly disclosing the practice. Such concerns are especially relevant given the trend toward online therapy and other new technologies for mental health purposes.

According to the authors of the study, led by Dr. Kit Huckvale, a postdoctoral researcher at the Black Dog Institute:

“The emergence of a services landscape in which a small number of commercial entities broker data for large numbers of health apps underlines both the dynamic nature of app privacy issues and the need for continuing technical surveillance.”

Going further, they note:

“The lack of information provided about data processing jurisdictions observed in this sample suggests that developers may either be unaware of this risk or do not appreciate its significance for potentially sensitive health data.”

With more data collected on each of us today than ever before, how information about mental health is collected, stored, and circulated has become an increasingly pressing concern for service providers and users alike. Many mental health service users have objected to the ways information about their treatment has been shared with government agencies and insurance companies. For some users, this has even led to unanticipated restrictions in other areas of their lives, such as being blocked from entering a foreign country or being denied insurance coverage.

While the Health Insurance Portability and Accountability Act, otherwise known as HIPAA, was passed in 1996 to standardize the protection of Americans’ health data, the procedures for gaining access to one’s own data can prove lengthy, discouraging users from learning what information about them is already available to a growing array of private and public organizations.

Concerns about personal data related to mental health are all the more central to services provided exclusively online. Recent research suggests that e-mental health resources, like mobile apps, are being integrated into treatment in ever newer forms, despite few regulations governing how they can be used. And yet, many of the applications marketed for mental health problems have little to no empirical evidence supporting their effectiveness.

Given that mental health services had already been commodified as an industry before the creation of the internet, it is worth questioning the extent to which online mental health resources will prove to be much more than data collection tools, serving purposes like marketing at best and surveillance and behavioral control at worst.

“Many health care apps label themselves as wellness tools in their privacy policies or terms and conditions in an attempt to circumvent legislation that mandates privacy protections for user data, such as the Health Insurance Portability and Accountability Act,” Huckvale et al. explain.

They further note that such transparency issues have already spurred groups including the US Food and Drug Administration, the UK National Health Service, and the World Health Organization to begin working toward basic standards for regulating how mental health apps operate.

Huckvale and colleagues caution that many obstacles remain to proper oversight of data management protocols for mental health apps. Not only are their terms of agreement typically written in highly technical language, but loopholes like those described above also make it difficult for researchers to assess what has been done with user data that has already been collected. Still, the authors are optimistic that research such as theirs can contribute to greater transparency in how mobile apps marketed for mental health collect and distribute personal information.

For their study, Huckvale et al. analyzed the privacy policies and terms and conditions of 36 different mobile applications that were available for download from either the Android or Apple iOS marketplaces in the United States and Australia on January 14, 2018. They narrowed their search to apps matching the keywords “smoking cessation” or “depression.”

After being downloaded to one of two test devices, each app was tested twice, with all network traffic intercepted using a standard interception technique known as a man-in-the-middle attack. Here, the “destination and content of each transmission were tagged automatically to identify (1) the owner of the destination, whether developer or third party and (2) instances of personal and other user-generated data contained within each message.” The results were then compared with what each app’s privacy policy and terms and conditions said would happen with data collected on its users.
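For readers unfamiliar with this kind of traffic analysis, the sketch below illustrates in Python, using the open-source mitmproxy tool, how intercepted transmissions might be tagged by destination owner and scanned for identifier-like content. It is a simplified illustration rather than the authors’ actual instrumentation; the domain list and identifier markers are hypothetical examples.

```python
# Minimal mitmproxy addon sketch: tag each intercepted request by the owner of
# its destination and flag identifier-like content. Illustrative only; the
# domains and markers below are hypothetical, not those used in the study.
from mitmproxy import http

# Hypothetical mapping of destination domains to third-party owners.
THIRD_PARTY_DOMAINS = {
    "graph.facebook.com": "Facebook",
    "app-measurement.com": "Google",
    "google-analytics.com": "Google",
}

# Hypothetical substrings that would suggest an identifier is being transmitted.
IDENTIFIER_MARKERS = ["advertising_id", "device_id", "android_id", "email"]


def request(flow: http.HTTPFlow) -> None:
    """Called by mitmproxy for every intercepted request."""
    host = flow.request.pretty_host
    owner = next(
        (name for domain, name in THIRD_PARTY_DOMAINS.items() if host.endswith(domain)),
        "developer/other",
    )
    body = flow.request.get_text(strict=False) or ""
    found = [m for m in IDENTIFIER_MARKERS if m in body or m in flow.request.path]
    print(f"{host} -> owner: {owner}; identifiers: {found or 'none detected'}")
```

In practice, a script like this would be loaded with `mitmdump -s tag_traffic.py` while the test device’s traffic is routed through the proxy; HTTPS interception also requires the device to trust the proxy’s certificate.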

Through this analysis, Huckvale and colleagues found that 29 of the 36 apps (81%) sent data to Facebook or Google for analytics and marketing purposes, even though only 17 of those 29 (59%) disclosed third-party sharing with these services as a possibility in their policy documents.

In total, 33 apps sent data to some third party. According to the authors, of these 33, “9 (27%) sent a strong identifier consisting of either a fixed device identifier (8 apps) or a username (1 app); 26 of the 33 (79%) sent weak identifiers, such as an advertising identifier (24 apps), a pseudonymous key that can be used to track user behavior over time and across different products and technology platforms.” Two of the apps collected and shared information about user-reported health conditions. No other personal information, like full names or birth dates, was found to be shared with any third party.

Overall, Huckvale et al. note that “most apps failed to provide transparent disclosure of [data sharing] practices.” This is especially concerning given how little is known about what third parties, like Google or Facebook, in turn, do with mental health data shared with them. As they go on to explain:

“While Google explicitly limits the secondary uses of data collected for analytics and advertising or marketing purposes, Facebook’s developer policy states that ‘We can analyze your app, website, content, and data for any purpose, including commercial.’ Consequently, users should be aware that their use of ostensibly stand-alone mental health apps, and the health status that this implies, may be linked to other data for other purposes, such as marketing targeting mental illness. Critically, this may take place even if an app provides no visible cues (such as a Facebook login), and even for users who do not have a Facebook account.”

The study was limited by its sample, which focused only on the top 10 apps on each platform matching ‘depression’ or ‘smoking cessation,’ a small fraction of the mental health apps available overall. Because the authors could only account for data sent directly to third parties, it is also possible that some app developers shared users’ data further downstream. As such, this research may represent a conservative estimate of how much data is ultimately collected and shared through mobile mental health apps.

Smartphones are steadily evolving in ways that collect greater amounts of information across all areas of our personal lives. Considering the extent to which mental health has already become a global industry, the economic value of mental health data is only likely to increase as transnational companies like Facebook and Google develop new strategies for targeting users with the data they have already collected.

While Huckvale and colleagues express confidence that healthcare communities can work with researchers to develop more ethical practices around how mobile apps are recommended and used for mental health issues, such conversations will likely require levels of technical expertise and critical awareness that, unfortunately, extend well beyond the training of most health professionals today.

 

***

Huckvale K, Torous J, Larsen ME. Assessment of the Data Sharing and Privacy Practices of Smartphone Apps for Depression and Smoking Cessation. JAMA Network Open. 2019;2(4):e192542. doi: 10.1001/jamanetworkopen.2019.2542

***

Mad in America hosts blogs by a diverse group of writers. These posts are designed to serve as a public forum for a discussion—broadly speaking—of psychiatry and its treatments. The opinions expressed are the writers’ own.

***

Tim Beck, PhD
MIA Research News Team: Tim Beck is an Instructor in psychology at the University of West Georgia, where he earned a PhD in Psychology: Consciousness and Society. For his dissertation, he traced a critical history of the biomedical model of mental health, focusing on diagnostic representations of autism, and became interested in the power of self-advocacy movements to reshape conventional assumptions about mental suffering. In fall 2019, he will start a new position as Assistant Professor at Landmark College, where he will collaborate with students and faculty at their Center for Neurodiversity.

4 COMMENTS

  1. I have been concerned about this for a while. Even apps like Sleepio are concerning to me.

    I worked for Crisis Texting Line and quit very quickly. First of all, anyone contacting them is subject to mandatory reporting. Secondly, I did not want to be a volunteer bot. I was unimpressed with the training. Thirdly, this app collects data. In fact, they’re very proud of that, and include it in their advertising.

    There’s also some kind of therapist-for-hire that works via texting. The service is insanely expensive, $75 for a texting therapist? Huh? I tried this a long time ago and was unimpressed and asked for my money back.
