A new study, published in the International Journal of Law and Psychiatry, investigates the mental health app marketplace. The researchers, led by Lisa Parker at the University of Sydney in Australia, conclude that the app industry is not paying sufficient attention to users’ privacy and advocate for greater regulation and enforcement.
“Digital mental health services are increasingly endorsed by governments and health professionals as a low cost, accessible alternative or adjunct to face-to-face therapy,” the authors write. “App users may suffer the loss of personal privacy due to security breaches or common data-sharing practices between app developers and third parties. Loss of privacy around personal health data may harm an individual’s reputation or health.”
Experts have expressed concern about the potential loss of privacy resulting from the use of mental health apps. Mental health information is highly sensitive, and privacy breaches have significant repercussions beyond identity theft and healthcare system fraud. Poor data security on these apps can lead to emotional harms, heightened anxiety, exploitative advertising targeting, and social impacts that affect an individual’s credit rating, employment, and housing.
Numerous studies are emerging that raise concerns about whether mental health apps employ evidence-based practices and about how transparent they are regarding the sharing of user data.
While we may have become accustomed to hearing about high-profile, malicious security breaches and hacks, users are often less aware that common commercial data-sharing practices pose significant risks. Practices such as sharing data with third parties for purposes unrelated to the app’s function, a legal and routine strategy for monetizing free apps, may seem mundane or merely technical.
Under the advertising model popularized by Facebook, an app may be offered to users for free but embed an advertising library that displays in-app ads. The problem is that these libraries are granted the app’s data permissions yet are not bound to protect user privacy. And while data is often shared in an anonymized format, it can be cross-linked with data from other sources that are not anonymous.
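To make the mechanism concrete, here is a minimal Kotlin sketch of how an embedded advertising library typically sits inside an Android app. The `MoodDiaryApp` host app and the `AdsSdk` object with its `init` call are hypothetical stand-ins, not any real vendor’s API; the point is that the library’s code runs inside the app’s own process, so it can exercise every permission the host app has been granted.

```kotlin
import android.Manifest
import android.app.Application
import android.content.pm.PackageManager

class MoodDiaryApp : Application() {  // hypothetical host app
    override fun onCreate() {
        super.onCreate()

        // One call at startup embeds the vendor's code directly into
        // this process alongside the developer's own code.
        AdsSdk.init(this, publisherKey = "example-key")
    }
}

// Stand-in for a third-party advertising SDK (not a real vendor API).
object AdsSdk {
    fun init(app: Application, publisherKey: String) {
        // Running in-process, the library holds whatever permissions the
        // host app holds; Android shows the user no separate consent
        // prompt naming the advertiser.
        val canReadContacts = app.checkSelfPermission(
            Manifest.permission.READ_CONTACTS
        ) == PackageManager.PERMISSION_GRANTED
        // A real library could now read identifiers or files the app can
        // access and transmit them to its own servers, constrained only
        // by its own privacy policy.
    }
}
```

Because the permission grant attaches to the app as a whole, nothing in this flow distinguishes the developer’s code from the advertiser’s.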
Poor privacy protection for app users has relatively obvious consequences for individuals, like eerily targeted ads, but also less apparent, and potentially more serious, consequences for the public. Data aggregated from many people across many sources are used to study societal patterns and to manipulate behavior at scale.
Third parties frequently sell proprietary data analytics and rating software to banks, landlords, universities, and employers, who use it to make inferences about individuals. Worse, awareness of poor privacy has a perverse deterrent effect: the value these services offer goes unrealized when privacy concerns are left unaddressed, and users have little guidance on how to protect themselves.
Through a critical content analysis of promotional materials, privacy policies, and data-sharing practices, the authors sought to identify important privacy issues in the mental health app market that, in turn, affect users’ reputations and health. These findings can then be used to advocate for greater privacy regulation and enforcement of existing regulations, and to advance other user interests.
Results were evaluated against national (Australian) and international policies used to regulate health app privacy, with particular attention to the guidance set out in the Australian Privacy Principles.
They found that apps frequently requested permission to access elements of the user’s mobile device, including so-called ‘dangerous,’ higher-risk permissions. In 2017, the 51 apps that provided permissions information requested an average of 6.4 different permissions (range 1-14). These included permissions that enable the app to access a user’s private information, alter a user’s stored data, or interfere with other apps’ operation.
The two most common ‘dangerous’ permissions, requested by 73% (37/51) of these apps, were to read and to modify or delete the contents of the device’s USB storage. While these permissions are required to save data directly to a device, they also enable developers to read, modify, or delete all files stored on the device, including files containing personal information such as photos, contacts, and text messages.
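As an illustration (not drawn from the study), the Kotlin sketch below shows the standard Android mechanism behind these storage permissions as it worked at the time of the sample: a manifest declaration plus a runtime prompt. The `JournalActivity` class is a hypothetical example app; the key point is that the grant covers the device’s entire shared storage, not a folder scoped to the app.

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

// Manifest declarations behind the Play Store's "read / modify or delete
// the contents of your USB storage" wording:
//   <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
//   <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />

class JournalActivity : AppCompatActivity() {  // hypothetical example app
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        val granted = ContextCompat.checkSelfPermission(
            this, Manifest.permission.WRITE_EXTERNAL_STORAGE
        ) == PackageManager.PERMISSION_GRANTED
        if (!granted) {
            // The system dialog asks broadly about "photos, media, and
            // files"; once the user agrees, the app can read or alter any
            // file in shared storage, not just its own journal entries.
            ActivityCompat.requestPermissions(
                this,
                arrayOf(
                    Manifest.permission.READ_EXTERNAL_STORAGE,
                    Manifest.permission.WRITE_EXTERNAL_STORAGE
                ),
                REQUEST_STORAGE
            )
        }
    }

    private companion object {
        const val REQUEST_STORAGE = 1
    }
}
```

Later Android versions introduced scoped storage to narrow this kind of grant, but apps targeting older API levels could continue to request device-wide access.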
“Some apps requested permissions which were seemingly unrelated to the app’s main purpose as communicated to users. For example, Happify, an app providing ‘guided relaxation/meditation’ audios and ‘science-based activities & games’ requested 18 permissions, including permissions for access to users’ text messages and contacts list.”
Many apps explicitly encouraged users to share their stories and profiles with an online community, sometimes (falsely) advertising a more secure and less judgmental sharing environment than major social media platforms.
Perhaps most worrisome, two in five of the apps (25/61, 41%) did not have a privacy policy to inform users about how and when personal information was collected, retained, or shared with third parties. A privacy policy is a standard, baseline recommendation for user protection, yet policies were missing even from apps recommended by government agencies, including the U.S. Department of Defense.
Only one app of the 61 surveyed met the Australian Government’s minimum standards (which are quite similar to standards elsewhere in the world). These minimum standards for mobile apps include readability and accessibility of the policy; information on data collection, use, sharing, and security; and guidance on how to complain about unsatisfactory privacy practices.
The apps that did have privacy policies were often difficult to read and understand: policies averaged about 2,000 words, and only three to four included readability aids such as summaries and lay language. One-quarter of the sampled apps did not use subject headings, making the policies harder to navigate and to find.
Readability and accessibility issues aside, nearly 90% of the privacy policies did state the type and purpose of collected data, and over three-quarters mentioned disclosure to third parties. About half described how these parties might use the data (e.g., processing, analysis, IT services, fundraising, web development, market research, and services provided by medical consultants).
Most policies contained disclaimers relieving the developers of responsibility for security- and privacy-related harms, especially once data had been passed on to third parties. In fact, many app policies “also referred to user data as a business asset that could be transferred to other companies in the event of acquisition” (p. 202). Needless to say, the policies offer no guidance to users on how to mitigate the effects of lost privacy. With no promise to alert users to changes in privacy practices either, those who rely on mental health apps are on their own in navigating potential issues and concerns.
These findings add to the emerging empirical work showing that mental health apps are neither private nor secure, which may adversely impact users’ well-being. The authors urge major players in the tech industry, app stores, and developers to commit to paying more attention to protecting mental health app users’ privacy. The authors also advocate for increased monitoring and enforcement of privacy principles and practices in mental health apps (and the mobile ecosystem as a whole).
****
Parker, L., Halter, V., Karliychuk, T., & Grundy, Q. (2019). How private is your mental health app data? An empirical study of mental health app privacy policies and practices. International Journal of Law and Psychiatry, 64, 198-204.