Researcher Warns of Abuse and Coercion With ‘App-ification’ of Mental Health Services

Mental health apps may offer increased accessibility, but unregulated private-sector involvement could lead to abuse of power and coercion.


In a chapter that will be included in the Handbook of Mental Health Law, researcher Piers Gooding argues that there is a need for new safeguards to regulate technologies such as remote mental health assessment and mental health apps, as well as the increasing involvement of private companies in mental health systems.

While these new developments can be positive in terms of providing increased access to mental health services, Gooding cautions that the economic and political interests involved in the “app-ification” of mental health could create an environment where abusive coercion could occur. In other words, without regulation, private companies may exploit the power differential between mental health providers and consumers for profit.

“The chapter examines the legal issues raised by the digitization of mental health legislation and takes a political economy perspective to consider the role of the private sector in the emerging configurations of digitized health and social services,” Gooding writes. “The chapter recommends the implementation of safeguards in both the procurement and commissioning of private sector practices related to mental health crisis work, as well as in the proliferation of digital platforms in health and social care services.”

The chapter explores the digitization of mental health from a policy perspective and identifies necessary regulations. To do so, Gooding focuses on three types of technology used in the mental health system of England and Wales: digital platforms for mental health assessments involving involuntary intervention, video call technology to authorize intervention under the Mental Health Act 1983 (MHA), and remote video hearings of mental health tribunals.

In 2018, the British government began the digitization of the MHA to streamline treatment and assessment and to give mental healthcare consumers greater access to information about the MHA, their rights, safeguards, and the treatment process. Shortly after this initiative, many private companies created online tools for use in mental healthcare. Gooding focuses on two such companies, S12 Solutions and Thalamos. S12 Solutions created a system through which social workers, psychologists, and others could communicate almost instantly with approved mental health professionals (AMHPs). Using the same platform, AMHPs could recommend involuntary treatment remotely and submit payment claim forms for their services. S12 Solutions is currently used by 75% of England’s Mental Health Trusts. The chief executive of one National Health Service trust described S12 Solutions as “the Uber of finding doctors for the health service.”

During the Covid-19 pandemic in 2020, the MHA was amended to allow for digital reporting of statutory forms, meaning the entire process of involuntary confinement could be accomplished remotely. Thalamos, a private company similar to S12 Solutions, noted in reference to this weakened regulation that “we even helped change the law.”

A study conducted in England and Wales revealed that proponents of digitization point to three benefits of using apps like S12 Solutions in mental healthcare: they provide information about doctors and their specialties, they make payments to these doctors easier, and they reduce AMHPs’ workload.

Detractors of digitization point to several issues with the system. Dependence on internet access could limit its usefulness. It may make AMHPs less likely to involve doctors who know the patient well, instead choosing the most convenient doctor available through the app. Some AMHPs and doctors may be reluctant to use the app, preferring known and trusted sources instead. Technical problems could slow the system.

During the beginning of the Covid-19 pandemic, practitioners in England and Wales began using video call assessments to authorize involuntary treatment for mental health patients, despite provisions of the MHA requiring that patients be “personally seen” and “personally examined” by AMHPs and doctors. In 2021, the High Court of England and Wales ruled that video calls did not meet the “personally seen” and “personally examined” requirements laid out in the MHA.

The British Association of Social Workers surveyed AMHPs in 2020, while video calls were still being used to adjudicate involuntary treatment. The survey revealed three main benefits of the remote system: less risk of Covid-19 infection, the ability to conduct assessments over large distances, and access to patients residing in care homes that did not allow visitors. The survey also revealed concerns from many AMHPs about using this technology: some practitioners may use remote assessments for convenience rather than necessity, and non-verbal cues and behaviors cannot be adequately assessed remotely, hampering effective communication.

Video conferencing has also been used in tribunals to authorize involuntary treatment. While research has shown that factors like discharge rates seem unaffected by remote hearings, individual patients may still be marginalized. For example, people with privacy concerns may be unwilling to participate in hearings conducted by phone. Many practitioners and patients also experience technical issues during remote calls. About 60% of respondents in one survey indicated they had experienced technical problems while using remote systems.

With the rapid rise of technology in the mental health field and the corresponding influx of private, for-profit businesses, some practitioners have raised concerns about the sharing of personal medical information and the inability of current regulations to ensure accountability in an increasingly technology-infused field.

There is also a lack of clearly defined roles for private companies in using these technologies, which could lead to for-profit abuse of vulnerable populations. Gooding gives the example of Serenity Integrated Mentoring (SIM) to demonstrate how these technologies can harm service users. SIM was a program run by police and public mental health services in England. The program, instituted by a private company, identified some service users as “high-intensity users.” Once a person was identified as a high-intensity user, they could be denied care, prevented from seeing psychiatrists, and prematurely discharged. Some of these users were warned to stop calling for help or self-harming, or face prosecution.

In other words, this program criminalized some instances of acute psychiatric crises. Despite its potential for abuse and patient harm, the SIM program was instituted by a private company with little to no assessment of its impact on patient safety. The assessments used to justify the SIM program were based almost entirely on its cost-effectiveness.

Gooding suggests that policymakers need to pay special attention to the safety of service users and the accountability of businesses and providers in the increasingly technology-infused mental health field.

Gooding concludes:

“Global economic downturns and fiscal constraints will increase pressures on health systems to minimize costs, to which technological solutions will be invariably proposed. A common appeal lies in expected efficiency gains and improved services for those who want or are deemed to need them. It is hard to object to cutting down wait times in service provision for those in extreme crisis or allowing a person to seek advocacy or access remote tribunal hearings where it suits them, including through high-quality internet facilities in acute public mental health settings. At the same time, many of these aims are tied to economic and political interests concerned with cost minimization, privatization, and the generation of capital through data accumulation and the uptake of apps, driven by expectations of the economic value attributed to platforms and data.”

Research has found that mental health apps likely lead to overdiagnosis and lack an evidence base for their effectiveness. These apps also fail to protect user privacy and are not transparent in their data-sharing policies.



Gooding, P. (forthcoming), ‘”Digitising the Mental Health Act”: Are we facing the app-ification and platformisation of coercion in mental health services?’ In M. Donnelly and B. Kelly (eds) Handbook of Mental Health Law (Routledge).