Crisis Text Line Shares Data With For-Profit Spinoff, Raising Ethical Questions

From Politico: “Crisis Text Line is one of the world’s most prominent mental health support lines, a tech-driven nonprofit that uses big data and artificial intelligence to help people cope with traumas such as self-harm, emotional abuse and thoughts of suicide.

But the data the charity collects from its online text conversations with people in their darkest moments does not end there: The organization’s for-profit spinoff uses a sliced and repackaged version of that information to create and market customer service software.

Crisis Text Line says any data it shares with that company, Loris.ai, has been wholly ‘anonymized,’ stripped of any details that could be used to identify people who contacted the helpline in distress. Both entities say their goal is to improve the world — in Loris’ case, by making ‘customer support more human, empathetic, and scalable.’

In turn, Loris has pledged to share some of its revenue with Crisis Text Line. The nonprofit also holds an ownership stake in the company, and the two entities shared the same CEO for at least a year and a half. The two call their relationship a model for how commercial enterprises can help charitable endeavors thrive.

. . . But [Jennifer King, a privacy and data policy fellow at Stanford], called it ‘ethically questionable’ to make commercial use of this kind of data — even if it’s anonymized — given the emotional stress that people are under when presented with a link to terms of service they may never open. (‘By texting further with us, you agree to our Terms,’ says the automated first message.)

‘We’re seeing more and more how often data online is not just my shopping history; it’s a real glimpse into my psyche,’ King said. ‘These are people at their worst moments. Using that data to help other people is one thing, but commercializing it just seems like a real ethical line for a nonprofit to cross.’”

Article →

***
