Suicide Hotline Left Ethics Board Out Of The Loop About Data-Sharing With For-Profit Spinoff
Insights gleaned from nearly 200 million of Crisis Text Line's messages had helped the company develop AI-powered tools aimed at assisting customer service agents in live chats with upset customers.
"We utilize our extensive experience working through the most challenging conversations in the crisis space to create a machine-learning based software platform that helps companies handle their hard customer conversations with empathy," Loris says on its website.

Yet independent privacy and ethics experts expressed concerns. "Someone seeking help in a crisis shouldn't have to worry about their data being sold for a giant corporation's profit," as one critic put it. Some argued that Crisis Text Line may not have gotten meaningful consent from texters, who they said were unlikely to read some 50 paragraphs of disclosures in the midst of an emergency, and that making commercial use of this particular data, even if anonymized, was wrong.
Carr had demanded that Crisis Text Line and its for-profit spinoff end their "disturbingly dystopian" sharing of sensitive mental health data. And though he welcomed their eventual decision to cease data-sharing, Carr said in an email last week that "questions remain regarding their current practices" and that as a result, "my office is continuing to meet with Crisis Text Line leadership to ensure that its data practices comply with the law."