Article by Emma Bedor Hiland: “Crisis Text Line was supposed to be the exception. Skyrocketing rates of depression, anxiety, and mental distress over the last decade demanded new, innovative solutions. The non-profit organization was founded in 2013 with the mission of providing free mental health text messaging services and crisis intervention tools. It seemed like the right moment to use technology to make the world a better place. Over the following years, the accolades and praise the platform received reflected its success. But its sterling reputation was tarnished overnight at the beginning of 2022, when Politico published an investigation into the way Crisis Text Line had handled and shared user data. The problem with the organization, however, goes well beyond its alleged mishandling of user information.
Despite Crisis Text Line’s assurance that its platform was anonymous, Politico’s January report showed that the company’s private messaging sessions were not. Data about users, including what they shared with Crisis Text Line’s volunteers, had been handed over and sold to an entirely different company called Loris.ai, a tech startup that specializes in artificial intelligence software for human resources and customer service. The report brought to light a troubling relationship between the two organizations: both had previously been headed by the same CEO, Nancy Lublin. In 2019, however, Lublin stepped down from Loris, and in 2020 Crisis Text Line’s board ousted her following allegations that she had engaged in workplace racism.
But the troubles that enveloped Crisis Text Line can’t be blamed on one bad apple: Crisis Text Line’s own board of directors had approved the relationship between the two entities. In the technology and big data sectors, the commodification of user data is fundamental to a platform or toolset’s economic survival, and by sharing data with Loris.ai, Crisis Text Line was able to keep providing needed services. The harsh reality revealed by the Politico report was that even mental healthcare is not immune from commodification, despite the risks of aggregating and sharing information about experiences and topics that remain stigmatized.
In the case of the Crisis Text Line–Loris.ai partnership, Loris used the nonprofit’s data to improve its own for-profit development of machine learning algorithms sold to corporations and governments. Although Crisis Text Line maintains that all of the data shared with Loris was anonymized, the relationship between the two was still fundamentally a transactional, economic one. As the Loris.ai website states, “Crisis Text Line is a Loris shareholder. Our success offers material benefit to CTL, helping this non-profit organization continue its important work. We believe this model is a blueprint for ways for-profit companies can infuse social good into their culture and operations, and for nonprofits to prosper.”…(More)”.