
Therapy app Talkspace mined user data for marketing insights, former employees allege

A report from the New York Times raises questions over how the teletherapy startup Talkspace handles user data.

Key Takeaways
  • In the report, several former employees said that "individual users' anonymized conversations were routinely reviewed and mined for insights."
  • Talkspace denied using user data for marketing purposes, though it acknowledged that it looks at client transcripts to improve its services.
  • It's still unclear whether teletherapy is as effective as traditional therapy.

A recent report raises questions over how the mobile therapy company Talkspace handles user data.

Launched in 2014, Talkspace is an app that provides its users access to licensed therapists through text, audio, and video. To access therapy, users fill out a questionnaire to get “matched” with a therapist, and then, for prices starting at about $200 per month, they begin chatting with their therapist through whichever medium they prefer.

This model can help make therapy more accessible for people short on time or money. But according to a new report from The New York Times, former employees say that “individual users’ anonymized conversations were routinely reviewed and mined for insights,” and that Talkspace data scientists “shared common phrases from clients’ transcripts with the marketing team so that it could better target potential customers.”
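The report does not describe how such "common phrases" would be surfaced, but the kind of frequency analysis it alleges can be sketched in a few lines. The function and sample data below are purely hypothetical and are not based on any actual Talkspace code:

```python
from collections import Counter

def common_phrases(transcripts, n=2, top=3):
    """Count the most frequent n-word phrases across a set of transcripts."""
    counts = Counter()
    for text in transcripts:
        words = text.lower().split()
        # Slide an n-word window over each transcript and tally the phrases.
        counts.update(" ".join(words[i:i + n]) for i in range(len(words) - n + 1))
    return counts.most_common(top)

# Invented example inputs, standing in for anonymized transcript text.
transcripts = [
    "i feel anxious about work",
    "work makes me feel anxious",
    "i feel anxious most days",
]
print(common_phrases(transcripts))
```

Even on anonymized text, output like this ("feel anxious" appearing most often, for instance) is exactly the kind of aggregate signal a marketing team could use for ad targeting, which is why the former employees' claim raised privacy concerns.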

The company denied using client conversation transcripts for marketing purposes, but did acknowledge that “data science and clinical leadership will from time to time share insights with their colleagues.”

Former employees also questioned the legitimacy of the company's interventions in client-therapist interactions. For example, after one therapist sent a client a link to an online anxiety worksheet, a company representative instructed her to keep clients inside the app.

“I was like, ‘How do you know I did that?'” Karissa Brennan, a therapist who worked with Talkspace from 2015 to 2017, told the Times. “They said it was private, but it wasn’t.”

Other former employees said the company would pay special attention to its “enterprise partner” clients, who worked at companies like Google. One therapist said Talkspace contacted her for taking too long to respond to Google clients.

Talkspace responded to the Times with a Medium post, which claimed the Times report contained false and “uninformed assertions.”

“Talkspace is a HIPAA/HITECH and SOC2 approved platform, audited annually by external vendors, and has deployed additional technologies to keep its data safe, exceeding all existing regulatory requirements,” the post states.

However, if the claims in the Times report are true, Talkspace may have violated the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule, which prohibits providers from disclosing patients’ medical data for marketing purposes, unless the patient gives authorization.

“If it is true that Talkspace used information from private therapy sessions for marketing purposes, that is a clear violation of trust with their customers,” Hayley Tsukayama, Legislative Activist from the Electronic Frontier Foundation, told Salon. “All companies should be very clear with their customers about how they use personal information, make sure that they don’t use information in ways that consumers don’t expect, and give them the opportunity to withdraw consent for those purposes on an ongoing basis. Talkspace trades on its trustworthiness and mentions privacy frequently in its ad campaigns. Its actions should be in line with its promises.”

(It’s also worth noting that Talkspace recently threatened legal action against a security researcher who wrote a blog post outlining the potential discovery of a bug that allowed him to get a year’s subscription for free. A report from TechCrunch notes that Talkspace rejected the findings, and that the company does not offer a way for researchers to submit potential security bugs.)

Beyond privacy concerns, the report also raises questions about the efficacy of teletherapy, especially within a corporate model.

“The app-ification of mental health care has real problems,” Hannah Zeavin, a lecturer at the University of California and author of an upcoming book on teletherapy, told the Times. “These are corporate platforms first. And they offer therapy second.”

The main problem with judging the efficacy of teletherapy is the lack of solid research — it’s too new to comprehensively compare it with in-person therapy. Still, some studies suggest it could be useful for at-risk populations, or for people in the wake of a disaster.

But others remain skeptical.

“Maybe [teletherapy] products and services are helpful to certain people,” said Linda Michaels, a founder of the Psychotherapy Action Network, a therapists’ advocacy group. “But it’s just not therapy.”

Proper therapy or not, it’s worth considering how platforms like Talkspace use — and possibly even depend on — user data. In a 2019 opinion piece published in the Times, Talkspace co-founder Oren Frank wrote:

“The vast amount of information each of us possesses is far too important to be left under the control of just a few entities — private or public. We can think of our health care data as a contribution to the public good and equalize its availability to scientists and researchers across disciplines, like open source code. From there, imagine better predictive models that will in turn allow better and earlier diagnoses, and eventually better treatments.


Your health care data could help people who are, at least in some medical aspects, very similar to you. It might even save their lives. The right thing to do with your data is not to guard it, but to share it.”

Would you?
