AI can crack open the 'black box' of effective mental health therapy at scale, study finds

As artificial intelligence advances in different areas of healthcare, there are concerns that technology and AI-based chatbots will replace the human connections between patients and practitioners.

But a new study finds promising potential for AI and large language models to enhance mental health therapy at scale by analyzing millions of text-based counseling messages to shine a light on what works.

Researchers used AI to analyze more than 20 million text messages from counseling sessions and successfully predicted patient satisfaction and clinical outcomes, according to a study published this week in JAMA Network Open.

It marks the largest-ever study of its kind, according to the researchers, who used AI models to evaluate more than 160,000 anonymized, text-based counseling sessions comprising 20 million messages, with millions of labeled utterances, setting a new benchmark in mental health research.

The goal of the study wasn't to evaluate AI as a provider of mental health counseling or as a replacement for therapists. Rather, researchers used AI models to examine how humans provide behavioral health care and how those interactions affect patient satisfaction, engagement and clinical outcomes.

By using AI responsibly, researchers can unlock how the content of mental health conversations predicts key clinical and patient satisfaction outcomes, according to the study.

"What we want to do is use AI or machine learning to enhance the therapeutic work that is happening with a person and provide more tools, more exploration and more information about what's happening in the session to then build confidence, compassion and skills in therapists so that they're having better sessions with their clients."—Christina Soma, Ph.D., co-author of the study

A key finding of the study is a direct correlation between empathetic counseling and improved patient satisfaction and clinical outcomes, as measured by widely used indicators, researchers said. And, it turns out, AI can help strengthen the connection between therapists and patients.

Along with patient satisfaction and engagement with treatment, the study also looked at patient outcomes based on symptom reduction. The study findings indicate that supportive counseling ultimately led to better outcomes.

"You could describe it as kind of like happier care, longer care, better care, which are three very important outcomes," Derrick Hull, Ph.D., Talkspace's clinical research director, said in an interview. "It's the human element and the ability of the therapist to convey that warmth, empathy, genuine curiosity, and insight on the part of the therapist to help the patients move forward, that are most important. It suggests that AI can help therapists bring those skills more to the front."

Health tech company Lyssn.io, an AI-based quality assurance and clinician training platform, teamed up with online therapy company Talkspace on the study.

New York City-based Talkspace launched in 2012 and provides asynchronous, text-based therapy. The company covers approximately 113 million lives as of September 30, 2023, through partnerships with employers, health plans and paid benefits programs.

Talkspace's platform connects patients with a network of contracted and employed licensed mental health professionals, including psychologists, clinical mental health therapists, marriage and family therapists, social workers and psychiatrists.

For the research, Talkspace provided anonymized, consented patient data, which was then analyzed by Lyssn's AI platform.

Text-based counseling has expanded access to mental health services for patients. But, as with traditional in-person treatment, the quality of treatment provided and patient outcomes can vary. Quality assurance is complicated by the lack of scalable methods for assessing therapy quality.

Historically, the evaluation of psychotherapy has relied on observational coding systems that are labor-intensive, expensive at scale and often not used, according to the study researchers.

AI and large language models open up new approaches to evaluating mental health therapy and the quality of conversations between clinicians and patients, to find out what works best for improving outcomes, engagement and patient satisfaction. These findings can then be used to enhance clinician training, the study researchers say.

"With the emergence of digital health platforms, there's more health data available than really ever before and higher quality data than ever before. That's both an opportunity and a curse," Hull said.

"The opportunity is, yay, lots and lots of data. The problem is the way psychotherapy process research has typically been done is a bunch of professors and graduate students sitting in a room, laboriously going over transcripts, highlighting things, underlining things and classifying things. That's not going to work with nearly 21 million messages," he noted. "The opportunity to work with an organization like Lyssn that's invested so much in building AI tools that can classify particular types of interventions was super exciting to me. It's an opportunity to figure out, 'OK, how do we turn the lights on around what works in psychotherapy at a very large scale?' To avoid some of the downsides in the past where particular researchers have certain things they're looking for so they prioritize their efforts on those one or two interventions that they prize the most. It's very hard to look at all the interventions at once without some kind of technology tool."

Christina Soma, Ph.D., was one of those graduate students engaged in psychotherapy research. She is a post-doctoral research fellow at Lyssn and a co-author of the study.

Lyssn started as a university-based research project into how to use technology to analyze talk therapy and counseling sessions, David Atkins, Ph.D., Lyssn CEO and co-founder, told Fierce Healthcare back in October.

"What my research has focused on for the last 10 years is figuring out how do we move forward from being in a room highlighting things and trying to reach conclusions about a relatively small piece of data to moving forward to look at processes and moment-to-moment interactions," Soma said in an interview. "What does that say about outcomes and symptom changes? What does that say about the impact that the therapist is having on the interaction and different interventions that they're using on interactions? And then how do we incorporate different technologies like AI and machine learning?"

The study pairs Lyssn's AI models with Talkspace, an "organization that is doing community-focused research with thousands of people, millions of sessions and 20 million messages," Soma said.

"We get to use all of the technology that I've been working with for the last 10 years and we get to do this huge, large-scale study and provide meaningful feedback to therapists about what is happening in their sessions and try to crack open the black box of therapy, " she noted. "There are millions of moments that we can choose from that inform what's happening in therapy."

According to Hull, the study was 10 years in the making. Over the past decade, Talkspace has expanded its message-based mental health platform and has supported research by tracking data on patient outcomes.

"We track outcomes data which mostly focuses on depression and anxiety symptoms. We also look at things like platform usage, how many messages are sent? And then there's the textual data itself, what's sent between the therapist and the patient," he said.

He added, "The textual data needs to be handled with great care. We invested many years in developing what are called 'scrubs' that will go in and de-identify the data, removing any names, places and dates and in that way, we can create a large dataset that's anonymous that can be used for research like this. We keep it on our servers so that it's secure and then allow encrypted access from the Lyssn team for the analyses to take place."
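Hull's description of those "scrubs" is high level. Purely as an illustration, a rule-based de-identification pass could look something like the Python sketch below; the patterns and placeholder tokens are assumptions made for this example, not Talkspace's actual pipeline, which would also rely on named-entity recognition and review to catch names and places that simple rules miss.

```python
import re

# Illustrative regex-based "scrub". A production de-identification pipeline
# would pair rules like these with named-entity recognition and human review
# to catch names, places and other identifiers that simple patterns miss.
PATTERNS = {
    "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "[PHONE]": re.compile(r"\b(?:\+?1[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}\b"),
    "[DATE]": re.compile(
        r"\b(?:\d{1,2}[/-]\d{1,2}[/-]\d{2,4}"
        r"|(?:Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)[a-z]*\.? \d{1,2}(?:, \d{4})?)\b"
    ),
}


def scrub(message: str) -> str:
    """Replace direct identifiers in a message with placeholder tokens."""
    for token, pattern in PATTERNS.items():
        message = pattern.sub(token, message)
    return message


print(scrub("Saw my sister on March 3, 2023. Call me at 555-123-4567."))
# -> Saw my sister on [DATE]. Call me at [PHONE].
```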

Researchers used Lyssn's AI models to assess the 20 million therapy text messages for 54 different interventions and content areas, such as motivational interviewing and cognitive behavioral therapy.

"We also have what are called topic codes and these are very well-fleshed-out utterance-level codes that tell us what is happening in the conversation. Is there a discussion about substance use? Is there a discussion about family relationships? We have a lot of different utterance-level and topic codes that when the data goes through our developed AI models, then we get what's actually happening in the moment-to-moment and utterance level during these messages," Soma said. "We then linked those to this symptom and outcome data. So, how satisfied are people with treatment? How long are they engaged in treatment?"

After parsing through all those messages and data, the study findings suggest that components of supportive counseling, such as asking open-ended questions and making reflective listening statements, may be key factors in the success of asynchronous text-based counseling.

"Therapeutic skills that are the bread and butter of psychotherapy, so empathy, solid listening skills, like asking open-ended questions, and reflective listening statements, the things that therapists are doing to connect with clients, the findings show that those were related to positive outcomes, which was higher satisfaction and then longer engagement and treatment," Soma said. 

The study also enabled mental health researchers to conduct a large-scale exploration of a big dataset using AI, she noted.

"For me, that replaces the thoughts that folks are having that AI and mental health are only being used to create chatbots, and it's going to take the jobs of therapists and people think they are going to be doing therapy with a robot instead of with a real person," she said. "This changes the conversation to say that what we want to do is use AI or machine learning to enhance the therapeutic work that is happening with a person and provide more tools, more exploration and more information about what's happening in your session to then build confidence, compassion and skills in therapists so that they're having better sessions with their clients. And that could lead to clients sticking around longer during their therapeutic process because they feel more connected to their therapist."

Bedside manner is important when addressing physical health issues, such as during a patient visit with a primary care doctor or specialist. But whether a medication prescribed to the patient is effective does not depend on the doctor who prescribes it, Hull noted.

Within the mental health community, there are conversations and studies on which therapeutic technique is most effective. The study results indicate that in psychotherapy, "the people in the room matter a great deal to the outcome," he said.

"I think these data, at a very large scale and with the depth of all the different codes that are being looked at, really helps to show that the human element is so important," he said. "The structural implication of this study is that humans still matter and AI can help."

The study findings also open up opportunities to use AI to advance mental health clinician training, Hull said.

"With therapists, this data helps to suggest and also offer very specific opportunities for training around how to maintain our humanity and that warmth and empathy even as we're deploying these evidence-based techniques, which I think is very, very exciting. I think tools like this may help to accelerate, refine and improve training opportunities for therapists," he noted.