Northwell invests in Hume AI's nonverbal voice assessment with an eye toward understanding pain, depression

Northwell Holdings today announced an investment in Hume AI to support the development of machine learning tools that better understand vocal and nonverbal communication.

The $3 million strategic investment by the venture capital arm of Northwell Health aims to meet the increasing demand for tools that understand language and nonverbal expression in healthcare. Natural language understanding (NLU) is one of many digital frontiers expanding in the healthcare space.

According to the company, what Hume and NLU offer beyond the simple vocabulary analysis of natural language processing is an assessment of tone, rhythm and timbre in a patient’s speaking voice, along with nonverbal expressions such as laughter, sighs and gasps.

“With Hume, what we are so excited about is this breakthrough technology, a tool that is going to be able to advance the way that we deliver care and perhaps conduct operations throughout clinical and business enterprises,” said Rich Mulry, CEO and president of Northwell Holdings. “The other huge factor for us in deciding to invest was the broad evidence-based population upon which the algorithms are built. The company went to extensive lengths to obtain data in multiple countries throughout the world, so we believe that is going to help create a much stronger product that's less prone to bias or error.”

Hume works principally as a measurement tool gauging vocal expressions like laughs, speech impediments and interjections. Hume can similarly assess facial expressions. As a research-first company, it works with scholars like Daniel Barron, M.D., Ph.D., faculty member at Harvard Medical School and author of "Reading Our Minds: The Rise of Big Data Psychiatry," to unlock the secrets of nonverbal communication.

Hume founder and CEO Alan Cowen, Ph.D., sees the artificial intelligence and machine learning company as providing a catalyst for broad growth within healthcare and beyond via research and development partnerships. He foresees the tool helping create more accurate diagnoses and subdiagnoses, leading to more precise screening of patients, matching with the right healthcare providers sooner, tracking treatment progress, predicting health crises and creating better speech aids for the speech impaired.

“But these are very translational areas of application, and there are already companies and labs and technologies available to try to do these things; but right now, they're just using large language models,” Cowen said. “Where we can help is by providing the other information that's absent from large language models, which is nonverbal expression. Where there's metrics of treatment progress of patient well-being and health, we can see if our technologies predict those metrics first and then provide potential interventions that will enable applications to improve those outcomes.”


Methods of Hume’s AI and ML emotion assessments

Hume provides universal measurement of the movement of specific facial muscles and nonverbal sounds. Partners using the tools can then link those measurements to self-reported outcomes in different patient populations.
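
As a rough illustration of that two-step workflow, the sketch below pairs invented expression measurements with synthetic self-reported pain ratings and fits a simple linear model. The feature names, data and model choice are assumptions made for the example; this is not Hume's method or Northwell's analysis pipeline.

```python
# Illustrative sketch only: hypothetical features and synthetic data,
# standing in for the "measure, then link to self-reported outcomes" workflow.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Hypothetical expression measurements per patient encounter
# (e.g., sigh intensity, gasp intensity, a facial-muscle movement score).
n_patients = 200
expression_features = rng.uniform(0, 1, size=(n_patients, 3))

# Hypothetical self-reported pain ratings (0-10 scale), loosely tied to the
# features plus noise, standing in for outcomes gathered in one population.
pain_ratings = (
    6 * expression_features[:, 0]    # sigh intensity
    + 3 * expression_features[:, 1]  # gasp intensity
    + 1 * expression_features[:, 2]  # facial movement score
    + rng.normal(0, 1, n_patients)
).clip(0, 10)

X_train, X_test, y_train, y_test = train_test_split(
    expression_features, pain_ratings, test_size=0.25, random_state=0
)

# Link the objective measurements to the self-reported outcome.
model = LinearRegression().fit(X_train, y_train)
print("Held-out R^2:", round(r2_score(y_test, model.predict(X_test)), 2))
```

In this framing, the measurement layer stays the same across populations, while the linking step is redone for each patient group being served.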

Hume is currently running global experiments where subjects tie their own emotions to measured auditory and facial outputs. The company’s research currently reflects populations in the U.S., India, China, Venezuela, South Africa and Ethiopia.

With purposeful, broad experimentation, Cowen believes the Hume algorithm avoids the “confounding factors” that would otherwise be present in AI trained on perceptual ratings of a limited set of expressions scraped from the internet. Cowen says the addition of nonverbal cues allows for subtlety and variance “as opposed to kind of a remote reductive way, which has been the trend until recently.”

Prior technologies reflect older models of emotional expression, Cowen says, focusing on the expression of six mutually exclusive emotions—happiness, sadness, anger, disgust, fear and surprise. Chief scientific adviser of Hume, Dacher Keltner, Ph.D., and Cowen have published scholarship positing more nuanced theories of human expressive behavior.

Under what they have coined semantic space theory, complex patterns in language, speech prosody, nonverbal vocalizations like sighs and chuckles, and precise movements of facial muscles convey emotions beyond the “basic six.”

Hume breaks down emotions into 53 mutable categories. In assessing vocalizations, a patient’s sigh can register relief, realization, tiredness and pain, and a gasp can register fear, horror and distress, each broken down by percentages. The tool also accounts for cultural variance.
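
To make that percentage breakdown concrete, the short sketch below normalizes invented category scores for a single sigh into percentages and pulls out the strongest ones. The category names, scores and helper function are hypothetical and are not part of Hume's actual tooling or output format.

```python
# Illustrative sketch only: category names and scores are invented for the example.
from typing import Dict, List, Tuple


def top_emotions(scores: Dict[str, float], k: int = 3) -> List[Tuple[str, float]]:
    """Normalize raw category scores to percentages and return the k strongest."""
    total = sum(scores.values())
    percentages = {name: 100 * value / total for name, value in scores.items()}
    return sorted(percentages.items(), key=lambda item: item[1], reverse=True)[:k]


# Hypothetical scores for a single sigh, spread across a few of the many categories.
sigh_scores = {
    "relief": 0.42,
    "realization": 0.21,
    "tiredness": 0.18,
    "pain": 0.11,
    "contentment": 0.08,
}

for name, pct in top_emotions(sigh_scores):
    print(f"{name}: {pct:.0f}%")
```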

In the journal Nature, Hume researchers assessed 16 facial expressions that occur in similar contexts worldwide by analyzing 6 million videos from 144 countries with the Hume AI platform. The study found that facial expressions were 70% preserved across 12 world regions.

“We publish on what's shared and what's different, what’s unique to specific cultures so that people know they need to fine tune the models if they're going to deploy it in a particular population,” Cowen said. “What we do first is provide the measures of facial movements; regardless of culture, your face still has the same muscles. And the same is true for voice, we all have the same articulators. We can provide an objective representation or measurement of those things, but then it's extremely important, in the population that you're serving, to link those measurements to specific self-reported outcomes, health outcomes, etc., and do that additional research on top.”


Healthcare applications of nonverbal AI assessment

Once an objective measurement of the behavior is collected, research partners can help find broader uses for the data. Researchers like Barron, who also holds the title of director of the pain intervention and digital research program at Brigham and Women's Hospital, can use the data to quantify subjective experiences like pain.

“Right now, there are two ways that physicians gauge the degree of a patient's pain or fear: rating scales, which are very subjective, and through nonverbal indicators,” Cowen said. “Physicians rest a lot on nonverbal indicators, but they have no way of going back and saying, ‘This was the measurement.’”

Northwell’s funding will be used by Hume to refine machine learning models for health applications, such as clinical research, patient screening and accessibility technology.

NLU is already being adopted in healthcare settings with the goal of better understanding pain, major depression and cognitive impairment. Without the addition of nonverbal assessment, current NLU tools are especially limited in a healthcare setting.

“With nonverbal communication in mind, there's a number of populations within healthcare who can benefit from this technology: people with autism or other neurological disorders, ALS,” Mulry said. “We just have patients across the board where we may gain insights into really how we can better address their care needs.”

This summer, Northwell Health penned a partnership with Google Cloud to leverage its AI and ML capabilities to develop novel predictive analytics. Northwell said it planned to use the technology to enhance digital scheduling, automate payer interactions and offer patients intelligent summaries of medical information.

“Because of Northwell’s size and commitment to innovation, we're particularly well suited to work with Hume, especially at this early stage,” Mulry said. “We're willing to try to be a strategic partner and help prove those use cases. We have a large diverse patient population; we have a large research enterprise. By coupling those two and working with Hume, we would be able to advance and validate those use cases so they could end up in production.”

Just as with Hume, Northwell hopes Google AI can offer providers diagnostic support. With a diverse population, the health system has emphasized the unique importance of technology that can facilitate equitable care while also protecting patient privacy.

As New York state’s largest healthcare provider and private employer, Northwell believes it is well positioned to build patient, provider and payer trust in AI and in Hume’s advancements.

“I think clinicians welcome the added insight AI can provide, but they want to be able to see that this has been tested rigorously and it's going to first 'do no harm' and then secondly, help advance patient care,” Mulry said. “So having clinicians as part of the solution to build it makes Hume’s AI an enormously more powerful solution. It accelerates the time to market, and I think it helps validate to the clinical communities that physicians have been involved and that we've addressed many of the concerns that they might have.”