There are thousands of digital mental health apps on the market that offer tools for stress relief, meditation guides and services to help with more serious conditions like anxiety. Most of them promise to help users feel happier and healthier.
Some experts are concerned about the quality standards of many of these apps and about the claims they make regarding the effectiveness of their services.
Federal regulators have moved against digital health tools that make bold health claims.
In 2016, Lumos Labs, the company behind the Lumosity “brain training” program, agreed to pay $2 million to settle Federal Trade Commission charges alleging deceptive advertising of its games. The company had claimed that Lumosity games could reduce or delay cognitive impairment associated with age and other serious health conditions.
There are also data privacy and security concerns with digital health tools in general, and the problem is even more alarming with mental wellness apps.
“A lot of the security and privacy that you would expect when talking to a psychiatrist may not apply to these apps,” John Torous, M.D., director of the digital psychiatry division at Beth Israel Deaconess Medical Center in Boston, told Fierce Healthcare. “Information such as if you feel suicidal or what medications you are taking, it can be hard to know where the data is going and who has access to it.”
A Consumer Reports investigation evaluated seven popular mental health apps to see what happens to users’ personal information when they start using the apps. Researchers found that many of the companies’ privacy policies don’t make clear what kind of data could be shared or how it could be used.
Consumers have an understandable mistrust of apps and potential data misuse, so mental health companies need to set a high bar on privacy and security and be transparent about how data is used, said Connie Chen, M.D., chief medical officer at digital mental health company Lyra Health.
“One of CEO David Ebersman’s first executive hires was a chief information security officer. From day one, Lyra Health has, from an internal controls and processes perspective, re-emphasized that client data is sacred and protected at all costs. That data should be segmented, and the fewest possible people, only those with an absolute need and documented justification, should have access to more sensitive pieces of data,” she said.
To help consumers, clinicians and other stakeholders make informed choices, Torous and a team of researchers launched a searchable online database, called MIND (M-Health Index and Navigation Database), that evaluates apps based on cost, features, privacy policies and scientific evidence. That database draws on an evaluation framework that has been endorsed by the American Psychiatric Association.
Another app evaluation site called One Mind PsyberGuide reviews apps against rating criteria developed by experts in the field. The organization also offers a guide to digital mental health apps for employers, which it developed in collaboration with the Northeast Business Group on Health.
Despite these efforts to offer more guidance, Torous, who is also an associate professor of psychiatry at Harvard Medical School, believes there needs to be more oversight of digital mental wellness apps.
“The space is so chaotic with concerns about privacy and lack of evidence,” he said. “The FDA (Food and Drug Administration) has put out guidance saying that for a lot of mental health apps, it’s going to use enforcement discretion. If the risk appears low, the FDA is not going to enforce or chase these companies. There are so many apps, it’s not feasible for them to keep up with all of them.”
“I think an easy starting point could be around the types of claims that companies make and making sure that they are backed by data and meaningful measurement,” Chen said. “And this is especially important in the consumer space, where it’s more of a 'wild, wild West.'"
Digital mental health companies need to be held to strong standards of research and outcomes, but regulatory oversight needs to balance the “art and science” of mental health care, said Myra Altman, Ph.D., a clinical psychologist and the vice president of clinical care at digital mental health company Modern Health.
“Where it becomes risky is if we try to over-medicalize mental health models and say this has to fit into exactly what we would see in a diabetes-type program, where we can distill this down into an FDA-approved medication. I don’t know if that always makes sense in a mental health context,” she said.
She added, “We need to make sure there are rigorous, evidence-based approaches, but that they are also flexible enough to meet the needs of an individual patient.”
Oversight would be beneficial but hard to enforce, said Adam Chekroud, co-founder and chief product officer at employee mental health platform Spring Health.
“Ultimately, in the long run, the companies that thrive will do so by showing that their product is real and delivers real value. Regulatory oversight would increase the pressure on companies to accelerate this trend. I think it would be a good thing for the space and an even better thing for patient safety,” he said.
In a recent blog post, Hunter Walk, partner at venture capital firm Homebrew, called for entrepreneurs in the mental wellness space to adopt their own version of the Hippocratic Oath. Investors in these companies should push for responsible growth and make sure patients are well served, he said. Companies should also have a plan for offboarding clients if the business doesn’t succeed, Walk said.
“There is a lot of saturation in the marketplace with companies offering very similar products. Eventually, these companies will have to differentiate themselves not based on claims but based on results,” Torous said. “As COVID-19 accelerates the demand for digital mental health services, companies will need to separate themselves based on who has high-quality evidence that it works.”