Digital health applications have grown significantly in both number and ambition over the last several years. Yet despite the promises these companies make, few have any real evidence of their clinical effectiveness.
While companies do regularly perform studies on their apps, these studies rarely use randomized controlled trials, rely on small study populations and generally focus on healthier individuals, according to a study published in Health Affairs Monday.
"We had a fairly high bar for demonstrating evidence—but in many ways in medicine it's the standard bar, which is clinical trials of sorts—and right now our sense is that the digital health community doesn't really have to meet that bar, particularly when they're going direct to consumer," Adam Cohen, one of the authors of the study, told FierceHealthcare in an interview.
The main thing the authors were looking for in these apps was evidence of clinical effectiveness; that is, demonstrable improvement in a relevant health condition. So far, digital health firms have mostly stopped at validating their own measurements.
A company may be able to prove, for example, that an activity tracker can accurately count steps. But that's a far cry from proving it can help a patient struggling with their weight.
"You know, okay, it tracks steps or it measures heart rate, but does it prevent obesity? Or does it reduce obesity? Or does it prevent events or conditions related to obesity, like stroke or diabetes and so forth? And that sort of impact—we didn't find a ton of evidence for that in our selected cohort," Cohen said.
Despite the lack of studies so far, the authors were optimistic that companies would begin testing for clinical effectiveness in the coming years. While digital healthcare companies clearly haven't entered that phase just yet, they have expressed interest in running clinical trials and helping high-burden populations, the study said.
Furthermore, if they want to move beyond the direct-to-consumer model and enter the formal healthcare space, digital health companies will need to better demonstrate their effectiveness. Clinicians demand far more proof than consumers do, said Kyan Safavi, another author of the study, so companies will need to step up their studies if they want to convince providers.
Meanwhile, some aspects of digital health products show real potential in healthcare—which may explain why up to 80% of Americans have used one.
"There are some features of digital health that really lend themselves to optimism. For example, the ability of digital health products to go from development to market quickly, and to iterate quickly improvements to the product, and at low cost, is very helpful. And it stands in contrast to typical medical devices and pharmaceuticals," Safavi told FierceHealthcare.
Ultimately, the authors said it would be up to companies to take the lead on improving clinical trials, but they did have some policy recommendations as well.
Through regulation, policymakers can make it "more obvious for companies" what kinds of clinical proof they need to provide. The authors also recommended that the government provide financial incentives if it sees sufficient need in the space.
"For products that have general use for a practice or hospital that increases coordination of care, communication and patient engagement across many patients or populations, then maybe a direct financial incentive from the government—similar to the one they use for increased adoption of electronic health records, specifically meaningful use—could be helpful," Safavi said.