It’s been six years since the Food and Drug Administration said it would begin developing guidance for clinical decision support (CDS) software.
Now that the guidance is finally here, some industry experts are feeling slightly underwhelmed. For Bradley Merrill Thompson, a medical device attorney with Epstein Becker & Green in Washington, D.C., the FDA’s guidance “stopped at the edge” of the regulatory framework outlined in the 21st Century Cures Act, which carves out an oversight exemption for software that can be independently reviewed by physicians.
“I’m afraid I was expecting much more,” Thompson wrote in an email to FierceHealthcare. “I was expecting a guidance that focuses on risk, rather than simply explaining the statutory language in slightly greater detail.”
That reluctance to embrace a risk-based approach to CDS software is Thompson’s biggest complaint. He argues that software that predicts the likelihood of a migraine, for example, poses far less risk to patient safety than a tool that recommends a specific chemotherapy treatment, which could be a life-or-death decision.
Instead, in its draft recommendations issued on Thursday, the agency fell in line with the approach outlined in Cures, devoting most of its focus to the transparency of CDS software and the ability of physicians and patients to understand the basis for the software's recommendations.
“What I think many of us in industry were hoping for was an effort by FDA to distinguish high from low risk as a basis for regulation,” he wrote. “We didn’t get that. Worse, it appears based on the guidance that FDA is not interested in drawing that line.”
For his part, FDA Commissioner Scott Gottlieb, M.D., believes the guidelines will spur innovation among developers. During a hearing before the Senate Committee on Health, Education, Labor and Pensions on Thursday, Gottlieb said “regulatory ambiguity” has stifled the use of CDS tools in healthcare. The new guidelines, he argued, would create “some really bright lines and parameters” for FDA oversight and retain the agency's right to exercise enforcement discretion.
“I’m hopeful we’re going to see more innovation in this space,” he said. “Tools that could sit on top of the electronic health record, for example, and help physicians make decisions from that information.”
One nagging concern is that the guidance fails to consider the next generation of medical software that integrates machine learning and artificial intelligence. In September, the Clinical Decision Support Coalition finalized guidelines designed to provide a pathway for software that uses complex algorithms to reach the market without going through FDA approval. But Thompson says the FDA “completely ignored the topic,” adding that the idea that the FDA will regulate all such software regardless of risk is “extremely troublesome.”
Bethany Hills, chair of Mintz Levin's FDA practice, echoed the need for additional guidance on how the agency plans to regulate artificial intelligence, pointing out the questions that are bound to arise when AI’s “black box” of information fails to meet the FDA’s focus on transparency.
But she also praised Bakul Patel, the associate director for digital health at the FDA’s Center for Devices and Radiological Health, for “leading a very consistent march toward rational oversight” of digital health products.
“Certainly just the fact that there is guidance out there immediately changes the tone of everything,” she said. “Now we know where the goal posts are and we can try and get between them. That provides the industry with more certainty, particularly investors.”
One way the FDA did show a willingness to push beyond the boundaries of the Cures Act was by incorporating patient decision support tools into its CDS guidance. Hills called this an unexpected but particularly helpful addition given the influx of new companies building technology to support patients making healthcare decisions.
“I have a whole stack of clients I’m in the process of reaching out to that are much more geared toward patient and consumer markets,” she said.