Industry coalition finalizes clinical decision support guidelines aimed at self-regulation

Finalized guidelines from the CDS Coalition provide a baseline for self-regulating machine learning technology.

A coalition of organizations interested in making or using clinical decision support software has finalized voluntary guidelines aimed at self-regulating new technology that uses data and machine learning to assist physicians with more accurate diagnoses.

The Clinical Decision Support Coalition, led by Bradley Merrill Thompson, a medical device attorney with Epstein Becker Green, finalized the draft guidelines for CDS software it released in May after reviewing public comments. Although the group made several tweaks based on the comments it received, the guidelines' primary emphasis remains on ensuring that CDS software is transparent, is intended for a competent user, such as a primary care physician or a specialist, and provides sufficient time for the user to reflect on its recommendations.

RELATED: Increasingly powerful AI systems are accompanied by an 'unanswerable' question

There are particular concerns that the complexity of machine learning and artificial intelligence software creates a "black box" of deep learning that is difficult to understand or explain. A neural network can offer a diagnosis, but the software can't say how it reached that conclusion. The Department of Defense has invested $75 million in trying to develop more transparent AI techniques.

The guidelines attempt to tackle that issue, outlining some steps developers can take to enhance transparency by “explaining what can be explained" and revealing data sources when possible.

RELATED: FDA unveils precertification pilot program for digital health technology, maps out upcoming guidance

Under the 21st Century Cures Act, certain CDS software falls outside of the FDA's jurisdiction, and the agency plans to release draft guidance delineating which types of CDS software fall outside its regulatory scope. Thompson believes it's up to the industry to self-regulate machine learning software to ensure it remains outside of FDA oversight.

“The legislation provides an avenue for FDA to clawback into regulated territory any software the Agency finds may lead to serious injury or death in patients,” Thompson wrote in a LinkedIn post. “If industry does an adequate job of self-regulating and therefore avoiding patient injury, we can reduce the likelihood that FDA will need to expand the scope of its regulation.”

Thompson added that the CDS Coalition is only a temporary steward of the guidelines, and the group is actively seeking an organization that can update them regularly moving forward.