Finally! That was my first thought when app store Happtique announced last week that it planned not only to sell health apps, but also to certify their efficacy. It's something I've written about before, and a process I'm excited to see play out.
FierceMobileHealthcare talked with three of the four individuals enlisted to create the certification program for Happtique--Howard Luks, an orthopedic surgeon and professor at New York Medical College; Shuvo Roy, the biomedical microdevices director at the University of California, San Francisco; and Dave deBronkart, a health blogger and the group's consumer-facing specialist. They're scheduled to hold their first meeting in two to three weeks to sketch out the initial shape of the program, which is due to Happtique by July.
To start, I asked if the program will provide a simple seal of approval, or actually deliver granular grading or rating information. The answer, I was happy to hear, was the latter. Luks said the idea is to provide a grading system that will be attached to all apps in the Happtique store, showing where an app performed well, and where it fell short.
Next, I asked if the program will test actual products, with physicians or tech specialists getting hands-on with individual apps, or if it will certify the design and development process behind them. It's a question all three panelists have been pondering, though none has reached a clear answer yet. Luks and Roy did acknowledge that true product testing could be a herculean undertaking, with tens of thousands of health apps already on the market and more emerging every day.
However, Happtique founder Corey Ackerman told GigaOM last week that he envisioned "a set of standards for apps judged by actual doctors who treat that issue. For example, oncologists won't review diabetes apps." So the scales might be tipping toward a program that actually test-drives apps before certifying them.
As for exactly what the panel will certify, the panelists said those decisions won't be made until at least their first meeting. They indicated, though, that they're looking to include criteria for app quality, reliability, usability, consumer engagement, value to the user, cost, simplicity, and interoperability.
The panelists were adamant about having a strong clinical element to evaluate the medical viability of apps. "We're looking for evidence-based medicine, proven algorithms, mobile health guides that offer the patient or the enterprise realistic guidance, and realistic, actionable information," Luks said.
Reliability was a key concern as well. The program will need to carefully examine false-positive and false-negative rates, incorrect data collection, mistakes in algorithms, and even outright app failures, Roy said.
Security, too, was on the panelists' minds. The program will need to assess core security measures such as encryption and password protection, but also evaluate an app's vulnerability to malware or spyware, along with the security of any stored data.
The program may even go so far as to vet the interoperability of apps, determining which can be used on different devices and platforms, and possibly how they interface with different downstream systems such as EHRs, according to Roy.
What's more, the panelists indicated that they also want to include criteria to weed out apps with any significant conflicts of interest. For example, apps from pharmaceutical companies that drive users to their newest drugs would get a big red flag.
Still, while the panelists had strong feelings about many of these criteria, they stressed that the process is still wide open, and that none of these particulars will be nailed down until they've had a few meetings.
Ultimately, I see a huge upside to an app certification program for the healthcare industry. But I also see a huge challenge facing this panel, one they'll be hard-pressed to clear in the six-month window they've been given. I'll certainly be keeping up with our panelists in the coming months to see how things progress. - Sara