Artificial intelligence is poised to take over parts of the medical industry, particularly image-centric specialties like radiology and dermatology, thanks to neural networks loosely modeled on the human brain.
Understanding exactly how those systems diagnose diseases isn’t as easy.
A study published in January found that an artificial-intelligence system diagnosed skin cancer as accurately as 21 board-certified dermatologists, raising the prospect that machines’ diagnostic capabilities could soon surpass those of humans.
Sebastian Thrun, one of the study’s authors and a computer science professor at Stanford University, told the New Yorker that its success depended on building the neural network atop a basic, preexisting skill: recognizing ordinary, unrelated images. Much as a child who can already tell a dog from a cat picks up new categories faster, a network pretrained on general images can be fine-tuned for a specialized task far more quickly, offering seemingly boundless potential within the healthcare industry.
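The pretrain-then-specialize idea can be illustrated with a minimal NumPy sketch. This is not the Stanford team’s actual model: the frozen random projection below stands in for a deep network pretrained on general images, and the “lesion” data and labels are synthetic. Only a small linear head is trained for the new task, which is the essence of fine-tuning.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained feature extractor. In a real system this
# would be a deep CNN trained on millions of general images; here it is
# just a frozen random projection (hypothetical weights).
W_frozen = 0.1 * rng.normal(size=(64, 16))

def extract_features(x):
    # Frozen layer: its weights are NOT updated during fine-tuning.
    return np.tanh(x @ W_frozen)

# Synthetic "medical image" dataset: 200 samples, 64 raw inputs each.
X = rng.normal(size=(200, 64))
F = extract_features(X)
# Synthetic labels, made learnable from the frozen features.
y = (F[:, 0] > 0).astype(float)

# Fine-tune only a small logistic-regression head on top of the
# frozen features, via plain gradient descent.
w = np.zeros(16)
b = 0.0
lr = 0.5
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))  # sigmoid predictions
    grad_w = F.T @ (p - y) / len(y)          # logistic-loss gradient
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

preds = (1.0 / (1.0 + np.exp(-(F @ w + b)))) > 0.5
acc = np.mean(preds == (y == 1))
print(f"training accuracy: {acc:.2f}")
```

Because the expensive representation-learning step is reused rather than redone, only the 17 head parameters are trained here, which is why specializing a pretrained network needs far less data and time than training from scratch.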
But the deep neural networks that make AI so powerful also make it hard to interpret. Much as a radiologist might lean on intuition built over years of experience, a deep-learning system reaches its conclusions inside a “black box,” leaving little trace of the reasoning behind its decisions.
“The more powerful the deep-learning system becomes, the more opaque it can become,” Geoffrey Hinton, a computer scientist at the University of Toronto, told the New Yorker. “As more features are extracted, the diagnosis becomes increasingly accurate. Why these features were extracted out of millions of other features, however, remains an unanswerable question.”
Hinton, who has studied deep learning for decades, added that neural networks and machine learning are on track to outpace radiologists in making accurate, speedy diagnoses. In December, two researchers argued that radiology and pathology should merge into a single role, the “information specialist,” to make room for AI advancements that can quickly identify fractures or other abnormalities.
“I think that if you work as a radiologist you are like Wile E. Coyote in the cartoon,” Hinton said. “You’re already over the edge of the cliff, but you haven’t yet looked down. There’s no ground underneath.”