The AI That Disrupts Radiology Won’t Read X-Rays
Hinton Drops the AI-Bomb
Back in 2016, Geoffrey Hinton, one of the most prominent AI researchers, dropped a bomb at a Toronto conference on algorithmic pattern recognition in images. Speaking of new AI algorithms that could read x-rays and other medical images, he essentially predicted the demise of radiologists, the doctors who read those images:
“People should stop training radiologists now. It’s just completely obvious that within five years deep learning is going to do better than radiologists.”
— AI researcher Geoffrey Hinton, 2016
“Deep learning” is an AI method that dramatically improves pattern recognition compared with what came before. It’s why Alexa understands when you ask her to play your music, and why you can search your smartphone for photos of “dog” (or even “German shepherd dog”) and it just works.
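If you’re curious what that kind of pattern recognition looks like in practice, here’s a minimal sketch in Python using a publicly available pretrained image classifier. To be clear, this is illustrative only: Apple’s and Amazon’s production systems are proprietary, and the photo filename here is a stand-in.

```python
# A minimal sketch of deep-learning pattern recognition: score a photo
# with a network pretrained on ImageNet, whose categories include many
# dog breeds ("German shepherd" among them). Illustrative only.
import torch
from torchvision import models
from PIL import Image

weights = models.ResNet50_Weights.DEFAULT   # pretrained ImageNet weights
model = models.resnet50(weights=weights)
model.eval()

preprocess = weights.transforms()           # resize, crop, normalize

img = Image.open("photo.jpg").convert("RGB")  # hypothetical photo
batch = preprocess(img).unsqueeze(0)          # shape: (1, 3, 224, 224)

with torch.no_grad():
    probs = model(batch).softmax(dim=1)

top = probs.argmax().item()
print(weights.meta["categories"][top])      # e.g. "German shepherd"
```

The striking part is what’s missing: nobody wrote a rule describing what a dog looks like. The network learned the pattern from labeled examples, which is exactly the property that makes the same trick work on x-rays, or on sounds.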
Hinton’s remarks were as controversial as you might imagine. In a sense, though, Hinton himself may not have grasped the really revolutionary aspects of the “deep learning” technology he had helped to develop, falling prey to what I like to call “Jetsons thinking.”
Meet George Jetson
[Image: The Jetsons. Credit: Hanna-Barbera/Everett Collection]
If you’re not old enough to remember, The Jetsons was a cartoon about a family living in the future. The show premiered in 1962 (though new episodes were produced as late as the 1980s) and was set in 2062.
The Jetsons’ world was full of amazing technology: they lived in impossibly tall buildings, had a flying car, and their housekeeper was a robot.
As amazing and new as those things might have been, what was more revealing was what had not changed in 100 years: George and his wife Jane had two kids and a dog, George worked at a manufacturing company, and Jane . . . didn’t work. George’s family was pretty much exactly like a 1962 family, but with flying cars and taller houses. And robots.
The fallacy here is imagining that amazing new technologies won’t much change workflows, or social and organizational power structures. Jetsons thinking doesn’t predict, for example, that automobiles would mean more economic and social power for women, or that the printing press would mean widespread literacy and a push for democracy.
Likewise, Geoff Hinton imagined futuristic hospitals where doctors would still be ordering x-rays, for example to determine whether someone has a chest infection such as pneumonia. The only difference from today is that an AI reads the x-ray instead of a human radiologist. But assuming that doctors will always be at the center of healthcare is like imagining, in 1982, that the US Postal Service would be in charge of email in 2022.
It’s now becoming apparent that AI may change the organization of healthcare in much more fundamental ways. A great example of this is ResApp Health, a company out of Australia.
Skip the X-Ray
As a primary care doctor, when I see a patient with a cough and fever, and am considering a diagnosis of pneumonia, I reach for my trusty stethoscope — nearly unchanged in design for at least 100 years. The stethoscope lets me transmit the breath sounds from the chest to my ears, and then my brain tries to match the sounds I’m hearing to my internal database of breath sounds gathered over decades of practice.
What I’m doing is pattern recognition, just like a radiologist (or an AI) looking at an x-ray.
The smart folks at ResApp decided to apply the same basic AI techniques to develop an algorithm that could listen to your breathing and tell if you have pneumonia. And to do it on a smartphone.
With their app (currently in testing and available for telehealth trials), you provide some information and then the app listens to your cough. If your cough sounds like pneumonia, the diagnosis is made.
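For the technically curious, the general recipe behind this kind of tool is surprisingly compact: turn the audio into a spectrogram (essentially an image of the sound) and classify it with a neural network, much as a network classifies an x-ray. Here’s a bare-bones sketch in Python. ResApp’s actual model, features, and training data are proprietary, so everything below (the filename, the tiny network, the two classes) is a made-up stand-in, and this untrained network’s output is meaningless.

```python
# A sketch of the general technique: classify a cough recording by
# turning the audio into a mel spectrogram and scoring it with a small
# convolutional network. NOT ResApp's algorithm; a generic illustration.
import torch
import torch.nn as nn
import torchaudio

waveform, sample_rate = torchaudio.load("cough.wav")  # hypothetical recording

# A mel spectrogram is the standard image-like representation of sound.
to_mel = torchaudio.transforms.MelSpectrogram(sample_rate=sample_rate, n_mels=64)
spec = to_mel(waveform)            # shape: (channels, 64, time)
spec = spec.log1p().unsqueeze(0)   # log-compress; add a batch dimension

# A tiny CNN. In practice the weights would come from training on
# thousands of labeled coughs, not from random initialization.
model = nn.Sequential(
    nn.Conv2d(spec.shape[1], 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 2),              # two classes: pneumonia / not pneumonia
)
model.eval()

with torch.no_grad():
    probs = model(spec).softmax(dim=1)

print(f"P(pneumonia) = {probs[0, 1]:.2f}")  # untrained here, so meaningless
```

The hard part, of course, isn’t the code; it’s the clinical data needed to train the weights and the trials needed to validate the output against real diagnoses.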
Given that upwards of 20 million chest x-rays are done every year in the US, many of them to “rule out pneumonia,” you can begin to imagine the effects on radiology staffing, healthcare revenue, patient radiation exposure, and more that could follow from a smartphone app replacing a large share of those x-rays.
Healthcare to Selfcare
While ResApp is positioning its app as a tool for primary care doctors doing telehealth, there is no theoretical reason why, if it’s proven to work, it couldn’t be used by consumers themselves. If the app says “just a cold,” a parent with a sick child could skip the doctor visit entirely (whether in person or by telehealth). If the app says “pneumonia,” that might still mean some interaction with a practitioner, though it’s already been reported that the FDA is looking at allowing apps to prescribe drugs, so maybe an Amazon drone would just drop the antibiotics at your house.
And if a company like Apple, which is already building health features like heart-arrhythmia detection into its Watch, were to add ResApp as a basic feature, what would that do to our understanding and treatment of respiratory illness, and to its economics?
The point is that AI, by allowing anyone with a computer or smartphone to see patterns that until now required lengthy training and human expertise, probably won’t just give us “faster doctors” (or robot housekeepers). It will more likely fundamentally change how we maintain our health and how we diagnose and treat illness, shifting more power and control to consumers.