A Letter to the Media: Can We Please Get Real About AI in Radiology?
This morning, on my drive to work, I was listening to a well-known finance podcast. The host was chatting about branding, marketing, and artificial intelligence—then dropped this little gem (paraphrased):
“Look at radiology. You can now get AI to read your X-ray with over 95% accuracy, and it’s free. We don’t need radiologists anymore.”
As a radiologist who just finished a late shift reading those very X-rays, I nearly swerved off the road. Not because I was offended (okay, maybe a little), but because this statement is so confidently wrong, it’s almost impressive.
Let’s break it down:
First, I am a radiologist. I read X-rays, CTs, MRIs, ultrasounds—sometimes until midnight. If there were an AI that could do my job for free with over 95% accuracy, I promise you: I’d be asleep a lot earlier.
Second, neither I nor any of my radiologist colleagues have ever seen this magical free 95%-accurate tool. If it exists, it’s hiding better than Waldo. And if it ever did exist, trust me, it wouldn’t be free—it would cost as much as humanly (or artificially) possible.
Third, what do people even mean when they say “read an X-ray”? Which X-ray? A foot? A chest? Is the AI detecting a subtle fracture or just playing radiology bingo? Reading one specific pattern isn’t the same as interpreting a complex image with a dozen potential findings.
Fourth, why is a marketing expert—with zero experience reading X-rays, providing medical care, or probably even pronouncing “scaphoid”—so confident in their radiology hot take?
Now, full disclosure: I do use AI. On my last shift, I used several AI tools. None could independently read a single study. Most did a decent—but far from dazzling—job at their one specific task.
For example, one program is designed to flag lung nodules on CT scans. It helpfully pointed out a couple I might have missed... except I had already seen other nodules in those cases, so the AI didn't actually change anything. In return, it cost me time—because the AI doesn't just mark things on my regular images. No, it creates a whole new set of images I have to open, review, and interpret separately. It also can't tell me whether a nodule is stable compared with any available prior CT scan, which is a critical and time-consuming part of the job, and it offers no recommendation on appropriate next steps for any nodule it detects. Super helpful.
Worse, for every “maybe helpful” nodule, I had to scroll through 10–20 false positives. Some “nodules” turned out to be pneumonia. Others were surgical scarring. Some were just the lung being its weird, normal self.
There’s also an AI that spots fractures on X-rays. It’s occasionally useful—unless the fracture is in a rib. Then the program shrugs and steps aside, because ribs weren’t in its training. And once again, I had to delay my work just to wait for the AI to finish running, open a new report, and cross-reference what I had already seen with my own eyes. Fun!
Was it nice to see that the AI agreed with me that a bone wasn’t broken? Sure. But was it worth the delay and extra work? Not really. Especially when it didn’t change a single patient’s care.
Let’s talk mammograms—I’m a breast radiologist too. No AI vendor out there is willing to have their product actually read a mammogram independently. The best they offer is a risk score. Not a probability, not a diagnosis. Just an arbitrary number between 0 and 100 that says: “Hey, this image might have something sketchy.”
Helpful? Occasionally. Game-changing? Not really. It’s still 100% up to me to decide what to do about it. In some cases, high scores are false alarms. Low scores might miss cancer. So, in the end, radiologists are still making the calls.
Yet if you listen to the media, AI is already revolutionizing radiology and putting me out of a job. Let me assure you: hospitals would love for that to be true. Radiologists are in short supply. But it’s not true. Not even close.
So, here’s my friendly PSA to the public and the press:
AI is not “replacing” radiologists. It’s barely assisting us.
We’re not anti-AI—we’d love to have great tools that make us faster and better. But the current reality? AI in radiology is like a well-meaning intern who occasionally finds something useful but needs a lot of supervision and takes more time than they save.
To be fair, I do use AI in one area that’s helpful: grammar. If this article sounds polished, you can thank a large language model for cleaning up my tired late-night ramblings.
But AI reading your X-ray for free? Not happening.
From someone inside the reading room, I can tell you: the “AI Radiologist” of legend isn’t here yet. And until it shows up, I’ll be here—with my Coke Zero, my multi-monitored workstation, and a long list of studies to interpret.
Do you want to contribute to The Radiology Review Journal?
The article submission process is simple: email your proposed article to theradiologyreview@gmail.com, including your name and professional affiliation. Your best writing is welcome, with no specific word limit or formatting requirements. If your article includes material for which references are appropriate, or discusses websites, please provide a reference section at the end of the article in any reasonable format. Every submission is appreciated, but submission does not guarantee publication.