Radiology AI’s Blind Spot: The Simple Tools We Most Need

As we stand on what many call the precipice of generative AI, I keep asking myself one simple question: how is it possible, with all this AI capability around us, that some of the most basic tools radiologists have wanted for years still don’t exist or are not available for widespread clinical use?

I’ve written and podcasted plenty about AI in radiology, but this time I want to focus on one simple item at the top of my wishlist — a feature that should already be standard in every PACS workstation: a one-click lesion measurement tool.

Imagine this: a radiologist identifies a lesion, clicks on it, and the AI measures its size in two dimensions. Done. Simple, right? Radiologists have been talking about this for at least a decade. I remember hearing it in residency: “Someday this will exist.”

Well, where the heck is it?

The Reality

Some versions of such tools do exist — often hidden away in specialized systems — but they’re not integrated into the clinical workflow. If I can’t access a tool directly in PACS, it’s essentially useless to me.

Every modern PACS should support one-click measuring in 2025. I should be able to click, get a measurement, and move on. If I need to adjust a measurement for accuracy, fine. Even better would be if the measurement were automatically inserted into my report, ideally in the correct section using AI-assisted report mapping. I’d even settle for hitting a key or issuing a quick voice command to insert it.
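To be fair, the geometry itself is the easy part. As a rough illustration (my own sketch, not any vendor’s algorithm), here is how a bidimensional measurement might be computed in Python once a hypothetical click-to-segment model has produced a binary lesion mask; the function, the toy mask, and the spacing values are all made up for the example:

```python
import numpy as np
from scipy.spatial import ConvexHull


def bidimensional_measurement(mask, spacing_mm):
    """Return (long_axis_mm, short_axis_mm) for a 2-D binary lesion mask.

    Long axis:  greatest distance between any two lesion pixels.
    Short axis: widest extent perpendicular to the long-axis direction.
    """
    ys, xs = np.nonzero(mask)
    if ys.size < 3:
        raise ValueError("Mask too small to measure.")

    # Convert pixel indices to physical coordinates in millimeters
    # (row spacing first, column spacing second, as in DICOM PixelSpacing).
    pts = np.column_stack([ys * spacing_mm[0], xs * spacing_mm[1]])

    # Extremes of any linear measurement lie on the convex hull, so searching
    # hull vertices is enough and far cheaper than checking every pixel pair.
    hull = pts[ConvexHull(pts).vertices]

    # Long axis: maximum pairwise distance among hull vertices.
    dists = np.linalg.norm(hull[:, None, :] - hull[None, :, :], axis=-1)
    i, j = np.unravel_index(np.argmax(dists), dists.shape)
    long_axis = dists[i, j]

    # Short axis: spread of the hull projected onto the direction
    # perpendicular to the long axis.
    direction = (hull[j] - hull[i]) / long_axis
    perpendicular = np.array([-direction[1], direction[0]])
    projection = hull @ perpendicular
    short_axis = projection.max() - projection.min()

    return float(long_axis), float(short_axis)


# Hypothetical usage: the mask would come from a click-to-segment model and
# the spacing from the slice's DICOM PixelSpacing attribute.
toy_mask = np.zeros((64, 64), dtype=bool)
toy_mask[20:40, 25:50] = True  # a fake rectangular "lesion" for illustration
print(bidimensional_measurement(toy_mask, (0.7, 0.7)))
```

The hard part is everything around that snippet: segmenting the lesion reliably when the radiologist clicks, validating the result across scanners and modalities, and getting the number into PACS and into the report.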

So why doesn’t such a straightforward, high-impact system — one that would immediately improve workflow for radiologists and hospitals alike — exist? The answer lies not in the technology, but in implementation failure.

Regulatory Bottlenecks

Any tool that influences diagnosis or quantitative assessment qualifies as a medical device under FDA regulation, even something as simple as measuring a lesion’s diameter. Vendors must secure costly FDA clearance — a slow process requiring validation across scanners, protocols, and patient populations.

It’s not a bad requirement; bad measurement tools can harm patients. But it’s expensive, and AI companies may not see a viable way to monetize something so basic. So, they don’t build it.

Integration Barriers

Radiologists need AI tools that work seamlessly inside their PACS — not separate programs, pop-ups, or side viewers. But PACS vendors guard their ecosystems with proprietary architectures, charge for integrations, and may resist third-party tools that aren’t home-grown or fully vetted.

The result: even simple AI features require complex engineering, vendor cooperation, and data-security approvals before they can live in the PACS systems where radiologists actually work. Until that happens, “one-click” tools that live directly within PACS remain hypothetical.

Economic Disincentives

If there were a reimbursement code for lesion measurement automation, these systems would already exist. But there isn’t. That means radiology groups would have to pay for them out of clinical revenue.

Would the efficiency gains justify some cost? Yes — if the tool were implemented well. But without seamless PACS integration, the benefits vanish, and the incentive disappears with them.

The Technical Challenge

Lesions aren’t uniform. They can be cystic, solid, enhancing, or mixed — and they appear differently across organs and modalities (CT, MRI, ultrasound, PET, angiography, etc.). They can have distinct borders and indistinct borders. They can be bright or dark, big or small, benign or malignant, real or a fake-out.

Building a single AI model that accurately measures anything a radiologist wants to measure is no small feat. While current algorithms can produce measurements, proving they’re reproducible, accurate, and safe across imaging techniques and hardware is difficult — and essential for FDA clearance.

Even if AI can make the measurement, that doesn’t guarantee the measurement is correct, and regulators will hold it to that standard. And if it’s wrong and the radiologist uses it anyway, who’s liable: the vendor or the radiologist? Legal risk may discourage companies from releasing such tools.

Missing the Radiologist’s Voice

Finally — and maybe most importantly — too many AI companies building for radiology aren’t truly listening to radiologists. Their teams are brilliant in computer science, math, and engineering, but often lack deep clinical insight into daily workflow pain points.

Who understands radiology better than a radiologist? If developers partnered more closely with clinicians who live inside PACS every day, we’d already have the tools that matter most — like a one-click lesion measurement system.

In the End

Radiologists are, at their core, measurers. Many of us spend hundreds of hours each year quantifying lesions. If AI could help us measure faster with accuracy equal to a human radiologist, that would be transformative. And if it could measure even more precisely — with greater reproducibility over time — patient care would meaningfully improve.

While various AI measurement tools now exist for specific applications — and I use several regularly — their impact on efficiency has been mixed. Take, for example, AI software designed to automatically detect and measure pulmonary nodules, albeit through a separate pop-up window that requires me to look at even more images per chest CT study. The initial excitement was understandable, but in practice the promised time savings have not materialized, and the accuracy has been underwhelming. Too often, AI tools in radiology overpromise and underdeliver.

I’m still waiting for an AI application that truly wows me — one that can handle the most fundamental tasks we perform, in this case simply placing a ruler on an imaging finding.

So, while AI ethicists and policy makers debate the implications of imminent artificial general intelligence, I find myself reflecting on a simpler reality — that helpful, practical AI is absent for many of the most basic tasks radiologists perform.

It’s remarkable that, as we approach 2026, a true one-click measurement tool still feels like science fiction to many of us. I can already anticipate the comments from AI companies: “Check out what we’ve developed!” And please — prove me wrong. But experience tells me I’ll likely be disappointed, because whatever is in development probably won’t seamlessly integrate where I actually need it: the PACS on my clinical workstation. And when it does arrive, it will likely be overcomplicated, overdesigned, and overmarketed — unable to deliver the simple, seamless measurement function radiologists have been dreaming of for years.

Is it too much to hope for such an elegant, useful system? Perhaps. But the strangest part is that I’m still hoping for it as we head into 2026 — not 2016.

Check out other Radiology Review Insider Articles on Artificial Intelligence:

Will AI Replace Radiologists?

Can we get Real about AI in Radiology?

AI in Breast Imaging: Hype, Hope, or Hazard?

The Radiology Review may receive commissions from any purchases made through links on this page.
