AI Developed to Analyze Medical Images Like Radiologists

Cardiff University

Input and expertise from radiologists can help develop better and more trustworthy artificial intelligence (AI) tools, new research shows.

The study used radiologists' eye movements to help guide AI systems to focus on the most clinically relevant areas of medical images.

When combined with other AI diagnostic systems, this guidance improves diagnostic performance by up to 1.5% and aligns machine behaviour more closely with expert human judgment, according to the team of researchers from Cardiff University and the University Hospital of Wales (UHW).
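The release does not spell out how the gaze-derived saliency maps were combined with the diagnostic models; the mechanism is detailed in the paper itself. Purely as an illustrative sketch of one common way such guidance can be wired into a classifier (not the study's published architecture), the Python snippet below re-weights a small convolutional network's feature maps using a saliency map as a spatial attention mask. The network design, class count and tensor sizes are all assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SaliencyGuidedClassifier(nn.Module):
    """Illustrative classifier whose features are re-weighted by a saliency map.

    This shows one generic way gaze-derived saliency can steer a diagnostic
    model toward clinically relevant regions; it is not the architecture
    reported in the study.
    """
    def __init__(self, num_classes=14):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, xray, saliency):
        feats = self.features(xray)                      # (B, 64, H/4, W/4)
        # Resize the saliency map to the feature resolution and use it as a mask.
        mask = F.interpolate(saliency, size=feats.shape[2:], mode="bilinear",
                             align_corners=False)
        attended = feats * (1.0 + mask)                  # emphasise salient regions
        pooled = attended.mean(dim=(2, 3))               # global average pooling
        return self.classifier(pooled)

# Illustrative forward pass with random stand-in tensors.
model = SaliencyGuidedClassifier()
xray = torch.rand(2, 1, 256, 256)
saliency = torch.rand(2, 1, 256, 256)   # e.g. output of a saliency predictor
logits = model(xray, saliency)
print(logits.shape)  # torch.Size([2, 14])
```

In a setup of this kind, the same classifier can be evaluated with and without the saliency input, which is one way a performance difference such as the reported 1.5% could be measured.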

Their findings, published in IEEE Transactions on Neural Networks and Learning Systems, could support decision making in diagnoses by radiologists and grow medical AI adoption to help meet some of the challenges facing the NHS.

Dr Richard White, a Consultant Radiologist at UHW and clinical lead on the study, said: "Computers are very good at identifying pathologies such as lung nodules based on their shape and texture. However, knowledge of where to look on imaging studies forms a key part of radiology training, and there are specific review areas that should always be evaluated.

"This research aims to bring these two aspects together to see if computers can evaluate chest radiographs more like a trained radiologist would.

"This is something that radiology AI research has previously lacked and a key step in improving trust in AI and the diagnostic capabilities of computers ."

The team created the largest and most reliable visual saliency dataset for chest X-rays to date – based on over 100,000 eye movements from 13 radiologists examining fewer than 200 chest X-rays.
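The release does not describe how the raw eye movements were converted into saliency maps for the dataset. As a generic, hypothetical illustration of the usual recipe (accumulate fixations, blur with a Gaussian roughly matching foveal extent, normalise), the sketch below builds a heatmap from a list of fixation points; the fixation format, duration weighting and blur width are assumptions, not details from the study.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fixations_to_saliency_map(fixations, image_shape, sigma=35.0):
    """Convert a list of (x, y, duration) fixations into a saliency heatmap.

    A generic illustration of the standard recipe; the exact procedure used
    to build the study's dataset may differ.
    """
    height, width = image_shape
    heatmap = np.zeros((height, width), dtype=np.float64)

    # Accumulate each fixation, weighted by how long the reader dwelt there.
    for x, y, duration in fixations:
        col = int(np.clip(round(x), 0, width - 1))
        row = int(np.clip(round(y), 0, height - 1))
        heatmap[row, col] += duration

    # Smooth the point map so each fixation covers a plausible foveal region.
    heatmap = gaussian_filter(heatmap, sigma=sigma)

    # Normalise to [0, 1] so maps from different readers are comparable.
    if heatmap.max() > 0:
        heatmap /= heatmap.max()
    return heatmap

# Example with made-up fixations on a 1024x1024 chest X-ray.
example_fixations = [(512, 300, 0.42), (530, 310, 0.31), (700, 620, 0.55)]
saliency = fixations_to_saliency_map(example_fixations, (1024, 1024))
print(saliency.shape, saliency.max())
```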

This was used to train a new AI model, CXRSalNet, to help it predict the areas in an X-ray that are most likely to be important for diagnosis.
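CXRSalNet's actual architecture and training procedure are described in the published paper. As a minimal, hypothetical sketch of what training a saliency predictor on gaze heatmaps can look like in practice, the PyTorch snippet below fits a small encoder-decoder with a KL-divergence loss of the kind commonly used in saliency modelling; the network design, loss choice and random stand-in data are illustrative assumptions, not the published model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySaliencyNet(nn.Module):
    """Hypothetical stand-in for a saliency predictor such as CXRSalNet.

    Takes a single-channel chest X-ray and outputs a same-sized map whose
    values sum to 1 (a spatial distribution over predicted fixation density).
    """
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1),
        )

    def forward(self, x):
        logits = self.decoder(self.encoder(x))
        b, c, h, w = logits.shape
        # Softmax over all pixels so the output is a valid spatial distribution.
        return F.softmax(logits.view(b, -1), dim=1).view(b, c, h, w)

def saliency_kl_loss(pred, target, eps=1e-8):
    """KL divergence between the ground-truth gaze map and the prediction."""
    target = target / (target.sum(dim=(2, 3), keepdim=True) + eps)
    return (target * torch.log((target + eps) / (pred + eps))).sum(dim=(2, 3)).mean()

# One illustrative optimisation step on random tensors standing in for data.
model = TinySaliencyNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
xray = torch.rand(4, 1, 256, 256)        # batch of chest X-rays
gaze_map = torch.rand(4, 1, 256, 256)    # eye-tracking heatmaps (targets)

optimizer.zero_grad()
loss = saliency_kl_loss(model(xray), gaze_map)
loss.backward()
optimizer.step()
print(float(loss))
```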

Professor Hantao Liu, lead researcher on the study from Cardiff University's School of Computer Science and Informatics, added: "Current AI systems lack the ability to explain how or why they arrive at a decision – something that is critical in healthcare.

"Meanwhile, radiologists bring years of experience and subtle perceptual skills to each image they review. Our work captures how experienced radiologists naturally focus their attention on important parts of chest X-rays. We used this eye-tracking data to "teach" AI to identify important features in chest X-rays."

"By mimicking where radiologists look when making diagnoses in this way, we can help AI systems learn to interpret images more like a human expert would," said Professor Liu, who is Professor of Human-Centric Artificial Intelligence.

Wales has a 32% shortfall in consultant radiologists and the figure for the UK stands at 29%, according to the 2024 census by the Royal College of Radiologists.

Meanwhile demand for imaging is rising significantly.

"There are similar problems across much of the world," explains Dr White.

"If we can implement solutions such as this into clinical practice, it has the potential to considerably enhance radiological workflow and minimise delays in care as a consequence of reporting backlogs."

The team plans to further develop their approach to explore how the technology can be adapted for medical education and training, and for clinical decision support tools that assist radiologists in making faster and more accurate diagnoses.

"Our current focus is expanding this approach to work across other imaging modalities such as CT and MRI scans," added Professor Liu.

"In particular, we're interested in applying this methodology to cancer detection, where early identification of subtle visual cues is critical and often challenging for both human readers and machines."

Their paper, 'Chest X-Ray Visual Saliency Modeling: Eye-Tracking Dataset and Saliency Prediction Model', is published in IEEE Transactions on Neural Networks and Learning Systems.
