AI fails to pass radiology qualifying examination

AI candidate could not outperform radiologists, but further training may improve results, say researchers

Artificial intelligence (AI) is currently unable to pass one of the qualifying radiology examinations, suggesting that this promising technology is not yet ready to replace doctors, finds a study in the Christmas issue of The BMJ.

AI is increasingly being used for some tasks that doctors do, such as interpreting radiographs (x-rays and scans) to help diagnose a range of conditions.

But can AI pass the Fellowship of the Royal College of Radiologists (FRCR) examination, which UK trainees must do to qualify as radiology consultants?

To find out, researchers compared the performance of a commercially available AI tool with that of 26 radiologists (mostly aged between 31 and 40 years; 62% female), all of whom had passed the FRCR exam the previous year.

They developed 10 'mock' rapid reporting exams, based on one of the three modules that make up the qualifying FRCR examination, which is designed to test candidates' speed and accuracy.

Each mock exam consisted of 30 radiographs at a level of difficulty and breadth of knowledge equal to or greater than that expected in the real FRCR exam. To pass, candidates had to correctly interpret at least 27 (90%) of the 30 images within 35 minutes.

The AI candidate had been trained to assess chest and bone (musculoskeletal) radiographs for several conditions including fractures, swollen and dislocated joints, and collapsed lungs.
