AI system integrates eye gaze data and radiology reports

A “collaborative AI system” that integrates eye gaze data and radiology reports may be able to correct perceptual errors when radiologists interpret chest x-rays, according to a study published July 16 in Radiology: Artificial Intelligence.

Researchers at the University of Houston named the system “CoRaX” (Collaborative Radiology Expert) and say the model has the potential to provide real-time feedback.

“CoRaX is unique because of its collaborative nature and the combination of eye gaze with report and image to better understand and assist radiologists, unlike most existing standalone systems,” said lead author Akash Awasthi, a doctoral student, in a news release from RSNA.

A key challenge in radiology is the prevalence of perceptual errors, which significantly impact diagnostic accuracy. These errors occur when radiologists fail to detect or correctly interpret abnormalities due to visual oversights during the initial interpretation, the authors explained. 

While the causes of perceptual errors and potential manual remedies have been studied, no AI-based models have yet been developed that address these errors based on radiologists' individual visual search patterns, they added.

To that end, the researchers combined the public REFLACX and EGDCXR eye tracking and eye gaze datasets with ChexFormer, a multilabel transformer classifier designed to predict multiple abnormality labels for a given chest x-ray image. The system also includes a Spatio-Temporal Abnormal Region Extractor (STARE) module, which processes eye gaze fixations.
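The multilabel stage can be illustrated with a minimal sketch in plain Python. This is not the actual ChexFormer implementation; the function and label names are stand-ins, and the logits would in practice come from a transformer over image features. The key idea is that each finding gets an independent sigmoid, so several abnormalities can be flagged on one chest x-ray:

```python
import math

# Stand-in label set; the study's datasets cover five abnormalities.
LABELS = ["cardiomegaly", "edema", "atelectasis", "pleural_effusion", "lung_opacity"]

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def multilabel_predict(logits, threshold=0.5):
    """Multilabel decision: an independent sigmoid per finding,
    keeping every finding whose probability clears the threshold."""
    probs = {label: sigmoid(z) for label, z in zip(LABELS, logits)}
    return {label: p for label, p in probs.items() if p >= threshold}

# Example logits (hypothetical per-label outputs from a classifier)
print(multilabel_predict([2.1, -1.3, 0.4, -2.0, 1.8]))
```

With these example logits, three of the five findings clear the 0.5 threshold, which is exactly the behavior a single-label (softmax) classifier could not express.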

In brief, the system is designed to function as a virtual second reader: radiologists submit radiographic images, reports, and eye gaze data, and CoRaX then generates referrals for further assessment.
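The second-reader workflow can be sketched as follows. This is a simplified illustration under assumed logic, not the published system: the class and function names are hypothetical, and the rule shown (refer a model-predicted finding that is absent from the report, flagging regions the reader barely fixated) is one plausible way such referrals could be formed.

```python
from dataclasses import dataclass, field

@dataclass
class Submission:
    # Inputs the radiologist provides to the collaborative system
    image_id: str
    reported_findings: set                               # findings named in the report
    fixated_regions: dict = field(default_factory=dict)  # finding region -> dwell time (s)

def second_read(submission, model_findings, min_dwell=0.5):
    """Hypothetical second-reader rule: refer any model-predicted finding
    missing from the report, noting whether its region was under-fixated
    (a possible perceptual miss rather than an interpretive one)."""
    referrals = []
    for finding in model_findings - submission.reported_findings:
        dwell = submission.fixated_regions.get(finding, 0.0)
        referrals.append({"finding": finding, "under_fixated": dwell < min_dwell})
    return referrals

case = Submission("cxr_001", {"edema"}, {"cardiomegaly": 0.2})
print(second_read(case, {"edema", "cardiomegaly"}))
```

Here the report mentions only edema while the model also predicts cardiomegaly, so the sketch emits one referral and marks it under-fixated, since the reader dwelled on that region for less than the assumed 0.5-second cutoff.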

Figure: An overview of the collaborative system, CoRaX, which integrates radiology reports, eye gaze data, and chest x-rays (CXR) to offer targeted recommendations that the radiologist then accepts or rejects. Image courtesy of Radiology: Artificial Intelligence.
In the study, the group evaluated CoRaX using two simulated error datasets: a random masking-based error dataset and an uncertainty-masking-based error dataset. The datasets featured five abnormalities: cardiomegaly, edema, atelectasis, pleural effusion, and lung opacity. The researchers also introduced a metric called the “Interaction score,” which indicates the diagnostic accuracy of each interaction between the system and radiologists and measures whether any perceptual errors were overlooked.

According to the analysis, the system corrected 21.3% (71/332) of errors in the random masking dataset and 34.6% (115/332) in the uncertainty masking dataset, with particularly strong performance in identifying missed cardiomegaly cases, the researchers reported. 

In addition, it achieved a mean Interpretable Referral Accuracy score of 63% on the random masking dataset and 58% on the uncertainty masking dataset. Finally, CoRaX provided diagnostic aid in 85.7% (240/280) and 78.4% (233/297) of interactions in the respective datasets, as measured by the Interaction score.

“The CoRaX system can collaborate efficiently with radiologists and address perceptual errors across various abnormalities in chest radiographs,” the group wrote. 

Ultimately, the current study focused primarily on technical system development, with direct real-world validation with radiologists remaining a future goal, Awasthi and colleagues wrote. Nonetheless, CoRaX shows promise, they added.

“The modular architecture enables future enhancements, such as replacing the multilabel classifier (e.g., ChexFormer) with more advanced models. This approach lays the groundwork for robust, error-resistant AI systems and paves the way for future clinical trials and broader adoption,” they concluded. 

