The impact of explainable AI on non-expert X-ray interpreters

Scientific Reports (Nature Portfolio) - January 2023

Susanne Gaube, Harini Suresh, Martina Raue, Eva Lermer, Timo K. Koch, Matthias F. C. Hudecek, Alun D. Ackery, Samir C. Grover, Joseph F. Coughlin, Dieter Frey, Felipe C. Kitamura, Marzyeh Ghassemi & Errol Colak

  • Background: The authors investigated how the explainability and source of AI-generated advice affect physicians’ decision-making when reviewing chest X-rays. They hypothesized that providing visual annotations and labeling the advice as coming from an AI system would improve diagnostic accuracy, advice quality ratings, and confidence in the diagnosis.

  • Methods: The authors conducted an online experiment with 223 physicians (106 radiologists and 117 internal/emergency medicine (IM/EM) physicians) who each reviewed eight chest X-rays accompanied by correct diagnostic advice. The advice was manipulated in two ways: (1) explainability (with or without annotations on the X-rays) and (2) source (AI or human). The dependent variables were diagnostic accuracy, advice quality ratings, and confidence in the diagnosis.

  • Results: Receiving annotated advice from an AI system yielded the highest diagnostic accuracy in both groups of physicians. Non-task experts (IM/EM physicians) rated annotated advice as higher quality than non-annotated advice, whereas task experts (radiologists) were unaffected by explainability. Task experts reported higher confidence in their diagnosis when the advice was labeled as coming from an AI, whereas non-task experts were unaffected by the advice source.

  • Limitations: The authors acknowledged some limitations of their study, such as the online setting, the small number of cases, the high performance of task experts, and the possibility of repeated participation by some individuals.

  • Conclusions: The authors concluded that explainable AI advice has the potential to improve diagnostic decision-making, especially for non-task experts who review medical images. They suggested that AI-based clinical decision support systems (AI-CDSS) could be valuable for IM and EM physicians, who often must make timely clinical decisions without a radiology report. They also called for more research on how to present explainable AI advice so as to optimize utility and minimize over-reliance.

Experimental setup. Every participant reviewed all eight cases. Each case consisted of a brief patient vignette, a chest X-ray, and diagnostic advice (radiologic findings and primary diagnoses). The advice came either with or without annotations on the X-ray, and it was labeled as coming either from an AI system or from an experienced radiologist, yielding a 2 × 2 design. Physicians were asked to give a final diagnosis, rate the quality of the advice, and judge how confident they were in their diagnosis.
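
To make the 2 × 2 design concrete, here is a minimal sketch (Python with pandas) of how trial-level data from such a within-subject experiment could be tabulated. The data frame, column names, and values are hypothetical illustrations, not the authors' dataset or analysis code.

    # Minimal sketch of a 2x2 (annotation x advice source) design summary.
    # All values below are fabricated placeholders for illustration only;
    # they are NOT the study's data, and the variable names are invented.
    import pandas as pd

    # One row per (participant, case) trial, as in a within-subject design
    # where every physician reviews every case.
    trials = pd.DataFrame({
        "participant": [1, 1, 2, 2, 3, 3, 4, 4],
        "expert":      ["radiologist", "radiologist", "IM/EM", "IM/EM",
                        "radiologist", "radiologist", "IM/EM", "IM/EM"],
        "annotated":   [True, False, True, False, True, False, True, False],
        "source":      ["AI", "human", "AI", "human",
                        "human", "AI", "human", "AI"],
        "correct":     [1, 1, 1, 0, 1, 1, 1, 1],  # final diagnosis correct?
    })

    # First-pass summary: mean diagnostic accuracy per advice condition,
    # split by task expertise.
    accuracy = (
        trials
        .groupby(["expert", "annotated", "source"])["correct"]
        .mean()
        .rename("accuracy")
    )
    print(accuracy)

In practice, a repeated-measures design like this is usually analyzed with a mixed-effects model (random intercepts per participant and per case) rather than raw condition means, since every physician contributes multiple correlated trials.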


The study’s findings can be extrapolated to veterinarians and veterinary radiologists in several ways:

  1. Potential for Improved Diagnostic Accuracy: The study found that explainable AI advice improved diagnostic accuracy, especially for non-experts. This suggests that veterinarians could likewise benefit from explainable AI in the form of improved diagnostic accuracy.

  2. Value of Annotations: The study showed that annotated advice was rated higher in quality by non-experts. This indicates that if veterinarians, who may not be experts in radiology, were to use an explainable AI system, they might find annotated advice particularly helpful.

  3. Increased Confidence: The study found that task experts had higher confidence in their diagnosis when receiving AI advice. This suggests that veterinary radiologists might also experience increased confidence in their diagnoses with the aid of AI.

  4. Timely Decisions: The study suggested that AI-CDSS could be valuable for physicians who often have to make timely clinical decisions without radiology reporting. This could also apply to veterinarians who need to make quick decisions.

  5. Need for Tailored AI Systems: The study highlighted the need for more research on how to present explainable AI advice to optimize utility and minimize over-reliance. This underscores the importance of developing AI systems tailored to the specific needs and expertise levels of veterinarians and veterinary radiologists.

In conclusion, while the study was not specifically directed at veterinarians or veterinary radiologists, its findings suggest that there could be significant benefits to incorporating explainable AI systems into veterinary practice. However, further research is needed to confirm these potential benefits and to optimize the design of such systems for veterinary use.


Disclaimer: The summary in this email was generated by an AI large language model, so errors may occur; reading the article is the best way to understand the scholarly work. Any figure presented here remains the property of the publisher or author, is subject to the applicable copyright agreement, and is reproduced as an educational work. If you have any questions or concerns about the work presented here, reply to this email.