Generative AI Model Significantly Reduces Chest X-Ray Reading Time

By MedImaging International staff writers
Posted on 20 Mar 2025
Image: Examples of chest radiograph interpretations with and without AI-generated reports (Photo courtesy of Radiology, DOI: 10.1148/radiol.241646)

Prompt and accurate interpretation of radiologic images is critical to patient outcomes, since interpretive errors can alter clinical management. Chest radiography is one of the most frequently performed radiologic exams, yet its interpretation demands a high level of expertise and considerable time. Although radiologists are highly accurate, their capacity does not scale with the growing volume of imaging studies, resulting in heavier workloads, delayed diagnoses, disrupted clinical workflows, and a higher risk of misinterpretation. Multimodal generative artificial intelligence (AI) technologies, which can process and generate diverse data types including both images and text, hold the potential to advance radiology. A new study has now evaluated the clinical value of a domain-specific multimodal generative AI model for interpreting chest radiographs, with the aim of improving the radiology workflow.

Researchers at Mass General Brigham (Boston, MA, USA), along with their collaborators, carried out a retrospective, sequential, multireader, multicase study using 758 chest radiographs from a publicly available dataset (2009-2017) to assess the effectiveness of AI-generated reports. Five radiologists interpreted the radiographs in two sessions, conducted from October to December 2023: one without AI-generated reports and one with AI-generated preliminary reports. Reading times were recorded for each session, while report agreement (RADPEER) and report quality (on a five-point scale) were scored by two experienced thoracic radiologists. A generalized linear mixed model was used to compare reading times, report agreement, and quality scores between the two sessions. In addition, a subset of 258 chest radiographs was examined to evaluate the factual correctness of the reports, with sensitivities and specificities compared between sessions using the McNemar test.
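The per-finding sensitivity comparison is a paired design (the same cases read with and without AI assistance), which is why a McNemar test on the discordant reads is appropriate. The following is a minimal sketch of that kind of comparison in Python using the mcnemar function from statsmodels; the 2x2 counts are hypothetical placeholders, not the study's data, and stand in for a single finding on radiographs where that finding is truly present.

    # Minimal sketch of a paired sensitivity comparison via McNemar's test.
    # The counts below are hypothetical; they do not come from the study.
    from statsmodels.stats.contingency_tables import mcnemar

    # Rows: finding detected WITHOUT AI (yes/no); columns: detected WITH AI (yes/no),
    # restricted to cases where the finding is actually present (i.e., sensitivity).
    table = [[180, 12],   # detected in both sessions / detected only without AI
             [31,  35]]   # detected only with AI / missed in both sessions

    result = mcnemar(table, exact=True)  # exact binomial test on the discordant pairs
    print(f"statistic={result.statistic:.0f}, p-value={result.pvalue:.4f}")

Only the off-diagonal (discordant) counts drive the test, which is what makes it suited to asking whether AI assistance changed individual readers' calls rather than comparing two independent groups.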

The study, published in Radiology, found that AI-generated reports reduced the average reading time for chest X-rays (CXRs) by 42% compared with unassisted interpretation (19.8 seconds vs. 34.2 seconds). In the subset analysis of 258 cases, AI-generated reports raised sensitivity for detecting pleural lesions by nearly 10 percentage points (87.4% vs. 77.7%) and for identifying a widened mediastinum by more than 6 percentage points (90.8% vs. 84.3%). Without AI assistance, the five radiologists showed a wide range of sensitivities (54.2% to 80.7%) and specificities (84.9% to 93.4%) for detecting abnormalities on CXRs; with AI-generated reports, these ranges narrowed, with sensitivity between 71.1% and 80.8% and specificity between 85.2% and 87.3%. The researchers concluded that the domain-specific multimodal generative AI model improved both the efficiency and the quality of radiology report generation.
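As a quick check, the headline figures follow directly from the reported numbers; the short calculation below simply reproduces that arithmetic from the values quoted above.

    # Reproduce the headline arithmetic from the figures reported in the study.
    unassisted, assisted = 34.2, 19.8          # mean reading time per radiograph, seconds
    reduction = (unassisted - assisted) / unassisted
    print(f"Reading-time reduction: {reduction:.1%}")   # ~42.1%

    pleural = 87.4 - 77.7                      # sensitivity gain, percentage points
    mediastinum = 90.8 - 84.3
    print(f"Pleural lesion gain: {pleural:.1f} pp; widened mediastinum gain: {mediastinum:.1f} pp")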
