Researchers Use AI to Improve Mammogram Interpretation

By MedImaging International staff writers
Posted on 04 Jul 2018
Image: Researchers used AI to improve mammogram image interpretation (Photo courtesy of the Department of Energy’s Oak Ridge National Laboratory).
A team of researchers at the Department of Energy’s Oak Ridge National Laboratory (Oak Ridge, TN, USA) successfully used artificial intelligence to improve understanding of the cognitive processes involved in image interpretation. Their work, which was published in the Journal of Medical Imaging, will help reduce errors in the analyses of diagnostic images by health professionals and has the potential to improve health outcomes for women affected by breast cancer.

Early detection of breast cancer is critical for effective treatment, and it requires accurate interpretation of a patient’s mammogram. The ORNL-led team found that radiologists’ analyses of mammograms were significantly influenced by context bias, i.e., the radiologist’s previous diagnostic experiences. New radiology trainees were most susceptible to the phenomenon, although even more experienced radiologists fell victim to some degree, according to the researchers.

The researchers designed an experiment that followed the eye movements of radiologists at various skill levels to better understand the context bias involved in their individual interpretations of the images. The experiment tracked the eye movements of three board-certified radiologists and seven radiology residents as they analyzed 100 mammographic studies from the University of South Florida’s Digital Database for Screening Mammography. The 400 images comprising these studies, representing a mix of cancer, no cancer, and benign cases that mimicked cancer, were specifically selected to cover a range of cases similar to those found in a clinical setting.

The participants, who were grouped by levels of experience and had no prior knowledge of what was contained in the individual X-rays, were outfitted with a head-mounted eye-tracking device designed to record their “raw gaze data,” which characterized their overall visual behavior. The study also recorded the participants’ diagnostic decisions via the location of suspicious findings along with their characteristics according to the BI-RADS lexicon, the radiologists’ reporting scheme for mammograms. By computing a measure known as a fractal dimension on the individual participants’ scan path (map of eye movements) and performing a series of statistical calculations, the researchers were able to discern how the eye movements of the participants differed from mammogram to mammogram. They also calculated the deviation in the context of the different image categories, such as images that show cancer and those that may be easier or more difficult to decipher.
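The fractal dimension mentioned above is a standard way to summarize how thoroughly a scan path covers an image: a nearly straight gaze trajectory has a dimension close to 1, while a path that densely fills the image approaches 2. The article does not describe ORNL’s exact computation, so the following is only an illustrative box-counting sketch on a hypothetical list of gaze coordinates:

```python
import numpy as np

def box_count(points, box_size):
    """Count distinct grid boxes of the given size touched by the gaze points."""
    return len({(int(x // box_size), int(y // box_size)) for x, y in points})

def fractal_dimension(points, box_sizes=(2, 4, 8, 16, 32, 64)):
    """Estimate the box-counting (fractal) dimension of a 2-D scan path.

    Fits the slope of log(box count) vs. log(1 / box size): roughly 1 for
    a line-like path, approaching 2 for a path that fills the plane.
    """
    counts = [box_count(points, s) for s in box_sizes]
    log_inv_size = np.log(1.0 / np.array(box_sizes))
    log_counts = np.log(counts)
    slope, _ = np.polyfit(log_inv_size, log_counts, 1)
    return slope

# Toy example: a straight diagonal scan path across a 512 x 512 image
t = np.linspace(0, 511, 2000)
diagonal_path = np.column_stack([t, t])
print(fractal_dimension(diagonal_path))  # close to 1.0 for a straight line
```

Comparing such per-image dimensions across participants, as the study describes, would then be a matter of ordinary statistical testing on the resulting values.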

In order to effectively track the participants’ eye movements, the researchers had to employ real-time sensor data, which logs nearly every movement of the participants’ eyes. With 10 observers interpreting 100 cases, however, the data quickly accumulated, making it impractical to analyze manually. This led the researchers to turn to artificial intelligence to make sense of the results efficiently and effectively. Using ORNL’s Titan supercomputer, they were able to rapidly train the deep learning models required to interpret the large datasets. While similar studies in the past have relied on aggregation methods to summarize such data, the ORNL team processed the full data sequence, a critical choice because, over time, the sequence revealed differences in the participants’ eye paths as they analyzed the various mammograms.

In a related paper published in the Journal of Human Performance in Extreme Environments, the researchers demonstrated how convolutional neural networks, a type of artificial intelligence commonly applied to the analysis of images, significantly outperformed other methods, such as deep neural networks and deep belief networks, in parsing the eye tracking data and, by extension, validating the experiment as a means to measure context bias. Furthermore, while the experiment focused on radiology, the resulting data drove home the need for “intelligent interfaces and decision support systems” to assist human performance across a range of complex tasks including air-traffic control and battlefield management.
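A plausible intuition for why convolutional networks suit this task is that gaze recordings are long sequences in which short, local temporal patterns (fixations, saccades) recur at different positions, which is exactly what 1-D convolutions capture. The article does not specify ORNL’s architecture, so the snippet below is only a minimal, self-contained numpy sketch of a 1-D convolutional feature extractor over a hypothetical gaze-feature sequence, not the published model:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_relu(seq, kernels):
    """Valid-mode 1-D convolution of a (T, C) feature sequence with
    (K, W, C) kernels, followed by ReLU; returns a (T - W + 1, K) map."""
    T, C = seq.shape
    K, W, _ = kernels.shape
    out = np.empty((T - W + 1, K))
    for t in range(T - W + 1):
        window = seq[t:t + W]  # (W, C) local slice of the sequence
        out[t] = np.tensordot(kernels, window, axes=([1, 2], [0, 1]))
    return np.maximum(out, 0.0)

# Hypothetical gaze sequence: T time steps of (x, y, fixation-duration) features
T = 50
gaze = rng.standard_normal((T, 3))
kernels = rng.standard_normal((8, 5, 3)) * 0.1  # 8 learned filters of width 5

features = conv1d_relu(gaze, kernels)   # (46, 8) local temporal features
pooled = features.max(axis=0)           # global max-pool: one value per filter
print(pooled.shape)
```

In a full classifier, the pooled feature vector would feed a small dense layer trained to predict, for example, the participant’s experience level or image category; the convolution’s shared filters are what let the same local gaze pattern be detected anywhere in the sequence.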

While machines are unlikely to replace radiologists (or other humans involved in rapid, high-impact decision-making) any time soon, they do hold enormous potential to assist health professionals and other decision makers in reducing errors due to phenomena such as context bias, according to Gina Tourassi, team lead and director of ORNL’s Health Data Science Institute. “These findings will be critical in the future training of medical professionals to reduce errors in the interpretations of diagnostic imaging. These studies will inform human/computer interactions, going forward as we use artificial intelligence to augment and improve human performance,” said Tourassi.

Related Links:
Oak Ridge National Laboratory

Copyright © 2000-2024 Globetech Media. All rights reserved.