Optics.org
daily coverage of the optics & photonics industry and the markets that it serves
Research & Development

Deep learning model estimates cancer risk from breast density

12 Apr 2023

Mammographic density estimates based on the visual scores of radiologists.

Breast cancer is the most common cancer to affect women worldwide. According to the American Cancer Society, about 1 in 8 women in the United States will develop breast cancer in their lifetime. While it is not possible to entirely prevent breast cancer, various medical organizations advise regular screening to detect and treat cases at the early stage.

Breast density, defined as the proportion of fibro-glandular tissue within the breast, is often used to assess the risk of developing breast cancer. Various methods are available to estimate this measure, but studies have shown that subjective assessments recorded by radiologists on visual analogue scales are more accurate than other methods.
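
For context, area-based percent density is simply the fibro-glandular area divided by the total breast area. The short sketch below is a hypothetical illustration of that definition, with mask names assumed for the example; the study itself relies on expert visual-analogue-scale scores rather than an area calculation.

    # Hypothetical illustration of the density definition: the proportion of the
    # breast area occupied by fibro-glandular tissue. Mask names are assumed.
    import numpy as np

    def percent_density(breast_mask: np.ndarray, fibroglandular_mask: np.ndarray) -> float:
        """Return fibro-glandular area as a percentage of total breast area."""
        breast_area = float(breast_mask.sum())
        if breast_area == 0:
            raise ValueError("breast mask is empty")
        return 100.0 * float(fibroglandular_mask.sum()) / breast_area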

As expert evaluations of breast density play a crucial role in breast cancer risk assessment, developing image analysis frameworks that can automatically estimate this risk, with the same accuracy as an experienced radiologist, is highly desirable. To this end, researchers led by Prof. Susan M. Astley from the University of Manchester, United Kingdom, recently developed and tested a new deep learning-based model capable of estimating breast density with high precision. Their findings are published in the Journal of Medical Imaging.

Deep learning-based model

“The advantage of the deep learning-based approach is that it enables automatic feature extraction from the data itself,” said Astley. “This is appealing for breast density estimations since we do not completely understand why subjective expert judgments outperform other methods.”

Typically, training deep learning models for medical image analysis is challenging owing to limited datasets. The researchers found a way around this problem: instead of building a model from the ground up, they used two independent deep learning models that had been pretrained on ImageNet, a non-medical image dataset containing over a million images. This approach, known as “transfer learning,” allowed them to train the models efficiently with far less medical imaging data.
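
As a rough illustration of that transfer-learning setup, the sketch below reuses two ImageNet-pretrained backbones from torchvision as frozen feature extractors. The article does not specify which architectures the researchers used, so ResNet-50 and DenseNet-169 here are stand-ins.

    # Illustrative transfer-learning sketch (backbone choices are assumptions,
    # not taken from the paper): two ImageNet-pretrained networks are reused
    # as frozen feature extractors for preprocessed mammogram images.
    import torch
    import torchvision.models as models

    resnet = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
    densenet = models.densenet169(weights=models.DenseNet169_Weights.IMAGENET1K_V1)

    # Remove the ImageNet classification heads so each network outputs a feature vector.
    resnet.fc = torch.nn.Identity()
    densenet.classifier = torch.nn.Identity()

    for net in (resnet, densenet):
        net.eval()                          # inference mode
        for p in net.parameters():
            p.requires_grad = False         # freeze the pretrained weights

    # A preprocessed mammogram, replicated to three channels to match ImageNet inputs.
    x = torch.rand(1, 3, 224, 224)          # placeholder image tensor
    with torch.no_grad():
        features = torch.cat([resnet(x), densenet(x)], dim=1)
    print(features.shape)                   # combined feature vector for one image

Freezing the pretrained weights is what allows the approach to work with comparatively little medical imaging data: only a small density-scoring component needs to be fitted to mammograms.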

Using nearly 160,000 full-field digital mammogram images from 39,357 women, each assigned a density value on a visual analogue scale by experts (radiologists, advanced practitioner radiographers, and breast physicians), the researchers developed a procedure for estimating a density score for each mammogram. The objective was to take a mammogram image as input and produce a density score as output.

The procedure involved preprocessing the images to make the training process computationally less intensive, extracting features from the processed images with the deep learning models, mapping the features to a set of density scores, and then combining the scores using an ensemble approach to produce a final density estimate.
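
A minimal sketch of that pipeline, under stated assumptions (a single linear regression head per backbone and a simple average as the ensemble, neither of which is specified in the article), might look like this:

    # Illustrative pipeline sketch: preprocess -> extract features with a frozen
    # backbone -> map the features to a density score -> average the models' scores.
    # The regressor and ensemble choices here are assumptions, not the authors' code.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def preprocess(mammogram: torch.Tensor) -> torch.Tensor:
        """Downsample a single-channel mammogram (1, H, W) and replicate to 3 channels."""
        x = F.interpolate(mammogram.unsqueeze(0), size=(224, 224),
                          mode="bilinear", align_corners=False)
        return x.repeat(1, 3, 1, 1)              # (1, 3, 224, 224)

    class DensityRegressor(nn.Module):
        """Maps a frozen backbone's feature vector to a single density score."""
        def __init__(self, backbone: nn.Module, feature_dim: int):
            super().__init__()
            self.backbone = backbone
            self.head = nn.Linear(feature_dim, 1)   # features -> VAS-style score

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            with torch.no_grad():
                feats = self.backbone(x)            # frozen feature extraction
            return self.head(feats).squeeze(1)

    def ensemble_density(x: torch.Tensor, regressors) -> torch.Tensor:
        """Combine the per-model scores into a final density estimate by averaging."""
        scores = torch.stack([r(x) for r in regressors], dim=0)
        return scores.mean(dim=0)

In such a setup, only the small regression heads are trained against the expert visual-analogue-scale scores, which is consistent with the article's point that training is fast and feasible on modest datasets.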

Model comparable to human expertise

With this approach, the researchers developed highly accurate models for estimating breast density and its correlation with cancer risk, while conserving computation time and memory. “The model’s performance is comparable to those of human experts within the bounds of uncertainty,” said Astley. “Moreover, it can be trained much faster and on small datasets or subsets of the large dataset.”

Notably, the deep transfer learning framework is useful not only for estimating breast cancer risk in the absence of a radiologist but also for training other medical imaging models based on its breast tissue density estimations. This, in turn, can enable improved performance in tasks such as cancer risk prediction or image segmentation.

The full Open Access article by S. Squires et al., “Automatic assessment of mammographic density using a deep transfer learning method,” is published in the Journal of Medical Imaging.
