Author
Listed:
- Frimpong Twum
- Charlyne Carol Eyram Ahiable
- Stephen Opoku Oppong
- Linda Banning
- Kwabena Owusu-Agyemang
Abstract
Breast cancer remains a critical global health concern, affecting countless lives worldwide. Early and accurate detection plays a vital role in improving patient outcomes. The challenge lies in the limited accuracy of traditional diagnostic methods. This study proposes a model based on four pretrained deep learning models, MobileNetV2, InceptionV3, ResNet50, and VGG16, which were also used as feature extractors whose features were fed into multiple supervised learning classifiers using the BUSI dataset. MobileNetV2, InceptionV3, ResNet50 and VGG16 achieved accuracies of 85.6%, 90.8%, 89.7% and 88.06%, respectively, with Logistic Regression and Light Gradient Boosting Machine being the best-performing classifiers. Using transfer learning, the top layers of each model were frozen and additional layers were added. A GlobalAveragePooling2D layer was employed to reduce the spatial dimensions of the extracted feature maps. After training and testing, ResNet50 performed best with an accuracy of 95.5%, followed by InceptionV3 at 92.5%, VGG16 at 86.5% and lastly MobileNetV2 at 84%.
Author summary: Breast cancer is one of the leading causes of mortality among women worldwide, and early detection plays a crucial role in improving survival rates. In this study, we explore the use of transfer learning, a technique that leverages pretrained deep learning models, to improve breast cancer classification using ultrasound images. We compare four well-established models, MobileNetV2, InceptionV3, ResNet50, and VGG16, evaluating their effectiveness in both feature extraction and fine-tuning approaches. Our findings show that ResNet50 and InceptionV3 outperform the other models in accuracy, specificity, and robustness, making them strong candidates for clinical applications. MobileNetV2, while slightly less accurate, offers computational efficiency, making it suitable for real-time or resource-limited settings. By providing a comprehensive comparison of these models, our study contributes to the development of AI-driven breast cancer detection tools that can aid in faster, more reliable diagnoses.
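Illustrative sketch (not from the paper): the following minimal TensorFlow/Keras example shows the kind of transfer-learning configuration the abstract describes, an ImageNet-pretrained ResNet50 backbone with frozen pretrained layers, a GlobalAveragePooling2D layer, and a small added classification head. The input resolution, the sizes of the added layers, the three-class BUSI labels (normal, benign, malignant) and the training settings are assumptions, not values reported by the authors.

# Minimal transfer-learning sketch (assumptions: 224x224 inputs, 3 BUSI classes,
# Adam optimizer, head sizes chosen for illustration only).
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import ResNet50

NUM_CLASSES = 3             # assumed: BUSI normal / benign / malignant
IMG_SHAPE = (224, 224, 3)   # assumed input resolution

# Load the ImageNet-pretrained backbone without its original classification head.
base = ResNet50(weights="imagenet", include_top=False, input_shape=IMG_SHAPE)
base.trainable = False      # freeze the pretrained layers so only the new head is trained

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),              # collapse HxWxC feature maps to a C-vector
    layers.Dense(128, activation="relu"),         # assumed size of the added dense layer
    layers.Dropout(0.3),                          # assumed regularization
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

A corresponding sketch of the feature-extraction route, in which pooled backbone features are fed to a classical classifier such as Logistic Regression (a LightGBM classifier would be used analogously); the random arrays below are placeholders for preprocessed BUSI images and labels, not the authors' pipeline:

# Feature-extraction sketch: frozen backbone + pooling as a fixed feature extractor,
# followed by a scikit-learn classifier. Real images should first be scaled and passed
# through tf.keras.applications.resnet50.preprocess_input.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

extractor = models.Sequential([base, layers.GlobalAveragePooling2D()])

images = np.random.rand(16, 224, 224, 3).astype("float32")   # placeholder image batch
labels = np.random.randint(0, NUM_CLASSES, size=16)          # placeholder labels

features = extractor.predict(images, verbose=0)              # shape: (n_samples, 2048)
X_train, X_test, y_train, y_test = train_test_split(features, labels, test_size=0.25)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))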
Suggested Citation
Frimpong Twum & Charlyne Carol Eyram Ahiable & Stephen Opoku Oppong & Linda Banning & Kwabena Owusu-Agyemang, 2025.
"Employing transfer learning for breast cancer detection using deep learning models,"
PLOS Digital Health, Public Library of Science, vol. 4(6), pages 1-22, June.
Handle:
RePEc:plo:pdig00:0000907
DOI: 10.1371/journal.pdig.0000907