Author
Listed:
- Haidar A. AlMubarak
(Missouri University of Science and Technology, Rolla, USA & Advanced Lab for Intelligent Systems Research, Department of Computer Engineering, College of Computer and Information Sciences, King Saud University, Riyadh, Saudi Arabia & Electrical and Computer Engineering Department, Missouri University of Science and Technology, Rolla, USA)
- Joe Stanley
(Missouri University of Science and Technology, Rolla, USA)
- Peng Guo
(Missouri University of Science and Technology, Rolla, USA)
- Rodney Long
(Lister Hill National Center for Biomedical Communications, National Library of Medicine, Bethesda, USA)
- Sameer Antani
(Lister Hill National Center for Biomedical Communications, National Library of Medicine, Bethesda, USA)
- George Thoma
(Lister Hill National Center for Biomedical Communications, National Library of Medicine, Bethesda, USA)
- Rosemary Zuna
(Department of Pathology, University of Oklahoma Health Sciences Center, Oklahoma City, USA)
- Shelliane Frazier
(University of Missouri Health Care, Columbia, USA)
- William Stoecker
(The Dermatology Center, Missouri University of Science and Technology, Rolla, USA)
Abstract
Cervical cancer is the second most common cancer affecting women worldwide but is curable if diagnosed early. Routinely, expert pathologists visually examine histology slides to assess cervix tissue abnormalities. A localized, fusion-based, hybrid imaging and deep learning approach is explored to classify squamous epithelium into cervical intraepithelial neoplasia (CIN) grades for a dataset of 83 digitized histology images. Partitioning the epithelium region into 10 vertical segments, 27 handcrafted image features and rectangular-patch, sliding-window-based convolutional neural network features are computed for each segment. The imaging and deep learning patch features are combined and used as inputs to a secondary classifier for individual segment and whole-epithelium classification. The hybrid method achieved a 15.51% and 11.66% improvement over the deep learning and imaging approaches alone, respectively, with an 80.72% whole-epithelium CIN classification accuracy, showing the enhanced epithelium CIN classification potential of fusing image and deep learning features.
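The Python sketch below illustrates the fusion pipeline outlined in the abstract: per-segment handcrafted and CNN patch features are concatenated and fed to a secondary classifier, and the resulting segment labels are fused into a whole-epithelium CIN grade. The feature extractors, the random-forest secondary classifier, and the majority-vote fusion rule are stand-in assumptions for illustration only, not the authors' implementation.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

N_SEGMENTS = 10  # vertical segments per epithelium image (from the abstract)

def handcrafted_features(segment):
    """Placeholder for the 27 handcrafted image features per segment (assumption)."""
    return np.random.rand(27)

def cnn_patch_features(segment):
    """Placeholder for the sliding-window CNN patch features per segment;
    the 64-dimensional size is an arbitrary stand-in (assumption)."""
    return np.random.rand(64)

def segment_feature_vector(segment):
    # Fusion step: concatenate the imaging and deep learning features.
    return np.concatenate([handcrafted_features(segment), cnn_patch_features(segment)])

def classify_epithelium(segments, clf):
    # Classify each of the 10 vertical segments with the secondary classifier,
    # then fuse the segment labels into a whole-epithelium CIN grade
    # (majority vote shown here as one plausible fusion rule).
    seg_labels = clf.predict(np.stack([segment_feature_vector(s) for s in segments]))
    values, counts = np.unique(seg_labels, return_counts=True)
    return values[np.argmax(counts)]

if __name__ == "__main__":
    # Usage example with synthetic data: 4 CIN grades, random "segments".
    rng = np.random.default_rng(0)
    X_train = rng.random((200, 27 + 64))    # fused per-segment feature vectors
    y_train = rng.integers(0, 4, size=200)  # CIN grade labels 0..3
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
    segments = [None] * N_SEGMENTS          # placeholder segment images
    print("Predicted CIN grade:", classify_epithelium(segments, clf))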
Suggested Citation
Haidar A. AlMubarak & Joe Stanley & Peng Guo & Rodney Long & Sameer Antani & George Thoma & Rosemary Zuna & Shelliane Frazier & William Stoecker, 2019.
"A Hybrid Deep Learning and Handcrafted Feature Approach for Cervical Cancer Digital Histology Image Classification,"
International Journal of Healthcare Information Systems and Informatics (IJHISI), IGI Global, vol. 14(2), pages 66-87, April.
Handle:
RePEc:igg:jhisi0:v:14:y:2019:i:2:p:66-87
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:igg:jhisi0:v:14:y:2019:i:2:p:66-87. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Journal Editor (email available below). General contact details of provider: https://www.igi-global.com .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.