Author
Listed:
- Dingyuan Hu
- Shiya Qu
- Yuhang Jiang
- Chunyu Han
- Hongbin Liang
- Qingyan Zhang
Abstract
Brain extraction is an important prerequisite for the automated diagnosis of intracranial lesions and determines, to a certain extent, the accuracy of subsequent lesion identification, localization, and segmentation. Traditional image segmentation methods extract the brain quickly but lack robustness, whereas fully convolutional network (FCN) approaches are robust and accurate but relatively slow. To address this trade-off, this paper proposes an adaptive mask-based brain extraction method, AMBBEM. The method first applies threshold segmentation, median filtering, and morphological closing to generate an initial mask; it then refines the mask by combining a ResNet50 model, a region-growing algorithm, and image-property analysis; finally, it completes brain extraction by multiplying the original image by the mask. The algorithm was tested on 22 test sets containing different lesions and achieved MPA = 0.9963, MIoU = 0.9924, and MBF = 0.9914, comparable to the Deeplabv3+ model. However, the method can extract the brain from approximately 6.16 head CT images per second, much faster than the Deeplabv3+, U-net, and SegNet models. In summary, this method achieves accurate brain extraction from head CT images more quickly, creating good conditions for subsequent brain volume measurement and feature extraction of intracranial lesions.
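To make the abstract's first-stage pipeline concrete (threshold segmentation, median filtering, morphological closing, then masking the original image by multiplication), the following is a minimal illustrative sketch. The threshold value, kernel sizes, file names, and the use of OpenCV are assumptions for demonstration only and are not taken from the paper; the second-stage refinement (ResNet50, region growing, image-property analysis) is not shown.

```python
# Minimal sketch of first-stage mask generation for a single head CT slice.
# Assumes an 8-bit grayscale input; threshold and kernel sizes are
# illustrative guesses, not values reported in the paper.
import cv2
import numpy as np

def first_stage_mask(ct_slice: np.ndarray) -> np.ndarray:
    """Return a binary head mask via thresholding, median filtering, and closing."""
    # 1. Threshold segmentation: separate head pixels from the dark background.
    _, mask = cv2.threshold(ct_slice, 40, 255, cv2.THRESH_BINARY)
    # 2. Median filtering: suppress isolated noise in the binary mask.
    mask = cv2.medianBlur(mask, 5)
    # 3. Morphological closing: fill small holes and gaps inside the mask.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    return mask

def apply_mask(ct_slice: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Multiply the original image by the 0/1 mask to keep only the masked region."""
    return ct_slice * (mask // 255)

if __name__ == "__main__":
    # Hypothetical input file; any single-channel CT slice image would do.
    img = cv2.imread("ct_slice.png", cv2.IMREAD_GRAYSCALE)
    brain = apply_mask(img, first_stage_mask(img))
    cv2.imwrite("brain_extracted.png", brain)
```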
Suggested Citation
Dingyuan Hu & Shiya Qu & Yuhang Jiang & Chunyu Han & Hongbin Liang & Qingyan Zhang, 2024.
"Adaptive mask-based brain extraction method for head CT images,"
PLOS ONE, Public Library of Science, vol. 19(3), pages 1-19, March.
Handle: RePEc:plo:pone00:0295536
DOI: 10.1371/journal.pone.0295536