Author
Listed:
- Hong Peng
- Yunfei Hu
- Baocai Yu
- Zhen Zhang
Abstract
Salient Object Detection (SOD) is a fundamental task in computer vision, aiming to identify prominent regions within images. Traditional methods and deep learning-based models often encounter challenges in capturing crucial information in complex scenes, particularly due to inadequate edge feature extraction, which compromises the precise delineation of object contours and boundaries. To address these challenges, we introduce EFCRFNet, a novel multi-scale feature extraction model that incorporates two innovative modules: the Enhanced Conditional Random Field (ECRF) and the Edge Feature Enhancement Module (EFEM). The ECRF module leverages advanced spatial attention mechanisms to enhance multimodal feature fusion, enabling robust detection in complex environments. Concurrently, the EFEM module focuses on refining edge features to strengthen multi-scale feature representation, significantly improving boundary recognition accuracy. Extensive experiments on standard benchmark datasets demonstrate that EFCRFNet achieves notable performance gains across key evaluation metrics, including MAE (0.64%), Fm (1.04%), Em (8.73%), and Sm (7.4%). These results underscore the effectiveness of EFCRFNet in enhancing detection accuracy and optimizing feature fusion, advancing the state of the art in salient object detection.
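The abstract describes the two modules only at a high level. The PyTorch sketch below illustrates, under stated assumptions, the two ideas it names: spatial-attention-guided feature fusion (the ECRF idea) and edge-aware feature refinement (the EFEM idea). All class names, layer choices, and channel sizes here are illustrative assumptions, not the authors' implementation.

    # Minimal sketch, assuming standard convolutional building blocks;
    # not the paper's actual ECRF/EFEM design.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SpatialAttentionFusion(nn.Module):
        """Stand-in for the ECRF idea: fuse two feature maps with a
        learned spatial attention mask (assumed design)."""
        def __init__(self, channels):
            super().__init__()
            self.attn = nn.Sequential(
                nn.Conv2d(2 * channels, channels // 4, kernel_size=3, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels // 4, 1, kernel_size=1),
                nn.Sigmoid(),
            )
            self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=1)

        def forward(self, feat_a, feat_b):
            # feat_a, feat_b: (B, C, H, W) features from different scales or modalities
            stacked = torch.cat([feat_a, feat_b], dim=1)
            mask = self.attn(stacked)                  # (B, 1, H, W) spatial weights
            fused = self.fuse(stacked)
            return fused * mask + feat_a * (1 - mask)  # attention-gated fusion

    class EdgeEnhancement(nn.Module):
        """Stand-in for the EFEM idea: emphasise high-frequency (edge)
        content by re-injecting a refined residual of the feature map."""
        def __init__(self, channels):
            super().__init__()
            self.refine = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

        def forward(self, feat):
            blurred = F.avg_pool2d(feat, kernel_size=3, stride=1, padding=1)
            edges = feat - blurred                     # high-frequency residual
            return feat + self.refine(edges)           # edge-enhanced features

    # Example usage with hypothetical 64-channel feature maps
    a = torch.randn(1, 64, 56, 56)
    b = torch.randn(1, 64, 56, 56)
    fused = SpatialAttentionFusion(64)(a, b)
    out = EdgeEnhancement(64)(fused)
    print(out.shape)  # torch.Size([1, 64, 56, 56])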
Suggested Citation
Hong Peng & Yunfei Hu & Baocai Yu & Zhen Zhang, 2025.
"EFCRFNet: A novel multi-scale framework for salient object detection,"
PLOS ONE, Public Library of Science, vol. 20(5), pages 1-23, May.
Handle: RePEc:plo:pone00:0323757
DOI: 10.1371/journal.pone.0323757