Author
Listed:
- Guangjie Liu
- Qi Wang
- Jinlong Zhu
- Haotong Hong
Abstract
Recent advances in deep neural networks have produced frameworks that extract image features with increasing accuracy. In this study, we focus on attention mechanisms for deep neural networks and propose W-Net, a simple but strong deep architecture for feature extraction. W-Net comprises two mutually independent paths and is designed with the following advantages. (1) The two independent, effective paths capture contextual information at different scales and in different ways. (2) The two paths produce different feature maps; during upsampling we use bilinear interpolation, which reduces feature-map distortion, and we then fuse the maps from the two paths. (3) Feature maps are processed at the bottleneck, where a hierarchical attention module is constructed by reclassifying features after a channel attention module and a spatial attention module, resulting in more efficient and accurate processing of the feature maps. In our experiments, we also evaluate the method on iSAID, a large-scale, high-spatial-resolution remote sensing image dataset; these additional comparisons demonstrate the generality of our method for remote sensing image segmentation.
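The abstract describes the bottleneck design only at a high level, so the following is a minimal PyTorch sketch of that idea: a channel attention module followed by a spatial attention module (a common CBAM-style pairing, assumed here rather than taken from the paper), with bilinear interpolation used to upsample the attended features. All class names, tensor shapes, and the reduction ratio are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelAttention(nn.Module):
    """Squeeze spatial dimensions, then learn a per-channel weighting."""
    def __init__(self, channels, reduction=16):  # reduction ratio is an assumption
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(F.adaptive_avg_pool2d(x, 1).view(b, c))
        mx = self.mlp(F.adaptive_max_pool2d(x, 1).view(b, c))
        w = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * w

class SpatialAttention(nn.Module):
    """Pool over channels, then learn a per-location weighting."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx, _ = x.max(dim=1, keepdim=True)
        w = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * w

class BottleneckAttention(nn.Module):
    """Hierarchical attention at the bottleneck: channel first, then spatial."""
    def __init__(self, channels):
        super().__init__()
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, x):
        return self.sa(self.ca(x))

# Hypothetical bottleneck feature map, then bilinear upsampling back toward
# decoder resolution; bilinear interpolation avoids the checkerboard
# distortion that transposed convolutions can introduce.
feat = torch.randn(1, 256, 32, 32)
feat = BottleneckAttention(256)(feat)
up = F.interpolate(feat, scale_factor=2, mode="bilinear", align_corners=False)
```

Applying channel attention before spatial attention mirrors the sequential ordering the abstract implies; in practice the two modules could also be combined in parallel, which the abstract does not rule out.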
Suggested Citation
Guangjie Liu & Qi Wang & Jinlong Zhu & Haotong Hong, 2023.
"W-Net: Convolutional neural network for segmenting remote sensing images by dual path semantics,"
PLOS ONE, Public Library of Science, vol. 18(7), pages 1-16, July.
Handle:
RePEc:plo:pone00:0288311
DOI: 10.1371/journal.pone.0288311