Author
Listed:
- Wu, Hao
- Xie, Junyang
- Deng, Weihao
- Lin, Anqi
- Mohamed Shariff, Abdul Rashid
- Akmalov, Shamshodbek
- Wu, Wenbin
- Li, Zhaoliang
- Yu, Qiangyi
- Wang, Qunming
- Zhang, Jian
- Mei, Xin
- Hu, Qiong
Abstract
Automatically extracting cropland field parcels from remote sensing images is crucial for developing smart agriculture. However, notable spatio-spectral differences captured by multiple remote sensing sensors at different times lead to uncertain contour and texture features among large-scale cropland field parcels, posing challenges for robust, high-precision extraction. To address these challenges, we propose a contour-texture hierarchical feature fusion network (CT-HiffNet) for cropland field parcel extraction from high-resolution remote sensing images. CT-HiffNet consists of three modules: a hybrid module integrating attention and guidance mechanisms to thoroughly learn the internal texture features as well as the external contour features of cropland field parcels; a deep residual shrinkage block for feature encoding that effectively eliminates redundant information during extraction; and a hierarchical information fusion decoder that enhances contour-texture feature interactions at different scales and minimizes information loss during feature restoration. CT-HiffNet was evaluated across four distinct agricultural landscape regions in China using GaoFen-2 images, as well as in six other regions worldwide using Sentinel-2 and Google Earth images. The results show that CT-HiffNet achieves OA, precision, and recall all exceeding 80% across the regions in China, while in the other global validation areas precision and recall surpass 84% and 86.5%, respectively. This demonstrates its effectiveness in extracting cropland field parcels and indicates the model's strong transferability and generalization capability. In particular, the contour-texture feature effectively enhances the boundary recognition of cropland field parcels, contributing to the model's adaptability to remote sensing images acquired at different times. Meanwhile, determining an appropriate sample size is crucial for the performance of CT-HiffNet.
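To make the three-module decomposition described above concrete, the following is a minimal PyTorch sketch based only on the abstract: a residual shrinkage block with learned channel-wise soft thresholding, a parallel contour/texture head in which the contour map gates the interior (texture) prediction, and a toy encoder-decoder with a single skip fusion standing in for the hierarchical decoder. All class names, layer widths, and the gating rule are illustrative assumptions, not the authors' implementation.

```python
# Minimal, illustrative sketch (not the authors' code) of the module layout
# described in the abstract. Names, channel sizes, and wiring are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ResidualShrinkageBlock(nn.Module):
    """Residual block with channel-wise soft thresholding, in the spirit of
    deep residual shrinkage networks (threshold learned per channel)."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.BatchNorm2d(channels), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.BatchNorm2d(channels),
        )
        # Small subnetwork that predicts a per-channel shrinkage threshold.
        self.thresh = nn.Sequential(
            nn.Linear(channels, channels), nn.ReLU(inplace=True),
            nn.Linear(channels, channels), nn.Sigmoid(),
        )

    def forward(self, x):
        r = self.body(x)
        abs_mean = r.abs().mean(dim=(2, 3))          # (N, C)
        tau = abs_mean * self.thresh(abs_mean)       # learned threshold per channel
        tau = tau.unsqueeze(-1).unsqueeze(-1)
        r = torch.sign(r) * F.relu(r.abs() - tau)    # soft thresholding removes weak responses
        return F.relu(x + r)


class ContourTextureHead(nn.Module):
    """Two parallel 1x1 heads: one predicts parcel interiors (texture), the
    other parcel boundaries (contour); the contour map gates the texture map."""
    def __init__(self, channels):
        super().__init__()
        self.texture = nn.Conv2d(channels, 1, 1)
        self.contour = nn.Conv2d(channels, 1, 1)

    def forward(self, feat):
        contour = torch.sigmoid(self.contour(feat))
        texture = torch.sigmoid(self.texture(feat))
        # Boundary-aware gating: suppress interior response near predicted contours.
        return texture * (1.0 - contour), contour


class TinyCTNet(nn.Module):
    """Toy encoder-decoder wiring the pieces together; a real model would use
    a deeper backbone and multi-scale (hierarchical) skip fusion."""
    def __init__(self, in_ch=3, base=32):
        super().__init__()
        self.stem = nn.Sequential(nn.Conv2d(in_ch, base, 3, padding=1), nn.ReLU(inplace=True))
        self.enc1 = ResidualShrinkageBlock(base)
        self.down = nn.Conv2d(base, base * 2, 3, stride=2, padding=1)
        self.enc2 = ResidualShrinkageBlock(base * 2)
        self.up = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        self.fuse = nn.Conv2d(base * 2, base, 3, padding=1)  # fuse skip + upsampled features
        self.head = ContourTextureHead(base)

    def forward(self, x):
        s1 = self.enc1(self.stem(x))
        s2 = self.enc2(F.relu(self.down(s1)))
        d = F.relu(self.fuse(torch.cat([self.up(s2), s1], dim=1)))
        return self.head(d)


if __name__ == "__main__":
    parcels, contours = TinyCTNet()(torch.randn(1, 3, 128, 128))
    print(parcels.shape, contours.shape)  # both torch.Size([1, 1, 128, 128])
```

The gating step is only one plausible reading of "guidance": the explicit boundary prediction sharpens parcel edges, which is consistent with the abstract's claim that the contour-texture feature improves boundary recognition.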
Suggested Citation
Wu, Hao & Xie, Junyang & Deng, Weihao & Lin, Anqi & Mohamed Shariff, Abdul Rashid & Akmalov, Shamshodbek & Wu, Wenbin & Li, Zhaoliang & Yu, Qiangyi & Wang, Qunming & Zhang, Jian & Mei, Xin & Hu, Qiong, 2025.
"CT-HiffNet: A contour-texture hierarchical feature fusion network for cropland field parcel extraction from high-resolution remote sensing images,"
EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 239.
Handle: RePEc:zbw:espost:327149
DOI: 10.1016/j.compag.2025.111010