Author
Listed:
- Riqiang Chen
(Nongxin Science & Technology (Beijing) Co., Ltd., Beijing 100097, China
Key Laboratory of Quantitative Remote Sensing in Agriculture of Ministry of Agriculture and Rural Affairs, Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
School of Information Science and Technology, Beijing Forestry University, Beijing 100083, China)
- Lipeng Ren
(Key Laboratory of Quantitative Remote Sensing in Agriculture of Ministry of Agriculture and Rural Affairs, Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
Research Institute of Quantitative Remote Sensing and Smart Agriculture, School of Surveying and Mapping Land Information Engineering, Henan Polytechnic University, Jiaozuo 454000, China)
- Guijun Yang
(Key Laboratory of Quantitative Remote Sensing in Agriculture of Ministry of Agriculture and Rural Affairs, Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China)
- Zhida Cheng
(Key Laboratory of Quantitative Remote Sensing in Agriculture of Ministry of Agriculture and Rural Affairs, Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China)
- Dan Zhao
(Key Laboratory of Quantitative Remote Sensing in Agriculture of Ministry of Agriculture and Rural Affairs, Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China)
- Chengjian Zhang
(Key Laboratory of Quantitative Remote Sensing in Agriculture of Ministry of Agriculture and Rural Affairs, Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
School of Information Science and Technology, Beijing Forestry University, Beijing 100083, China)
- Haikuan Feng
(Nongxin Science & Technology (Beijing) Co., Ltd., Beijing 100097, China
Key Laboratory of Quantitative Remote Sensing in Agriculture of Ministry of Agriculture and Rural Affairs, Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
College of Agriculture, Nanjing Agricultural University, Nanjing 210095, China)
- Haitang Hu
(Nongxin Science & Technology (Beijing) Co., Ltd., Beijing 100097, China
Key Laboratory of Quantitative Remote Sensing in Agriculture of Ministry of Agriculture and Rural Affairs, Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China)
- Hao Yang
(Key Laboratory of Quantitative Remote Sensing in Agriculture of Ministry of Agriculture and Rural Affairs, Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China)
Abstract
Leaf chlorophyll content (LCC) serves as a vital biochemical indicator of photosynthetic activity and nitrogen status, and is critical for optimizing crop management in precision agriculture. While UAV-based hyperspectral sensing offers potential for maize LCC estimation, current methods struggle with overlapping spectral bands and suboptimal model accuracy. To address these limitations, we propose an integrated maize LCC estimation framework combining UAV hyperspectral imagery, simulated hyperspectral data, E2D-COS feature selection, a deep neural network (DNN), and transfer learning (TL). The E2D-COS algorithm, applied to the simulated data, identified structure-resistant spectral bands strongly correlated with maize LCC: 418 nm, 453 nm, 506 nm, 587 nm, 640 nm, 688 nm, and 767 nm at the big trumpet stage; and 418 nm, 453 nm, 541 nm, 559 nm, 688 nm, 723 nm, and 767 nm at the spinning stage. Combining E2D-COS feature selection with TL and the DNN significantly improves estimation accuracy: the R² of the proposed Maize-LCNet model is higher by 0.06–0.11 and the RMSE lower by 0.57–1.06 g/cm compared with LCNet-field. Compared with existing studies, this work not only clarifies which spectral bands can be used to estimate maize chlorophyll, but also presents a high-performance, lightweight (fewer-input) approach for accurate maize LCC estimation that can directly support growth monitoring and nutrient management at specific growth stages, thus contributing to smart agricultural practices.
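The abstract names the selected bands and the overall pipeline (simulated-data pre-training, E2D-COS band selection, DNN regression, transfer learning to field data) but not the implementation details. The Python/PyTorch sketch below is one plausible way to wire those pieces together; the layer sizes, training schedule, head-only fine-tuning strategy, and the placeholder tensors standing in for the simulated and field datasets are assumptions for illustration, not the authors' Maize-LCNet code. Only the band lists come from the abstract.

import torch
import torch.nn as nn

# E2D-COS-selected bands (nm) reported in the abstract
BANDS_BIG_TRUMPET = [418, 453, 506, 587, 640, 688, 767]
BANDS_SPINNING = [418, 453, 541, 559, 688, 723, 767]

class LCNet(nn.Module):
    """Small fully connected regressor: selected-band reflectance -> LCC."""
    def __init__(self, n_bands=7):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(n_bands, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
        )
        self.head = nn.Linear(32, 1)  # predicted leaf chlorophyll content

    def forward(self, x):
        return self.head(self.backbone(x))

def fit(model, x, y, epochs=200, lr=1e-3):
    # Optimize only the parameters that are not frozen.
    params = [p for p in model.parameters() if p.requires_grad]
    opt = torch.optim.Adam(params, lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
    return model

# Stage 1: pre-train on simulated spectra (placeholder tensors stand in for
# radiative-transfer output sampled at the seven selected bands).
x_sim, y_sim = torch.rand(5000, 7), torch.rand(5000, 1)
model = fit(LCNet(), x_sim, y_sim)

# Stage 2: transfer learning -- freeze the backbone and fine-tune the head on
# a small field dataset (one common TL strategy; the paper may differ).
for p in model.backbone.parameters():
    p.requires_grad = False
x_field, y_field = torch.rand(120, 7), torch.rand(120, 1)
model = fit(model, x_field, y_field, epochs=100, lr=1e-4)

In practice, the placeholder tensors would be replaced by UAV-extracted canopy reflectance at the listed wavelengths and by measured LCC, with R² and RMSE on the field set used for evaluation, as in the abstract.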
Suggested Citation
Riqiang Chen & Lipeng Ren & Guijun Yang & Zhida Cheng & Dan Zhao & Chengjian Zhang & Haikuan Feng & Haitang Hu & Hao Yang, 2025.
"Estimation of Leaf Chlorophyll Content of Maize from Hyperspectral Data Using E2D-COS Feature Selection, Deep Neural Network, and Transfer Learning,"
Agriculture, MDPI, vol. 15(10), pages 1-27, May.
Handle:
RePEc:gam:jagris:v:15:y:2025:i:10:p:1072-:d:1656986
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jagris:v:15:y:2025:i:10:p:1072-:d:1656986. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.