
Deep Learning Application for Crop Classification via Multi-Temporal Remote Sensing Images

Authors

Listed:
  • Qianjing Li

    (International Institute for Earth System Science, Nanjing University, Nanjing 210023, China
    Jiangsu Provincial Key Laboratory of Geographic Information Science and Technology, Nanjing University, Nanjing 210023, China)

  • Jia Tian

    (International Institute for Earth System Science, Nanjing University, Nanjing 210023, China
    School of Instrumentation and Optoelectronic Engineering, Beihang University, Beijing 100191, China)

  • Qingjiu Tian

    (International Institute for Earth System Science, Nanjing University, Nanjing 210023, China
    Jiangsu Provincial Key Laboratory of Geographic Information Science and Technology, Nanjing University, Nanjing 210023, China)

Abstract

The combination of multi-temporal images and deep learning is an efficient way to obtain accurate crop distributions and has therefore drawn increasing attention. However, few studies have compared deep learning models with different architectures, so it remains unclear how a deep learning model should be selected for multi-temporal crop classification and what the best possible accuracy is. To address this issue, the present work compares and analyzes crop classification based on deep learning models and different time-series data to explore the possibility of improving crop classification accuracy. Using multi-temporal Sentinel-2 images as source data, time-series classification datasets are constructed based on vegetation indices (VIs) and spectral stacking, respectively, following which we compare and evaluate crop classification based on these time-series datasets and five deep learning architectures: (1) one-dimensional convolutional neural networks (1D-CNNs), (2) long short-term memory (LSTM), (3) two-dimensional CNNs (2D-CNNs), (4) three-dimensional CNNs (3D-CNNs), and (5) two-dimensional convolutional LSTM (ConvLSTM2D). The results show that the accuracy of both 1D-CNN (92.5%) and LSTM (93.25%) is higher than that of random forest (~91%) when using a single temporal feature as input. The 2D-CNN model integrates temporal and spatial information and is slightly more accurate (94.76%), but fails to fully utilize the multi-spectral features. The accuracy of the 1D-CNN and LSTM models integrating temporal and multi-spectral features is 96.94% and 96.84%, respectively; however, neither model can extract spatial information. The accuracy of the 3D-CNN and ConvLSTM2D models is 97.43% and 97.25%, respectively. The experimental results show limited accuracy for crop classification based on single temporal features, whereas combining temporal features with multi-spectral or spatial information significantly improves classification accuracy. The 3D-CNN and ConvLSTM2D models are thus the best deep learning architectures for multi-temporal crop classification. However, the ConvLSTM architecture, which combines recurrent neural networks and CNNs, should be further developed for multi-temporal image crop classification.
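
For context on the spatio-temporal architectures compared above, the following is a minimal, illustrative sketch of a ConvLSTM2D crop classifier in TensorFlow/Keras. It is not the authors' implementation, and the patch size, number of time steps, band count, and number of crop classes are placeholder assumptions rather than values from the paper.

    import numpy as np
    from tensorflow.keras import layers, models

    # Assumed input dimensions: time steps, patch height/width, Sentinel-2 bands.
    T, H, W, B = 10, 9, 9, 10
    NUM_CLASSES = 6  # number of crop classes (assumed)

    model = models.Sequential([
        layers.Input(shape=(T, H, W, B)),              # one multi-temporal image patch
        layers.ConvLSTM2D(32, kernel_size=3, padding="same",
                          return_sequences=False),     # joint spatio-temporal feature extraction
        layers.BatchNormalization(),
        layers.GlobalAveragePooling2D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # Random stand-in data with the assumed shapes; real inputs would be
    # Sentinel-2 patch time series with crop-type labels.
    x = np.random.rand(32, T, H, W, B).astype("float32")
    y = np.random.randint(0, NUM_CLASSES, size=32)
    model.fit(x, y, epochs=1, batch_size=8, verbose=0)

Replacing the ConvLSTM2D layer with a Conv3D layer over the (time, row, column) dimensions would give a 3D-CNN variant, while flattening each pixel's time-band series into a one-dimensional sequence corresponds to the kind of input described for the 1D-CNN and LSTM models in the abstract.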

Suggested Citation

  • Qianjing Li & Jia Tian & Qingjiu Tian, 2023. "Deep Learning Application for Crop Classification via Multi-Temporal Remote Sensing Images," Agriculture, MDPI, vol. 13(4), pages 1-19, April.
  • Handle: RePEc:gam:jagris:v:13:y:2023:i:4:p:906-:d:1128511

    Download full text from publisher

    File URL: https://www.mdpi.com/2077-0472/13/4/906/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2077-0472/13/4/906/
    Download Restriction: no

    Citations

Citations are extracted by the CitEc Project.


    Cited by:

    1. Jibo Yue & Chengquan Zhou & Haikuan Feng & Yanjun Yang & Ning Zhang, 2023. "Novel Applications of Optical Sensors and Machine Learning in Agricultural Monitoring," Agriculture, MDPI, vol. 13(10), pages 1-4, October.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jagris:v:13:y:2023:i:4:p:906-:d:1128511. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

We have no bibliographic references for this item. You can help add them by using this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.