
Deep neural networks based automated extraction of dugong feeding trails from UAV images in the intertidal seagrass beds

Author

Listed:
  • Chiaki Yamato
  • Kotaro Ichikawa
  • Nobuaki Arai
  • Kotaro Tanaka
  • Takahiro Nishiyama
  • Kongkiat Kittiwattanawong

Abstract

Dugongs (Dugong dugon) are seagrass specialists distributed in shallow coastal waters of tropical and subtropical seas. The area and distribution of the dugongs' feeding trails, which are unvegetated winding tracks left after feeding, have been used as an indicator of their feeding ground utilization. However, current ground-based measurements of these trails require a large amount of time and effort. Here, we developed effective methods to observe dugongs' feeding trails from unmanned aerial vehicle (UAV) images: (1) extraction of the feeding trails using deep neural networks. We further demonstrated two applications: (2) extraction of daily new feeding trails with deep neural networks and (3) estimation of the direction of the feeding trails. We obtained aerial photographs from the intertidal seagrass bed at Talibong Island, Trang Province, Thailand. The F1 score, a measure of a binary classification model's accuracy that takes both false positives and false negatives into account, was 89.5% and 87.7% for method (1) on images with ground sampling resolutions of 1 cm/pixel and 0.5 cm/pixel, respectively, while the F1 score for method (2) was 61.9%. The F1 score for method (1) was high enough to support scientific studies of dugongs. However, method (2) still requires improvement and manual correction. The mean area of the extracted daily new feeding trails from September 12–27, 2019, was 187.8 m² per day (n = 9). In total, 63.9% of the feeding trails were estimated to have directions within the range of 112.5° to 157.5°. These proposed new methods will reduce the time and effort required for future feeding trail observations and contribute to future assessments of the dugongs' seagrass habitat use.
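For reference, the F1 score reported above follows the standard definition (this formula is not quoted from the article; it is the usual harmonic mean of precision and recall for binary classification):

\[
\mathrm{F1} = 2 \cdot \frac{\mathrm{precision} \cdot \mathrm{recall}}{\mathrm{precision} + \mathrm{recall}},
\qquad
\mathrm{precision} = \frac{TP}{TP + FP},
\quad
\mathrm{recall} = \frac{TP}{TP + FN}
\]

where TP, FP, and FN are the counts of true positives, false positives, and false negatives, so the score penalizes both spurious trail detections and missed trails.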

Suggested Citation

  • Chiaki Yamato & Kotaro Ichikawa & Nobuaki Arai & Kotaro Tanaka & Takahiro Nishiyama & Kongkiat Kittiwattanawong, 2021. "Deep neural networks based automated extraction of dugong feeding trails from UAV images in the intertidal seagrass beds," PLOS ONE, Public Library of Science, vol. 16(8), pages 1-24, August.
  • Handle: RePEc:plo:pone00:0255586
    DOI: 10.1371/journal.pone.0255586

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0255586
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0255586&type=printable
    Download Restriction: no

