Author
Listed:
- Yasuto Tamura
- Yuzuko Utsumi
- Yuka Miwa
- Masakazu Iwamura
- Koichi Kise
Abstract
Japanese table grapes are expensive because their production is highly labor-intensive. Berry pruning, in which berries are removed from a bunch so that it develops desirable characteristics, is an especially demanding task and is considered difficult to master, so it is desirable to support new growers with information technology that indicates which berries to cut. In this research, we aim to build a system that identifies the grape berries to be removed during pruning. Realizing this requires estimating the 3D positions of individual grape berries. Our environmental constraint is that bunches hang from trellises at a height of about 1.6 m in outdoor grape orchards. Depth sensors are hard to use under such conditions, and an omnidirectional camera with a wide field of view is preferable for the convenience of shooting videos. Obtaining 3D information on grape berries from videos is challenging because berries have textureless surfaces, highly symmetric shapes, and crowded arrangements, which makes conventional 3D reconstruction methods that rely on matching locally unique features difficult to apply. To satisfy the practical constraints of this task, we propose extending a deep learning-based unsupervised monocular depth estimation method to an omnidirectional camera. Our experiments demonstrate the effectiveness of the proposed method for estimating the 3D positions of grape berries in the wild.
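The geometric core of adapting a monocular depth network to an omnidirectional camera is replacing pinhole back-projection with a spherical one: each pixel corresponds to a viewing ray on the unit sphere, and the predicted per-pixel depth scales that ray into a 3D point. The sketch below is a minimal illustration of this step only, assuming an equirectangular image and NumPy; it is not the authors' implementation, and the projection model and network in the paper may differ.

```python
import numpy as np

def equirect_rays(height, width):
    """Unit ray directions for every pixel of an equirectangular image.

    Columns map to longitude in [-pi, pi), rows to latitude in
    [pi/2, -pi/2] (the top of the image looks up).
    """
    u = (np.arange(width) + 0.5) / width    # pixel centers in (0, 1)
    v = (np.arange(height) + 0.5) / height
    lon = (u - 0.5) * 2.0 * np.pi           # longitude
    lat = (0.5 - v) * np.pi                 # latitude
    lon, lat = np.meshgrid(lon, lat)
    # Spherical -> Cartesian unit vectors (x right, y up, z forward).
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    return np.stack([x, y, z], axis=-1)     # shape (H, W, 3)

def backproject(depth):
    """Lift a per-pixel depth map (H, W) to 3D points (H, W, 3).

    Depth is interpreted as distance along each viewing ray, the
    natural choice for a spherical camera model.
    """
    rays = equirect_rays(*depth.shape)
    return depth[..., None] * rays

# Example: a dummy depth map standing in for a network prediction,
# set to the trellis height of roughly 1.6 m mentioned in the abstract.
depth = np.full((512, 1024), 1.6)
points = backproject(depth)
print(points.shape)  # (512, 1024, 3)
```

In an unsupervised pipeline, these back-projected points would feed a photometric reprojection loss between neighboring video frames; that training loop is omitted here.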
Suggested Citation
Yasuto Tamura & Yuzuko Utsumi & Yuka Miwa & Masakazu Iwamura & Koichi Kise, 2025.
"Unsupervised monocular depth estimation with omnidirectional camera for 3D reconstruction of grape berries in the wild,"
PLOS ONE, Public Library of Science, vol. 20(2), pages 1-18, February.
Handle: RePEc:plo:pone00:0317359
DOI: 10.1371/journal.pone.0317359