Author
Listed:
- Qiuyue Yang
(School of Mechanical Engineering, Jiangsu University, Zhenjiang 212013, China)
- Jinan Gu
(School of Mechanical Engineering, Jiangsu University, Zhenjiang 212013, China)
- Tao Xiong
(School of Mechanical Engineering, Jiangsu University, Zhenjiang 212013, China)
- Qihang Wang
(School of Mechanical Engineering, Jiangsu University, Zhenjiang 212013, China)
- Juan Huang
(School of Mechanical Engineering, Jiangsu University, Zhenjiang 212013, China)
- Yidan Xi
(School of Mechanical Engineering, Jiangsu University, Zhenjiang 212013, China)
- Zhongkai Shen
(School of Mechanical Engineering, Jiangsu University, Zhenjiang 212013, China)
Abstract
Accurate detection of tea shoots in natural environments is crucial for facilitating intelligent tea picking, field management, and automated harvesting. However, the detection performance of existing methods in complex scenes remains limited by factors such as the small size, high density, and severe overlap of tea shoots, as well as their similarity in color to the background. To address this, this paper proposes an improved target detection algorithm, RFA-YOLOv8, based on YOLOv8, which aims to enhance the detection accuracy and robustness of tea shoots in natural environments. First, a self-constructed dataset containing images of tea shoots under various lighting conditions is created for model training and evaluation. Second, the multi-scale feature extraction capability of the model is enhanced by introducing RFCAConv along with the optimized SPPFCSPC module, while the spatial perception ability is improved by integrating the RFAConv module. Finally, the EIoU loss function is employed instead of CIoU to improve the accuracy of bounding-box localization. The experimental results demonstrate that the improved model achieves 84.1% and 58.7% in mAP@0.5 and mAP@0.5:0.95, respectively, representing increases of 3.6% and 5.5% over the original YOLOv8. Robustness is further evaluated under strong, moderate, and dim lighting conditions; in dim lighting, mAP@0.5 and mAP@0.5:0.95 improve by 6.3% and 7.1%, respectively. The findings of this research provide an effective solution for the high-precision detection of tea shoots in complex lighting environments and offer theoretical and technical support for the development of smart tea gardens and automated picking.
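The EIoU loss mentioned in the abstract augments the IoU term with a center-distance penalty (as in DIoU/CIoU) plus separate width and height penalties, each normalized by the smallest enclosing box. The sketch below is an illustrative plain-Python implementation of the published EIoU formulation for corner-format boxes, not the authors' code:

```python
# Hedged sketch of the EIoU bounding-box loss for two boxes in
# (x1, y1, x2, y2) corner format. eps guards against division by zero.

def eiou_loss(box_a, box_b, eps=1e-9):
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b

    # Intersection and union -> IoU
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    iou = inter / (area_a + area_b - inter + eps)

    # Smallest enclosing box dimensions
    cw = max(ax2, bx2) - min(ax1, bx1)
    ch = max(ay2, by2) - min(ay1, by1)

    # Center-distance penalty, normalized by the enclosing-box diagonal
    dx = (ax1 + ax2 - bx1 - bx2) / 2.0
    dy = (ay1 + ay2 - by1 - by2) / 2.0
    dist = (dx * dx + dy * dy) / (cw * cw + ch * ch + eps)

    # EIoU replaces CIoU's aspect-ratio term with direct width and
    # height penalties, each normalized by the enclosing box
    dw = (ax2 - ax1) - (bx2 - bx1)
    dh = (ay2 - ay1) - (by2 - by1)
    wh = dw * dw / (cw * cw + eps) + dh * dh / (ch * ch + eps)

    return 1.0 - iou + dist + wh
```

For identical boxes the loss is near zero; for disjoint boxes the IoU term alone contributes 1 and the distance term pushes the loss higher, which is what gives EIoU a useful gradient even without overlap.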
Suggested Citation
Qiuyue Yang & Jinan Gu & Tao Xiong & Qihang Wang & Juan Huang & Yidan Xi & Zhongkai Shen, 2025.
"RFA-YOLOv8: A Robust Tea Bud Detection Model with Adaptive Illumination Enhancement for Complex Orchard Environments,"
Agriculture, MDPI, vol. 15(18), pages 1-23, September.
Handle:
RePEc:gam:jagris:v:15:y:2025:i:18:p:1982-:d:1753790