Authors
Listed:
- Hengtai Li
(College of Artificial Intelligence, Nanjing Agricultural University, Nanjing 210031, China)
- Hongfei Chen
(College of Artificial Intelligence, Nanjing Agricultural University, Nanjing 210031, China)
- Jinlin Liu
(College of Artificial Intelligence, Nanjing Agricultural University, Nanjing 210031, China)
- Qiuhong Zhang
(College of Artificial Intelligence, Nanjing Agricultural University, Nanjing 210031, China)
- Tao Liu
(College of Artificial Intelligence, Nanjing Agricultural University, Nanjing 210031, China)
- Xinyu Zhang
(College of Artificial Intelligence, Nanjing Agricultural University, Nanjing 210031, China)
- Yuhua Li
(College of Artificial Intelligence, Nanjing Agricultural University, Nanjing 210031, China)
- Yan Qian
(College of Artificial Intelligence, Nanjing Agricultural University, Nanjing 210031, China)
- Xiuguo Zou
(College of Artificial Intelligence, Nanjing Agricultural University, Nanjing 210031, China)
Abstract
With the increasing urgency of digital transformation in large-scale caged layer farms, traditional methods for monitoring the environment and chicken health, which often rely on human experience, suffer from low efficiency and poor real-time performance. In this study, we focused on caged layer chickens and proposed an improved abnormal-beak detection model based on the You Only Look Once v8 (YOLOv8) framework. Data collection was conducted using an inspection robot, improving automation and consistency. To address the interference caused by chicken cages, an Efficient Multi-Scale Attention (EMA) mechanism was integrated into the Spatial Pyramid Pooling-Fast (SPPF) module of the backbone network, significantly improving the model’s ability to capture fine-grained beak features. Additionally, the standard convolutional blocks in the neck of the original model were replaced with Grouped Shuffle Convolution (GSConv) modules, effectively reducing information loss during feature extraction. The model was deployed on edge computing devices for real-time detection of abnormal beak features in layer chickens. Beyond local detection, a digital twin remote monitoring system was developed, combining three-dimensional (3D) modeling, the Internet of Things (IoT), and cloud-edge collaboration to create a dynamic, real-time mapping of physical layer farms to their virtual counterparts. This approach not only improves the extraction of subtle features but also mitigates the occlusion problems commonly encountered in small-target detection. Experimental results show that the improved model achieved a detection accuracy of 92.7% and, in terms of mean average precision (mAP), surpassed the baseline YOLOv8 model and YOLOv5 by 2.4% and 3.2%, respectively. The digital twin system also proved stable in real-world scenarios, effectively mapping physical conditions to the virtual environment. Overall, this study integrates deep learning and digital twin technology into a smart farming system, presenting a novel solution for the digital transformation of poultry farming.
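Note on the architectural changes (illustrative only): the abstract describes two modifications to YOLOv8, namely inserting Efficient Multi-Scale Attention (EMA) into the backbone's SPPF block and replacing standard convolutions in the neck with GSConv. The PyTorch sketch below shows one plausible way such modules fit together; the exact channel widths, the placement of EMA inside SPPF, and the module names (Conv, EMA, SPPF_EMA, GSConv) are assumptions for illustration and are not taken from the paper's code.

    # Minimal sketch of an SPPF block with EMA attention and a GSConv block.
    # Placement of EMA and all hyperparameters are assumptions, not the paper's code.
    import torch
    import torch.nn as nn


    class Conv(nn.Module):
        """YOLO-style block: Conv2d + BatchNorm + SiLU."""
        def __init__(self, c1, c2, k=1, s=1, g=1):
            super().__init__()
            self.conv = nn.Conv2d(c1, c2, k, s, padding=k // 2, groups=g, bias=False)
            self.bn = nn.BatchNorm2d(c2)
            self.act = nn.SiLU()

        def forward(self, x):
            return self.act(self.bn(self.conv(x)))


    class EMA(nn.Module):
        """Efficient Multi-Scale Attention over grouped channels (reference-style formulation)."""
        def __init__(self, channels, factor=8):
            super().__init__()
            self.groups = factor
            assert channels % self.groups == 0
            self.softmax = nn.Softmax(dim=-1)
            self.agp = nn.AdaptiveAvgPool2d((1, 1))
            self.pool_h = nn.AdaptiveAvgPool2d((None, 1))   # pool along width
            self.pool_w = nn.AdaptiveAvgPool2d((1, None))   # pool along height
            self.gn = nn.GroupNorm(channels // self.groups, channels // self.groups)
            self.conv1x1 = nn.Conv2d(channels // self.groups, channels // self.groups, 1)
            self.conv3x3 = nn.Conv2d(channels // self.groups, channels // self.groups, 3, padding=1)

        def forward(self, x):
            b, c, h, w = x.size()
            g = x.reshape(b * self.groups, -1, h, w)
            x_h = self.pool_h(g)                                 # (b*g, c/g, h, 1)
            x_w = self.pool_w(g).permute(0, 1, 3, 2)             # (b*g, c/g, w, 1)
            hw = self.conv1x1(torch.cat([x_h, x_w], dim=2))
            x_h, x_w = torch.split(hw, [h, w], dim=2)
            x1 = self.gn(g * x_h.sigmoid() * x_w.permute(0, 1, 3, 2).sigmoid())
            x2 = self.conv3x3(g)
            # cross-spatial aggregation between the 1x1 and 3x3 branches
            a1 = self.softmax(self.agp(x1).reshape(b * self.groups, -1, 1).permute(0, 2, 1))
            a2 = self.softmax(self.agp(x2).reshape(b * self.groups, -1, 1).permute(0, 2, 1))
            weights = (a1 @ x2.reshape(b * self.groups, c // self.groups, -1) +
                       a2 @ x1.reshape(b * self.groups, c // self.groups, -1))
            weights = weights.reshape(b * self.groups, 1, h, w).sigmoid()
            return (g * weights).reshape(b, c, h, w)


    class SPPF_EMA(nn.Module):
        """SPPF with EMA applied to the pooled, concatenated features (placement assumed)."""
        def __init__(self, c1, c2, k=5):
            super().__init__()
            c_ = c1 // 2
            self.cv1 = Conv(c1, c_, 1, 1)
            self.m = nn.MaxPool2d(kernel_size=k, stride=1, padding=k // 2)
            self.attn = EMA(c_ * 4)
            self.cv2 = Conv(c_ * 4, c2, 1, 1)

        def forward(self, x):
            x = self.cv1(x)
            y1 = self.m(x)
            y2 = self.m(y1)
            y3 = self.m(y2)
            return self.cv2(self.attn(torch.cat((x, y1, y2, y3), dim=1)))


    class GSConv(nn.Module):
        """GSConv: half standard conv + half depthwise conv, then channel shuffle."""
        def __init__(self, c1, c2, k=1, s=1):
            super().__init__()
            c_ = c2 // 2
            self.cv1 = Conv(c1, c_, k, s)          # dense (standard) convolution
            self.cv2 = Conv(c_, c_, 5, 1, g=c_)    # cheap depthwise convolution

        def forward(self, x):
            x1 = self.cv1(x)
            y = torch.cat((x1, self.cv2(x1)), dim=1)
            # shuffle channels across the two halves (groups = 2)
            b, c, h, w = y.shape
            return y.view(b, 2, c // 2, h, w).transpose(1, 2).reshape(b, c, h, w)


    if __name__ == "__main__":
        feat = torch.randn(1, 256, 20, 20)               # dummy backbone feature map
        print(SPPF_EMA(256, 256)(feat).shape)            # torch.Size([1, 256, 20, 20])
        print(GSConv(256, 128, k=3, s=1)(feat).shape)    # torch.Size([1, 128, 20, 20])

The dummy 256-channel, 20x20 feature map in the usage lines is purely a hypothetical input for shape-checking; the actual feature dimensions depend on the YOLOv8 variant and input resolution used in the paper.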
Suggested Citation
Hengtai Li & Hongfei Chen & Jinlin Liu & Qiuhong Zhang & Tao Liu & Xinyu Zhang & Yuhua Li & Yan Qian & Xiuguo Zou, 2025.
"Deep Learning-Based Detection and Digital Twin Implementation of Beak Deformities in Caged Layer Chickens,"
Agriculture, MDPI, vol. 15(11), pages 1-21, May.
Handle:
RePEc:gam:jagris:v:15:y:2025:i:11:p:1170-:d:1667596