Author
Listed:
- Yunxiao Jiang
(College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China)
- Elsayed M. Atwa
(College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
Agricultural Engineering Research Institute, Agricultural Research Center, Giza 12618, Egypt)
- Pengguang He
(College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China)
- Jinhui Zhang
(College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China)
- Mengzui Di
(College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China)
- Jinming Pan
(College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
State Key Laboratory of Agricultural Equipment Technology, Beijing 100083, China)
- Hongjian Lin
(College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
Zhejiang Key Laboratory of Intelligent Sensing and Robotics for Agriculture, 866 Yuhangtang Road, Hangzhou 310058, China)
Abstract
Egg weight monitoring provides critical data for calculating the feed-to-egg ratio and for improving poultry farming efficiency. Installing a computer vision monitoring system in egg collection systems enables efficient, low-cost automated egg weight measurement. However, its accuracy is compromised by egg clustering during transportation and by low-contrast edges, which limits the widespread adoption of such methods. To address this, we propose an egg weight measurement method based on computer vision with a multi-feature extraction and regression approach. The proposed pipeline integrates two artificial neural networks: Central differential-EfficientViT YOLO (CEV-YOLO) and the Egg Weight Measurement Network (EWM-Net). CEV-YOLO is an enhanced version of YOLOv11, incorporating central differential convolution (CDC) and the efficient Vision Transformer (EfficientViT), enabling accurate pixel-level egg segmentation in the presence of occlusions and low-contrast edges. EWM-Net is a custom-designed neural network that uses the segmented egg masks to perform advanced feature extraction and precise weight estimation. Experimental results show that CEV-YOLO outperforms other YOLO-based models in egg segmentation, with a precision of 98.9%, a recall of 97.5%, and an Average Precision (AP) at an Intersection over Union (IoU) threshold of 0.9 (AP90) of 89.8%. EWM-Net achieves a mean absolute error (MAE) of 0.88 g and an R² of 0.926 in egg weight measurement, outperforming six mainstream regression models. This study provides a practical and automated solution for precise egg weight measurement in real production scenarios, which is expected to improve the accuracy and efficiency of feed-to-egg ratio measurement in laying hen farms.
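The abstract reports performance with two standard regression metrics, MAE and R². As a minimal sketch of how those figures are computed, the following pure-Python functions evaluate a toy set of true versus predicted egg weights; the weight values are illustrative and are not data from the paper.

```python
# Hedged sketch: the MAE and R^2 metrics named in the abstract,
# applied to illustrative (not actual) egg weights in grams.

def mae(y_true, y_pred):
    """Mean absolute error: average of |y_i - yhat_i|."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_t = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Hypothetical ground-truth and predicted weights (grams).
true_w = [58.2, 61.5, 55.9, 63.1, 60.4]
pred_w = [57.5, 62.3, 55.1, 63.9, 59.6]

print(round(mae(true_w, pred_w), 2))  # → 0.78
print(round(r2(true_w, pred_w), 3))   # → 0.904
```

An MAE of 0.88 g, as reported for EWM-Net, means predictions deviate from the scale weight by less than a gram on average; R² = 0.926 means the model explains about 93% of the variance in egg weight.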
Suggested Citation
Yunxiao Jiang & Elsayed M. Atwa & Pengguang He & Jinhui Zhang & Mengzui Di & Jinming Pan & Hongjian Lin, 2025.
"Computer Vision-Based Multi-Feature Extraction and Regression for Precise Egg Weight Measurement in Laying Hen Farms,"
Agriculture, MDPI, vol. 15(19), pages 1-25, September.
Handle:
RePEc:gam:jagris:v:15:y:2025:i:19:p:2035-:d:1760368
Download full text from publisher
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jagris:v:15:y:2025:i:19:p:2035-:d:1760368. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.