IPSF: Advances in automation for poultry
New technology can improve poultry production

Editor's note: The following content is from three presentations at the 2025 International Poultry Scientific Forum that highlighted technological advances that can improve poultry production parameters.
Enhancing mortality detection in broilers
Tanner Thornton and colleagues at the University of Tennessee, USA, evaluated the performance of a rail-mounted robotic mortality identification system in a commercial broiler production house and assessed the challenges of implementing such systems in real-world environments.
This research addresses the labor-intensive process of manually identifying mortalities in broiler houses, which the robotic system aims to streamline through computer vision. The experiment followed a randomized complete block design, with enclosures strategically placed under the rail/robot, between feed and water lines, and near sidewalls as treatment blocks. Mortality detection was evaluated both within these enclosures and in open areas outside them. Data collection occurred across two production flocks, with robotic mortality detection rates compared to manual counts.
Results indicated that the robot achieved a significantly higher mortality detection rate within enclosures under the rail (57%) compared to unenclosed mortalities in the same areas (19%). The bottom camera demonstrated superior detection accuracy (42.3%) compared to side cameras (20.0%).
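To make the comparison concrete, the detection rate is simply robot-flagged mortalities divided by manually confirmed mortalities per treatment block. A minimal sketch, with hypothetical counts chosen only to reproduce the reported rates:

```python
# Hypothetical counts per treatment block; not the study's raw data.
manual_counts = {"enclosed_under_rail": 100, "open_under_rail": 100}
robot_detections = {"enclosed_under_rail": 57, "open_under_rail": 19}

# Detection rate = robot-detected mortalities / manually confirmed mortalities.
for block, manual in manual_counts.items():
    rate = robot_detections[block] / manual
    print(f"{block}: {rate:.0%} detection rate")
```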
Challenges such as occlusion by live birds, house infrastructure (feeder and drinker lines), and lighting variability limited system performance. Mortality location and enclosure use significantly influenced detection efficacy, highlighting critical barriers to the commercial viability of automated mortality identification systems. While the robotic system shows potential, adjustments to camera positioning, lighting, and software to mitigate occlusion are necessary to enhance performance under commercial conditions.
Detection of early dead embryos
Alin Khaliduzzaman and colleagues at the University of Illinois, USA, explored the application of a hyperspectral imaging system for the early detection and removal of dead chick embryos during incubation. Approximately 5-10% of embryos die due to thermal shock and inappropriate handling during the early incubation stages.
The researchers developed a non-destructive, real-time approach that combined hyperspectral imaging with discriminant analysis to classify live and dead embryos. A hyperspectral camera (400-1000 nanometers) was used to capture spatial and spectral information from incubated eggs at day four. Over 100 eggs were incubated at 99.0°F and 60% relative humidity. A stage scanning speed of 0.06 centimeters per second, a frame rate of 9.9 frames per second, and an exposure time of 100 milliseconds were maintained.
Hyperspectral imaging offers a noninvasive, high-resolution method to capture detailed spectral information from incubated hatching eggs. Hemoglobin absorbance in the 570-580 nanometer range differs between live and dead chick embryos, providing the spectral basis for classification.
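As an illustration of how that band might be isolated from a hyperspectral scan, the sketch below averages reflectance over the 570-580 nanometer channels of a data cube; the array shape and wavelength grid are assumptions, not the authors' code:

```python
import numpy as np

# Hypothetical hyperspectral cube for one egg: (rows, cols, bands),
# with bands spanning the camera's 400-1000 nm range.
cube = np.random.rand(50, 50, 224)
wavelengths = np.linspace(400, 1000, cube.shape[2])

# Mean response in the 570-580 nm hemoglobin absorbance band gives
# one candidate feature per egg for live/dead discrimination.
band = (wavelengths >= 570) & (wavelengths <= 580)
hemoglobin_feature = cube[:, :, band].mean()
print(f"Mean 570-580 nm response: {hemoglobin_feature:.3f}")
```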
Eggs were classified non-destructively with an accuracy of 90.0% using discriminant analysis. The successful implementation of this methodology could significantly enhance hatchery efficiency, reducing contamination risks, conserving space, and optimizing energy usage.
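The classification step could be sketched with a linear discriminant model such as scikit-learn's; the features and labels below are random placeholders, since the authors' exact discriminant formulation is not reported here:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Placeholder per-egg spectral features (e.g., band means) and labels:
# 1 = live embryo, 0 = dead embryo.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))    # 100 eggs, 10 spectral features
y = rng.integers(0, 2, size=100)  # random stand-in labels

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.1%}")
```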
Machine learning for bodyweight estimation
Mireia Molins and colleagues at Pennsylvania State University, USA, noted that optimal body growth in turkeys is crucial for achieving the objectives of the poultry industry: minimizing bird losses, maintaining flock uniformity, and increasing meat yield with high feed efficiency. However, monitoring growth requires frequent body weight measurements, which are costly, labor-intensive, and time-consuming.
Computer vision and artificial intelligence (AI) have emerged as powerful tools for predicting individual bodyweight in poultry. A longitudinal observational study was carried out to evaluate whether computer vision could be used to estimate bodyweight in turkey hens under realistic commercial settings.
Color and depth images of 25 hens housed in a single pen were captured using a depth camera with a top-down view installed on the ceiling. This setup enables consistent measurements without obstruction caused by animals walking in front of each other and is robust to the orientation of the animals with respect to the camera. Turkeys were followed from 40 to 106 days of age.
Manual measurements of chest width, chest length, back length, and bodyweight were recorded three times a week. Correlations of 0.98, 0.97, 0.94 and 0.95 were found between bodyweight and age, chest width, back length and chest length, respectively. Since back length correlates at 0.97 and 0.95 with chest width and chest length, the team evaluated the performance of an AI-based regression model, Gradient Boosting Trees, in predicting bodyweight using only age and back length as input features, as sketched below.
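A minimal sketch of that regression setup, using scikit-learn's gradient boosting implementation on synthetic stand-in data (the authors' model configuration and dataset are not reported here):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: age (days) and back length (cm) as the only
# input features, bodyweight (kg) as the target.
rng = np.random.default_rng(42)
age = rng.uniform(40, 106, size=500)
back_length = 0.25 * age + rng.normal(0, 1.0, size=500)
bodyweight = 0.12 * age + 0.08 * back_length + rng.normal(0, 0.3, size=500)

X = np.column_stack([age, back_length])
X_train, X_test, y_train, y_test = train_test_split(X, bodyweight, random_state=0)

model = GradientBoostingRegressor().fit(X_train, y_train)
predictions = model.predict(X_test)
```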
Validation metrics were calculated, showing a root mean squared error percentage of 9.47% ± 1.19%, a mean absolute percentage error of 7.48% ± 1.09%, and an R-squared of 0.96 ± 0.01. Thanks to the use of depth cameras, three-dimensional features across the animal's back can be extracted, providing even more detail than back length alone. Preliminary results on the automatic detection of birds from images using computer vision show a precision of 0.90 and an F1-score of 0.84.
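Continuing the sketch above, the three reported validation metrics could be computed as follows; expressing root mean squared error as a percentage of the mean observed bodyweight is an assumption about the authors' normalization:

```python
import numpy as np
from sklearn.metrics import (mean_absolute_percentage_error,
                             mean_squared_error, r2_score)

# y_test and predictions come from the regression sketch above.
rmse_pct = np.sqrt(mean_squared_error(y_test, predictions)) / y_test.mean() * 100
mape_pct = mean_absolute_percentage_error(y_test, predictions) * 100
r2 = r2_score(y_test, predictions)
print(f"RMSE%: {rmse_pct:.2f}  MAPE%: {mape_pct:.2f}  R^2: {r2:.2f}")
```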