Objective Animal feeding behavior is an essential indicator of animal welfare. This study addresses the poor recognition accuracy and insufficient feature extraction encountered in cow feeding behavior recognition under complex farming environments, with the goal of monitoring cow feeding behavior automatically.
Method This paper proposed a recognition method based on an improved BCE-YOLO model. Three enhancement modules, BiFormer, CoT, and EMA, were added to strengthen the feature extraction capability of the YOLOv8 model. The improved detector was then combined with the Deep SORT algorithm, which outperforms the Staple and SiameseRPN trackers, to track the head trajectories of cows during feeding. A total of 11,288 images were extracted from overhead and frontal videos of cows during feeding and divided into training and test sets at a ratio of 6:1 to form the feeding dataset.
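To make the detection-plus-tracking pipeline concrete, the following is a minimal sketch, not the authors' implementation: it assumes the open-source ultralytics (YOLOv8) and deep-sort-realtime packages as stand-ins for the BCE-YOLO detector and Deep SORT tracker, and the weight file and video path are hypothetical placeholders.

```python
# Sketch of a per-frame detect-then-track loop, assuming the "ultralytics" and
# "deep-sort-realtime" packages. "bce_yolo_head.pt" and "feeding_overhead.mp4"
# are hypothetical placeholders, not files from the paper.
import cv2
from ultralytics import YOLO
from deep_sort_realtime.deepsort_tracker import DeepSort

detector = YOLO("bce_yolo_head.pt")   # improved detector weights (placeholder)
tracker = DeepSort(max_age=30)        # Deep SORT with its default appearance model

cap = cv2.VideoCapture("feeding_overhead.mp4")  # overhead feeding video (placeholder)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    # Detect cow heads in the current frame.
    result = detector(frame, verbose=False)[0]
    detections = []
    for box in result.boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        conf = float(box.conf[0])
        cls = int(box.cls[0])
        # Deep SORT expects ([left, top, width, height], confidence, class).
        detections.append(([x1, y1, x2 - x1, y2 - y1], conf, cls))

    # Associate detections with existing tracks; persistent track IDs are what
    # suppress identity switches while a cow feeds.
    tracks = tracker.update_tracks(detections, frame=frame)
    for track in tracks:
        if not track.is_confirmed():
            continue
        l, t, r, b = track.to_ltrb()
        cv2.rectangle(frame, (int(l), int(t)), (int(r), int(b)), (0, 255, 0), 2)
        cv2.putText(frame, f"cow {track.track_id}", (int(l), int(t) - 6),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)

cap.release()
```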
Result The improved BCE-YOLO model achieved precisions of 77.73% and 76.32% on the frontal and overhead datasets, respectively, with recall rates of 82.57% and 86.33% and mean average precision values of 83.70% and 76.81%. Compared with the YOLOv8 model, the overall performance of the proposed model improved by six to eight percentage points, and Deep SORT showed an improvement of one to four percentage points in comprehensive performance over the Staple and SiameseRPN algorithms. The combination of the improved BCE-YOLO model and the Deep SORT tracking algorithm accurately tracked cow feeding behavior and effectively suppressed cow identity (ID) switches.
Conclusion The proposed method effectively addresses the poor recognition accuracy and insufficient feature extraction of cow feeding behavior recognition under complex farming environments, and it provides an important reference for intelligent animal husbandry and precision farming.