Human–animal interactions may affect animal welfare and productivity in rearing environments. Previously proposed human–animal interaction techniques focus on the manual discrimination of single animal behaviors or of simple human–animal interactions. To address the automatic detection and classification of complex animal behaviors and of the animals' reactions to humans, we propose an approach built upon a Fisher-vector visual representation and an end-to-end generative hidden Markov model to facilitate the discrimination of both coarse- and fine-grained human–animal interactions. To satisfy the generative approach's requirement for abundant data samples, we recorded and annotated more than 480 hours of video featuring eight persons and 210 laying hens during feeding and cleaning. The experimental results show that the proposed method outperforms state-of-the-art approaches. Given its performance on practical videos, our approach can be used to monitor human–animal interactions and animal behaviors on modern poultry farms.
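The abstract does not give implementation details, so the following is only a minimal sketch of the kind of pipeline it describes: local visual descriptors are pooled into Fisher vectors per video segment, and one generative Gaussian HMM per interaction class scores the resulting temporal sequences. The descriptor extraction, class labels, and hyperparameters are hypothetical placeholders, not the authors' released code; the library calls use scikit-learn and hmmlearn.

```python
# Sketch: Fisher-vector encoding + per-class generative HMM classification.
# Assumes pre-extracted local descriptors per video segment (hypothetical input).
import numpy as np
from sklearn.mixture import GaussianMixture
from hmmlearn.hmm import GaussianHMM


def fit_gmm(descriptors, n_components=8, seed=0):
    """Fit a diagonal-covariance GMM codebook on pooled local descriptors."""
    gmm = GaussianMixture(n_components=n_components,
                          covariance_type="diag", random_state=seed)
    gmm.fit(descriptors)
    return gmm


def fisher_vector(descriptors, gmm):
    """Encode one segment's descriptors as an improved Fisher vector
    (gradients w.r.t. GMM means and variances, power- and L2-normalised)."""
    x = np.atleast_2d(descriptors)
    n = x.shape[0]
    q = gmm.predict_proba(x)                       # soft assignments, shape (n, K)
    mu, var, w = gmm.means_, gmm.covariances_, gmm.weights_
    diff = (x[:, None, :] - mu[None, :, :]) / np.sqrt(var)[None, :, :]
    g_mu = (q[:, :, None] * diff).sum(0) / (n * np.sqrt(w)[:, None])
    g_var = (q[:, :, None] * (diff ** 2 - 1)).sum(0) / (n * np.sqrt(2 * w)[:, None])
    fv = np.hstack([g_mu.ravel(), g_var.ravel()])
    fv = np.sign(fv) * np.sqrt(np.abs(fv))         # power normalisation
    return fv / (np.linalg.norm(fv) + 1e-12)       # L2 normalisation


def train_class_hmms(sequences_by_class, n_states=4, seed=0):
    """Train one generative HMM per interaction class on Fisher-vector sequences."""
    models = {}
    for label, sequences in sequences_by_class.items():
        stacked = np.vstack(sequences)             # concatenate all sequences
        lengths = [len(s) for s in sequences]      # per-sequence lengths for hmmlearn
        hmm = GaussianHMM(n_components=n_states, covariance_type="diag",
                          n_iter=50, random_state=seed)
        hmm.fit(stacked, lengths)
        models[label] = hmm
    return models


def classify(sequence, models):
    """Assign the class whose HMM gives the highest log-likelihood for the clip."""
    return max(models, key=lambda label: models[label].score(sequence))
```

In this sketch a clip is represented as a sequence of per-segment Fisher vectors, and classification reduces to choosing the class-specific HMM with the maximum log-likelihood; the paper's actual feature extraction, model structure, and training procedure may differ.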