Simple Summary: Pain assessment in animals depends on the observer's ability to locate and quantify the pain based on perceptible behavioral and physiological patterns. It is well established in veterinary medicine that pain triggers behavioral changes in animals, and monitoring these changes is important for assessing pain and evaluating an animal's welfare state. Several recent studies have evaluated pain in horses based on their facial expressions. However, no study has measured the level of pain, determined whether acute and chronic pain can be distinguished, or examined whether other conditions can be mistaken for pain; studies on pain identification that account for the various facial expressions that can be misjudged are therefore lacking. In this study, a horse facial expression recognition model was developed to analyze these expressions automatically using deep learning. We captured not only pain expressions but also comfort, tension, and excitation as images, classifying them into four labels: resting horses (RH), horses with pain (HP), horses immediately after exercise (HE), and horseshoeing horses (HH). By classifying horses' expressions and analyzing pain alongside other expressions, this study provides more objective indicators for animal welfare.

Abstract: This study aimed to demonstrate that deep learning can be used effectively to identify various equine facial expressions as welfare indicators. A total of 749 horses (586 healthy and 163 experiencing pain) were investigated, and a model was developed that recognizes facial expressions from images and classifies them into four categories: resting horses (RH), horses with pain (HP), horses immediately after exercise (HE), and horseshoeing horses (HH). The normalization of equine facial posture revealed that the profile view (99.45%) yielded higher accuracy than the front view (97.59%). The eyes–nose–ears detection model achieved an accuracy of 98.75% in training, 81.44% in validation, and 88.1% in testing, with an average accuracy of 89.43%. Overall, the average classification accuracy was high; however, the accuracy of pain classification was low. These results imply that horses may show a variety of facial expressions in addition to pain, depending on the situation and on the degree and type of pain experienced. Furthermore, automatic pain and stress recognition would greatly enhance the identification of pain and other emotional states, thereby improving the quality of equine welfare.
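The abstract does not specify the network architecture used. As a rough illustration of the kind of pipeline it describes (image-based classification into the four labels RH, HP, HE, HH), the following is a minimal sketch using transfer learning in PyTorch. The backbone choice (ResNet-18), hyperparameters, and data directory layout are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' implementation): a 4-class equine facial
# expression classifier built by fine-tuning a pretrained CNN. Class labels
# follow the abstract (RH, HP, HE, HH); backbone, hyperparameters, and data
# layout are assumptions.
import torch
import torch.nn as nn
from torchvision import models, transforms, datasets
from torch.utils.data import DataLoader

CLASSES = ["RH", "HP", "HE", "HH"]  # resting, pain, after exercise, horseshoeing

# Standard ImageNet-style preprocessing; the paper's actual pipeline may differ.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def build_model(num_classes: int = len(CLASSES)) -> nn.Module:
    """Fine-tune a pretrained ResNet-18 by replacing its final layer."""
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model

def train(data_dir: str, epochs: int = 10, lr: float = 1e-4) -> nn.Module:
    # Assumes images are stored as data_dir/<label>/<image>.jpg (hypothetical layout).
    dataset = datasets.ImageFolder(data_dir, transform=preprocess)
    loader = DataLoader(dataset, batch_size=32, shuffle=True)
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = build_model().to(device)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
    return model
```

A separate detector for the eyes, nose, and ears (as reported in the abstract) would typically be trained as its own localization model and is not shown here.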
               