11 research outputs found

    Estimating Sheep Pain Level Using Facial Action Unit Detection

    Assessing pain levels in animals is a crucial but time-consuming process in maintaining their welfare. Facial expressions in sheep are an efficient and reliable indicator of pain levels. In this paper, we have extended techniques for recognising human facial expressions to encompass facial action units in sheep, which can then facilitate automatic estimation of pain levels. Our multi-level approach starts with detection of sheep faces, followed by localisation of facial landmarks, normalisation, and extraction of facial features. These are described using Histograms of Oriented Gradients and classified using Support Vector Machines. Our experiments show an overall accuracy of 67% on sheep Action Unit classification. We argue that, with more data, our approach to automated pain-level assessment can be generalised to other animals.
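The feature-extraction and classification stages of a pipeline like this can be sketched in a few lines. This is a toy illustration on fabricated images, not the authors' code: the minimal HOG implementation (no block normalisation) and the synthetic "action unit present/absent" data below are assumptions for demonstration only.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def hog_features(img, cell=8, bins=9):
    """Minimal Histogram of Oriented Gradients: per-cell orientation
    histograms weighted by gradient magnitude (block normalisation
    omitted for brevity)."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180  # unsigned orientations
    feats = []
    for i in range(0, img.shape[0] - cell + 1, cell):
        for j in range(0, img.shape[1] - cell + 1, cell):
            m = mag[i:i + cell, j:j + cell].ravel()
            a = ang[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            feats.append(hist)
    f = np.concatenate(feats)
    return f / (np.linalg.norm(f) + 1e-6)

# Fabricated stand-ins for normalised face crops: the "AU present" class
# contains a horizontal edge, the "AU absent" class is noise only.
rng = np.random.default_rng(0)
X, y = [], []
for label in (0, 1):
    for _ in range(40):
        img = rng.normal(0.0, 0.05, (32, 32))
        if label:
            img[16, :] += 1.0  # synthetic facial "edge" for the present class
        X.append(hog_features(img))
        y.append(label)
X, y = np.array(X), np.array(y)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25,
                                      random_state=0, stratify=y)
clf = SVC(kernel="linear").fit(Xtr, ytr)
acc = clf.score(Xte, yte)
print(f"toy action-unit accuracy: {acc:.2f}")
```

On real sheep images the earlier stages (face detection, landmark localisation, normalisation) would supply the crops; here they are skipped entirely.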

    Development of an Automated Pain Facial Expression Detection System for Sheep (Ovis Aries).

    The use of technology to optimize the production and management of each individual animal is becoming key to good farming. There is a need for the real-time, systematic detection and control of disease in animals in order to limit the impact on animal welfare and food supply. Diseases such as footrot and mastitis cause significant pain in sheep, so early detection is vital to ensuring effective treatment and preventing spread across the flock. Facial expression scoring to assess pain in humans and non-humans is now well utilized, and the Sheep Pain Facial Expression Scale (SPFES) is a tool that can reliably detect pain in this species. The SPFES currently requires manual scoring, leaving it open to observer bias, and it is also time-consuming. The ability of a computer to automatically detect pain and direct a producer to where assessment and treatment are needed would increase the chances of controlling the spread of disease. It would also aid in the prevention of resistance across the individual, farm, and landscape at both national and international levels. In this paper, we present our framework for an integrated novel system, based on techniques originally applied to human facial expression recognition, that could be implemented at the farm level. To the authors' knowledge, this is the first time that this technology has been applied to sheep to assess pain.

    Dynamics are Important for the Recognition of Equine Pain in Video

    A prerequisite to successfully alleviating pain in animals is recognizing it, which is a great challenge in non-verbal species. Furthermore, prey animals such as horses tend to hide their pain. In this study, we propose a deep recurrent two-stream architecture for the task of distinguishing pain from non-pain in videos of horses. Different models are evaluated on a unique dataset showing horses under controlled trials with moderate pain induction, which has been presented in earlier work. Sequential models are experimentally compared to single-frame models, showing the importance of the temporal dimension of the data, and are benchmarked against a veterinary expert classification of the data. We additionally perform baseline comparisons with generalized versions of state-of-the-art human pain recognition methods. While equine pain detection in machine learning is a novel field, our results surpass veterinary expert performance and outperform pain detection results reported for other larger non-human species. Comment: CVPR 2019, IEEE Conference on Computer Vision and Pattern Recognition.
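The paper's deep recurrent two-stream model is beyond a short sketch, but its core finding, that temporal dynamics carry signal individual frames lack, can be illustrated with a deliberately simple synthetic example. Everything below (the 1-D "intensity" traces and the threshold classifier) is fabricated for illustration and is not the authors' data or method.

```python
import numpy as np

rng = np.random.default_rng(0)
T, n = 20, 200  # frames per clip, clips per class

# Fabricated 1-D "expression intensity" traces standing in for video features.
# "Pain" clips oscillate frame to frame; "non-pain" clips hold a constant
# level. Individual frames take the same {+1, -1} values in both classes,
# so a single frame carries essentially no class information.
phase = rng.choice([1.0, -1.0], size=(n, 1))
pain = phase * np.tile([1.0, -1.0], T // 2)
sign = rng.choice([1.0, -1.0], size=(n, 1))
no_pain = sign * np.ones((n, T))
clips = np.vstack([pain, no_pain]) + 0.1 * rng.standard_normal((2 * n, T))
labels = np.array([1] * n + [0] * n)

# Single-frame feature: the value of one frame (frame 0).
single_frame = clips[:, 0]
# Temporal feature: mean absolute frame-to-frame change.
dynamics = np.abs(np.diff(clips, axis=1)).mean(axis=1)

def threshold_accuracy(feature):
    """Classify by thresholding at the midpoint of the two class means."""
    thr = 0.5 * (feature[labels == 1].mean() + feature[labels == 0].mean())
    pred = (feature > thr).astype(int)
    return max((pred == labels).mean(), ((1 - pred) == labels).mean())

acc_single = threshold_accuracy(single_frame)
acc_dynamic = threshold_accuracy(dynamics)
print(f"single-frame accuracy: {acc_single:.2f}")  # near chance
print(f"temporal accuracy:     {acc_dynamic:.2f}")  # near perfect
```

A recurrent model over frame sequences can exploit exactly this kind of temporal structure, which single-frame models cannot see.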

    Explainable automated recognition of emotional states from canine facial expressions: the case of positive anticipation and frustration.

    In animal research, automation of affective state recognition has so far mainly addressed pain in a few species. Emotional states remain uncharted territories, especially in dogs, due to the complexity of their facial morphology and expressions. This study contributes to filling this gap in two respects. First, it is the first to address dog emotional states using a dataset obtained in a controlled experimental setting, including videos from (n = 29) Labrador Retrievers assumed to be in two experimentally induced emotional states: negative (frustration) and positive (anticipation). The dogs' facial expressions were measured using the Dogs Facial Action Coding System (DogFACS). Two different approaches are compared in relation to our aim: (1) a DogFACS-based approach with a two-step pipeline consisting of (i) a DogFACS variable detector and (ii) a positive/negative state Decision Tree classifier; (2) an approach using deep learning techniques with no intermediate representation. The approaches reach accuracies above 71% and 89%, respectively, with the deep learning approach performing better. Second, this study is also the first to study the explainability of AI models in the context of emotion in animals. The DogFACS-based approach provides decision trees, that is, a mathematical representation that reflects previous findings by human experts in relation to certain facial expressions (DogFACS variables) being correlates of specific emotional states. The deep learning approach offers a different, visual form of explainability in the form of heatmaps reflecting regions of focus of the network's attention, which in some cases show focus clearly related to the nature of particular DogFACS variables. These heatmaps may hold the key to novel insights into the sensitivity of the network to nuanced pixel patterns reflecting information invisible to the human eye.
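Step (ii) of the DogFACS-based pipeline, a decision-tree classifier over detected facial-action variables, might look roughly like the sketch below. The variable names, the labelling rule, and the noise level are all invented for illustration; they are not DogFACS findings or the authors' data.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
# Hypothetical binary DogFACS-style variables per clip; both the names and
# the labelling rule below are invented for illustration.
variables = ["ears_forward", "blink", "lips_part", "nose_lick", "tongue_show"]
n = 300
X = rng.integers(0, 2, size=(n, len(variables))).astype(float)
# Invented rule: "positive anticipation" when ears are forward and lips are
# not parted; 5% of labels flipped to mimic detector noise from step (i).
y = ((X[:, 0] == 1) & (X[:, 2] == 0)).astype(int)
flip = rng.random(n) < 0.05
y[flip] = 1 - y[flip]

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=variables))  # human-readable rules
acc = tree.score(X, y)
print(f"training accuracy: {acc:.2f}")
```

The printed tree is what makes this approach explainable: each path is a readable rule over named facial variables, unlike the end-to-end deep model, whose explanations take the form of attention heatmaps.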

    Towards a fully automated surveillance of well-being status in laboratory mice using deep learning: Starting with facial expression analysis

    Assessing the well-being of an animal is hindered by the limitations of efficient communication between humans and animals. Instead of direct communication, a variety of parameters are employed to evaluate the well-being of an animal. Especially in the field of biomedical research, scientifically sound tools to assess pain, suffering, and distress in experimental animals are in high demand for ethical and legal reasons. For mice, the most commonly used laboratory animals, a valuable tool is the Mouse Grimace Scale (MGS), a coding system for facial expressions of pain in mice. We aim to develop a fully automated system for the surveillance of post-surgical and post-anesthetic effects in mice. Our work introduces a semi-automated pipeline as a first step towards this goal. A new dataset of images of freely moving, black-furred laboratory mice is used and provided. Images were obtained after anesthesia (with isoflurane or a ketamine/xylazine combination) and surgery (castration). We deploy two pre-trained, state-of-the-art deep convolutional neural network (CNN) architectures (ResNet50 and InceptionV3) and compare them to a third CNN architecture without pre-training. Depending on the particular treatment, we achieve an accuracy of up to 99% for the recognition of the absence or presence of post-surgical and/or post-anesthetic effects on the facial expression.

    Geometric morphometrics for the study of facial expressions in non-human animals, using the domestic cat as an exemplar

    Facial expression is a common channel for the communication of emotion. However, in the case of non-human animals, the analytical methods used to quantify facial expressions can be subjective, relying heavily on extrapolation from human-based systems. Here, we demonstrate how geometric morphometrics can be applied to overcome these problems. We used this approach to identify and quantify changes in facial shape associated with pain in a non-human animal species. Our method accommodates individual variability, species-specific facial anatomy, and postural effects. Facial images were captured at four time points during ovariohysterectomy of domestic short-haired cats (n = 29), with time points corresponding to varying intensities of pain. Images were annotated using landmarks specifically chosen for their relationship with underlying musculature and their relevance to cat-specific facial action units. Landmark data were subjected to normalisation before Principal Components (PCs) were extracted to identify key sources of facial shape variation relative to pain intensity. A significant relationship between PC scores and a well-validated composite measure of post-operative pain in cats (the UNESP-Botucatu MCPS tool) was evident, demonstrating good convergent validity between our geometric face model and other metrics of pain detection. This study lays the foundation for the automatic, objective detection of emotional expressions in a range of non-human animal species.
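The geometric-morphometrics workflow, landmark alignment followed by PCA, can be sketched with NumPy alone. The landmark configuration and the "pain" deformation below are synthetic stand-ins, not the cat data from the study.

```python
import numpy as np

def procrustes_align(shape, ref):
    """Ordinary Procrustes: remove translation, scale, and rotation,
    aligning `shape` onto the centred reference `ref`."""
    a = shape - shape.mean(axis=0)
    a = a / np.linalg.norm(a)
    b = ref - ref.mean(axis=0)
    b = b / np.linalg.norm(b)
    u, _, vt = np.linalg.svd(a.T @ b)
    return a @ (u @ vt)

# Synthetic landmark data: k facial landmarks in 2-D for n subjects. A latent
# "pain intensity" compresses the first four landmarks toward the midline
# (an invented deformation, not the study's actual shape changes).
rng = np.random.default_rng(0)
k, n = 12, 60
base = rng.standard_normal((k, 2))
shapes, pain = [], []
for _ in range(n):
    s = rng.uniform(0.0, 1.0)  # latent pain intensity
    shp = base.copy()
    shp[:4, 0] *= 1.0 - 0.4 * s  # pain-dependent deformation
    # Nuisance rotation, scale, and offset that Procrustes should remove.
    th = rng.uniform(0, 2 * np.pi)
    R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
    shapes.append((shp @ R) * rng.uniform(0.5, 2.0) + rng.normal(0, 5, 2))
    pain.append(s)

ref = shapes[0]
aligned = np.array([procrustes_align(s, ref).ravel() for s in shapes])

# PCA on the aligned coordinates: PC1 should track the deformation.
centred = aligned - aligned.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
pc1 = centred @ vt[0]
r = np.corrcoef(pc1, pain)[0, 1]
print(f"|corr(PC1, pain intensity)| = {abs(r):.2f}")
```

Because posture, camera distance, and head rotation are removed by the alignment step, the PC scores reflect shape change only, which is what makes them comparable against an external pain scale.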

    Why pain is still a welfare issue for farm animals, and how facial expression may be the answer

    Pain is a sensory and emotional experience that significantly affects animal welfare and has negative impacts on the economics of farming. Pain is often associated with common production diseases such as lameness and mastitis, and is also introduced through routine husbandry practices such as castration and tail docking. Farm animals are prey species which tend not to overtly express pain or weakness, making recognizing and evaluating pain incredibly difficult. Current methods of pain assessment do not provide information on what the animal is experiencing at that moment in time, only that its experience is having a long-term negative impact on its behavior and biological functioning. Measures that provide reliable information about the animal's affective state in the moment are urgently required; facial expression as a pain assessment tool has this ability. Automation of the detection and analysis of facial expression is currently in development, providing further incentive to use these methods in animal welfare assessment.

    Towards the improvement of sheep welfare: Exploring the use of qualitative behavioural assessment (QBA) for the monitoring and assessment of sheep

    Challenges faced by sheep in Australia in terms of disease, injury and management may compromise not only health and productivity but also welfare. These challenges represent a growing concern for both producers and the public. Hence there is an obvious need for the development of measures to allow producers, who may have limited access to stock or are constrained by time and/or resource availability, to monitor their sheep. There is a clear benefit to producers being able to readily identify animals whose welfare might be compromised and thus are in need of further care. However, the assessment of animal welfare is challenging under commercial conditions and, to date, few measures are available to help producers recognise animals in compromised welfare states. Qualitative behavioural assessment (QBA) is an approach that captures the expressive behaviour of an animal through the integration and summary of details of behavioural events, posture, and movement. In this way, QBA represents a valuable tool that offers insight into the physical and physiological aspects of animal welfare, and when used in conjunction with other key measures helps to provide a more complete and comprehensive picture of an animal's welfare state. Furthermore, it has been proposed that QBA be used together with other welfare measures to guide the interpretation of welfare data. As a welfare tool, QBA has been applied to assess behavioural expression in numerous livestock species including pigs and cattle; however, this methodology is less well studied in sheep, and more work is needed to validate QBA for practical application. The aim of the research described in this thesis was to investigate whether the QBA methodology could be applied to assess the welfare of sheep subject to various welfare issues relevant to the Australian sheep industry.
To this end, over four experimental chapters, QBA was applied to video footage of sheep in various states of compromised welfare, including those suffering from common injuries and diseases (lameness, inappetence, flystrike, and gastro-intestinal parasitism) and those experiencing pain caused by routine husbandry procedures (ear tagging, castration, mulesing, and tail docking). Moreover, in two experimental chapters (Chapters 4 and 6), video footage was captured of sheep in positive welfare states (reduced gastro-intestinal parasite burden, and habituation to human presence). This video footage was also analysed quantitatively, and other welfare measures, including those of health/disease status, physical condition, and locomotive activity, were collected for validation purposes in each study. Over the four experimental chapters, it was demonstrated that observers, blind to experimental procedures and treatments, can reach a significant consensus in their interpretation and assessment of the behavioural expression of sheep, and that these assessments can relate meaningfully to the welfare state of the animal. In Chapter 3, observers were able to distinguish between flystruck and non-flystruck sheep using the QBA methodology, and the behavioural expression scores given to each sheep corresponded to the severity of strike and the condition of the wool. In Chapter 4, observers identified differences in the behavioural expression of sheep that related to the severity of gastro-intestinal parasitism (subclinical v. clinical). Moreover, it was discovered that treating sheep to lessen their gastro-intestinal parasite burden altered the behavioural expression of parasitised sheep. A significant consensus was also reached amongst observers in the assessment of lambs subject to routine husbandry procedures (ear tagging, castration, mulesing, and tail docking) in Chapter 5.
Observers were able to distinguish lambs that were subjected to these painful husbandry procedures and administered either a placebo or analgesics (Tri-Solfen® and meloxicam) from the control lambs, which were only restrained. This suggests that the pain caused by these husbandry procedures alters the behavioural patterns and demeanour of lambs in a way that is identifiable to observers using the QBA methodology. Lastly, when observers viewed video footage of sheep traversing a walk-over-weigh (WoW) apparatus in Chapter 6, they were able to distinguish sheep that were either lame or habituated to the test apparatus and human presence from the control animals. However, in this chapter, observers were not able to distinguish between all treatment groups based on their behavioural expression: differences in the demeanour of inappetent and control sheep were not evident, nor were observers able to distinguish between lame and habituated sheep. In summary, the research presented in this thesis indicates that assessments of behavioural expression can be used, under most of the conditions investigated, to distinguish sheep in poor welfare states due to injury or disease from those that are healthy. Furthermore, it appears that observers can reliably identify differences in behavioural expression related to positive welfare states. This work has detailed the behavioural expression of sheep as perceived by observers and has led to a greater understanding of the behavioural expression of sheep in different welfare states. It appears that, through the assessment of demeanour or body language, QBA offers both relevant and valid assessments which may help producers gain insight into the welfare state of their sheep. It is suggested that, when used in conjunction with other select behavioural measures, QBA may represent a valuable tool for producers to improve the welfare of sheep in their care.