
    Recording behaviour of indoor-housed farm animals automatically using machine vision technology: a systematic review

    Large-scale phenotyping of animal behaviour traits is time consuming and has led to increased demand for technologies that can automate these procedures. Automated tracking of animals has been successful in controlled laboratory settings, but recording from animals in large groups in highly variable farm settings presents challenges. The aim of this review is to provide a systematic overview of the advances that have occurred in automated, high-throughput image detection of farm animal behavioural traits with welfare and production implications. Peer-reviewed publications written in English were reviewed systematically following Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. After identification, screening, and assessment for eligibility, 108 publications met these specifications and were included for qualitative synthesis. Data collected from the papers included camera specifications, housing conditions, group size, algorithm details, procedures, and results. Most studies utilized standard digital colour video cameras for data collection, with increasing use of 3D cameras in papers published after 2013. Papers including pigs (across production stages) were the most common (n = 63). The most common behaviours recorded included activity level, area occupancy, aggression, gait scores, resource use, and posture. Our review revealed many overlaps in methods applied to analysing behaviour, and most studies started from scratch instead of building upon previous work. Training and validation sample sizes were generally small (mean ± s.d. = 3.8 ± 5.8 groups), and data collection and testing took place in relatively controlled environments. To advance our ability to automatically phenotype behaviour, future research should build upon existing knowledge and validate technology under commercial settings, and publications should explicitly describe recording conditions in detail so that studies can be reproduced.

    Utilization of Depth-Enabled Identification and Tracking System to Identify and Track Individual Pigs and Analyse Individual Pig Activity

    Ensuring the health and wellbeing of pigs is of the utmost importance to the swine industry. There is a need for a real-time system that can identify changes in pig activities and activity patterns to accurately identify compromised pigs. The value of a real-time system is the capability to identify compromised pigs prior to observance of visible clinical symptoms by facility personnel. Therefore, a novel computer vision depth-enabled identification and tracking (DeIT) system was evaluated. Evaluation of 10,544 randomly selected frames indicated a 93.9% accuracy rate for identifying pigs’ identity when classified by the system as standing/walking. The accuracy of activity classification was 99.1% for lying, 96.3% for standing, 99.3% for walking, 86.4% for close proximity to the feeder, and 73.6% for close proximity to the waterer. The average percentages of time spent lying, standing, walking, in close proximity to the feeder, and in close proximity to the waterer were 77.56±1.69%, 8.64±1.10%, 2.29±0.37%, 9.93±1.66%, and 0.95±0.28%, respectively. Furthermore, distance walked was 943.1±105.1 m/d. As the trial progressed, the percentage of time spent lying and time in close proximity to the feeder increased (P≤0.001; 7.46 and 3.99%, respectively), while time standing and walking decreased (9.82 and 1.53%) from wk 1 through wk 6. Gender had no effect (P ≥ 0.10) on the percentage of time spent lying, standing, walking, or in close proximity to the feeder, or on m/d walked. Barrows spent a greater (P=0.04) percentage of time than gilts in close proximity to the waterer (1.01% vs. 0.89%, respectively). Litter had no effect (P ≥ 0.10) on time spent lying, standing, or in close proximity to the feeder and waterer. There was a difference in the percentage of time walking (P=0.05) and m/d walked (P=0.05) between litters.
Results indicate two significant outcomes: 1) the proposed DeIT system has the capability and sensitivity to accurately identify, maintain identification of, and track the activities of nursery pigs, and 2) the accuracy of the DeIT system provides the potential to evaluate changes in activity over an extended period of time. Advisor: Ty B. Schmid
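The per-animal time budgets reported above can be derived directly from frame-level activity classifications. A minimal sketch, assuming per-frame labels as input (the label names, frame counts, and `time_budget` helper are illustrative assumptions, not the DeIT system's actual output):

```python
from collections import Counter

def time_budget(frame_labels):
    """Convert per-frame activity labels into percentage-of-time values,
    as in the time-budget results reported above."""
    counts = Counter(frame_labels)
    total = sum(counts.values())
    return {label: 100.0 * n / total for label, n in counts.items()}

# Hypothetical day of 100 classified frames for one pig.
labels = (["lying"] * 77 + ["standing"] * 9 + ["walking"] * 2
          + ["feeder"] * 10 + ["waterer"] * 2)
print(time_budget(labels)["lying"])  # 77.0
```

Aggregating such daily budgets per pig is what allows week-over-week shifts (e.g. increased lying time) to be detected.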

    Monitoring Animal Well-being


    Deployment and Evaluation of an Active RFID Tracking System for Precision Animal Management

    A better understanding of animal space utilization in current livestock facilities could lead to improved facility design and animal health. This study was conducted to determine whether an active RFID tag tracking system could accurately provide animal locomotion data on an individual animal basis. The system is composed of four sensors, located in the corners of a swine pen, and compact tags, which attach to the animals and transmit a signal. The sensors use the tag signals to determine 3-D positions in real-time. A data acquisition system was developed to capture raw data from the system software into a database for analysis. The first test was performed with 34 tags placed at a known location, followed by a second test with 34 tags arranged in a 1-m×1-m grid across the pen. Results from the first test were consistent with the manufacturer’s claim of 15 cm accuracy. Error was higher in the second test. The system was used to track pigs for two days. Visual analysis indicated 84.4% tracking accuracy. Finally, the system was used to track animals from different genetic lines and temperaments. Statistical analysis of this data indicated significant differences in movement data based on sex of the animal, lineage, and temperament. Further work revealed that the system is prone to generate large jumps in the data that need to be filtered if the desired use is for instantaneous measurements. Without data filtering, the system is best suited for monitoring hourly or daily average values for animal movement parameters. Advisors: Deepak Keshwani and Tami Brown-Brand
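The jump filtering the authors recommend could be as simple as rejecting samples whose displacement from the last accepted position is implausibly large. A minimal sketch, assuming 2-D positions in metres and an illustrative 0.5 m threshold (both are assumptions for illustration, not the study's actual filter):

```python
import math

def filter_jumps(positions, max_step=0.5):
    """Drop position samples that jump more than max_step metres from the
    last accepted point; such spikes would otherwise corrupt
    instantaneous movement measurements."""
    if not positions:
        return []
    kept = [positions[0]]
    for x, y in positions[1:]:
        px, py = kept[-1]
        if math.hypot(x - px, y - py) <= max_step:
            kept.append((x, y))
    return kept

track = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (0.2, 0.1)]
print(filter_jumps(track))  # the (5.0, 5.0) spike is discarded
```

For hourly or daily averages the spikes largely cancel out, which is consistent with the authors' observation that unfiltered data is best suited to aggregate statistics.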

    Robust individual pig tracking

    Knowing the locations of pigs in group housing enables activity monitoring and can improve animal welfare. Vision-based methods for tracking individual pigs are noninvasive but have low tracking accuracy owing to long-term pig occlusion. In this study, we developed a vision-based method that accurately tracked individual pigs in group housing. We prepared and labeled datasets taken from an actual pig farm, trained a faster region-based convolutional neural network to recognize pigs’ bodies and heads, and tracked individual pigs across video frames. To quantify the tracking performance, we compared the proposed method with the global optimization (GO) method with the cost function and the simple online and real-time tracking (SORT) method on four additional test datasets that we prepared, labeled, and made publicly available. The predictive model detects pigs’ bodies accurately, with F1-scores of 0.75 to 1.00, on the four test datasets. The proposed method achieves the highest multi-object tracking accuracy (MOTA) values of 0.75, 0.98, and 1.00 on three of the test datasets. On the remaining dataset, the proposed method has the second-highest MOTA of 0.73. The proposed tracking method is robust to long-term occlusion, outperforms the competitive baselines on most datasets, and has practical utility in helping to track individual pigs accurately.
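The MOTA metric reported above has a standard definition (the CLEAR MOT metrics of Bernardin and Stiefelhagen): one minus the sum of false negatives, false positives, and identity switches, divided by the total number of ground-truth objects across all frames. A minimal sketch (the example counts are made up for illustration):

```python
def mota(false_negatives, false_positives, id_switches, num_ground_truth):
    """Multi-object tracking accuracy:
    MOTA = 1 - (FN + FP + IDSW) / GT, aggregated over all frames."""
    return 1.0 - (false_negatives + false_positives + id_switches) / num_ground_truth

# Illustrative counts over a hypothetical test sequence.
print(mota(false_negatives=10, false_positives=5, id_switches=2,
           num_ground_truth=100))  # about 0.83
```

Note that MOTA can be negative when errors outnumber ground-truth objects, so values near 1.00 (as in three of the paper's datasets) indicate near-perfect tracking.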

    Quick, accurate, smart: 3D computer vision technology helps assessing confined animals' behaviour

    Figure caption: (a) Visual representation of the alignment of two sequences using Dynamic Time Warping (DTW). DTW stretches the sequences in time by matching the same point with several points of the compared time series. (b) The Needleman-Wunsch (NW) algorithm substitutes the temporal stretch with gap elements (red circles in the table), inserting blank spaces instead of forcefully matching points. The alignment is achieved by arranging the two sequences in a table, the first sequence row-wise (T) and the second column-wise (S). The figure shows a score table for two hypothetical sub-sequences (i, j) and the alignment scores (numbers in cells) for each pair of elements forming the sequence (letters in the head row and head column). Arrows show the warping path between the two series and consequently the final alignment. The optimal alignment score is in the bottom-right cell of the table.
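The Needleman-Wunsch score table described in the caption is straightforward to build with dynamic programming. A minimal sketch (the match/mismatch/gap scores are illustrative defaults, not necessarily those used in the paper):

```python
def needleman_wunsch(s, t, match=1, mismatch=-1, gap=-1):
    """Fill the NW score table for sequences s and t; the optimal global
    alignment score ends up in the bottom-right cell, as in the figure."""
    n, m = len(s), len(t)
    F = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):          # prefix of s aligned against gaps
        F[i][0] = i * gap
    for j in range(1, m + 1):          # prefix of t aligned against gaps
        F[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = F[i - 1][j - 1] + (match if s[i - 1] == t[j - 1] else mismatch)
            F[i][j] = max(diag,               # (mis)match
                          F[i - 1][j] + gap,  # gap in t
                          F[i][j - 1] + gap)  # gap in s
    return F[n][m]

print(needleman_wunsch("GATTACA", "GCATGCU"))  # 0
```

Tracing back along the arrows (the warping path in the figure) from the bottom-right cell would recover the alignment itself rather than just its score.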

    Towards on-farm pig face recognition using convolutional neural networks

    © 2018 Elsevier B.V. Identification of individual livestock such as pigs and cows has become a pressing issue in recent years as intensification practices continue to be adopted and precise objective measurements (e.g. weight) are required. Current best practice involves the use of RFID tags, which are time-consuming for the farmer to fit and distressing for the animal. To overcome this, non-invasive biometrics are proposed, using the face of the animal. We test this in a farm environment on 10 individual pigs using three techniques adopted from the human face recognition literature: Fisherfaces, the VGG-Face pre-trained face convolutional neural network (CNN) model, and our own CNN model that we train using an artificially augmented data set. Our results show that accurate individual pig recognition is possible, with accuracy rates of 96.7% on 1553 images. Class activation mapping using Grad-CAM is used to show the regions that our network uses to discriminate between pigs.
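Artificial augmentation of a small per-pig image set, as used for the authors' own CNN, typically combines simple geometric and photometric transforms. A minimal stdlib-only sketch on a greyscale image represented as a list of pixel rows (the specific transforms are illustrative, not the paper's exact pipeline):

```python
def augment(image, shift=10):
    """Yield simple variants of a greyscale image (list of rows of
    0-255 pixel values): horizontal flip, 90-degree clockwise rotation,
    and a brightness shift."""
    yield [row[::-1] for row in image]                                   # horizontal flip
    yield [list(col) for col in zip(*image[::-1])]                       # rotate 90 deg clockwise
    yield [[max(0, min(255, p + shift)) for p in row] for row in image]  # brighten

face = [[10, 20],
        [30, 40]]
for variant in augment(face):
    print(variant)
```

Applying a handful of such transforms per image can multiply an otherwise small training set several-fold, which is the usual motivation for augmentation when only a few hundred images per class are available.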