6,833 research outputs found

    3D Face Recognition for Cows

    CORF3D contour maps with application to Holstein cattle recognition using RGB and thermal images

    Livestock management involves monitoring farm animals by tracking certain physiological and phenotypical characteristics over time. In the dairy industry, for instance, cattle are typically equipped with RFID ear tags. The corresponding data (e.g. milk properties) can then be automatically assigned to the respective cow when it enters the milking station. In order to move towards a more scalable, affordable, and welfare-friendly approach, automatic non-invasive solutions are desirable. Thus, a non-invasive approach is proposed in this paper for the automatic identification of individual Holstein cattle from the side view as they exit a milking station. It takes input images from a thermal-RGB camera. The thermal images are used to delineate the cow from the background. Subsequently, any occluding rods from the milking station are removed and inpainted with the fast marching algorithm. The method then extracts the RGB map of the segmented cattle along with a novel CORF3D contour map. The latter contains three contour maps extracted by the Combination of Receptive Fields (CORF) model with different strengths of push-pull inhibition, a mechanism that suppresses noise in the form of grain-type texture. The effectiveness of the proposed approach is demonstrated by means of experiments using a 5-fold and a leave-one-day-out cross-validation on a new data set of 3694 images of 383 cows collected from the Dairy Campus in Leeuwarden (the Netherlands) over 9 days. In particular, when combining RGB and CORF3D maps by late fusion, an average accuracy of was obtained for the 5-fold cross-validation and for the leave-one-day-out experiment. The two maps were combined by first learning two ConvNet classification models, one for each type of map. The feature vectors in the two FC layers obtained from training images were then concatenated and used to learn a linear SVM classification model. In principle, the proposed approach with the novel CORF3D contour maps is suitable for various image classification applications, especially where grain-type texture is a confounding variable.
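
    A minimal sketch of the late-fusion step described above, assuming two already-trained ConvNet classifiers (one per map type) whose FC-layer activations serve as feature extractors; the function names, data loaders, and SVM settings below are illustrative assumptions, not the authors' exact implementation.

```python
# Hedged sketch: late fusion of RGB and CORF3D ConvNet features via a linear SVM.
# rgb_net / corf_net are assumed to be trained networks that return FC activations;
# rgb_loader / corf_loader are assumed to yield the same cows in the same order.
import numpy as np
import torch
from sklearn.svm import LinearSVC

def extract_fc_features(net, loader, device="cpu"):
    """Collect FC-layer feature vectors and labels from a trained ConvNet."""
    net.eval()
    net.to(device)
    feats, labels = [], []
    with torch.no_grad():
        for images, y in loader:
            f = net(images.to(device))   # assumed to return the FC activations
            feats.append(f.cpu().numpy())
            labels.append(y.numpy())
    return np.concatenate(feats), np.concatenate(labels)

def late_fusion_svm(rgb_net, corf_net, rgb_loader, corf_loader):
    """Concatenate per-map feature vectors and fit a linear SVM on the fused features."""
    rgb_feats, y = extract_fc_features(rgb_net, rgb_loader)
    corf_feats, _ = extract_fc_features(corf_net, corf_loader)
    fused = np.hstack([rgb_feats, corf_feats])   # late fusion by concatenation
    return LinearSVC(C=1.0).fit(fused, y)
```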

    Visual identification of individual Holstein-Friesian cattle via deep metric learning

    Holstein-Friesian cattle exhibit individually-characteristic black and white coat patterns visually akin to those arising from Turing's reaction-diffusion systems. This work takes advantage of these natural markings in order to automate visual detection and biometric identification of individual Holstein-Friesians via convolutional neural networks and deep metric learning techniques. Existing approaches rely on markings, tags or wearables with a variety of maintenance requirements, whereas we present a totally hands-off method for the automated detection, localisation, and identification of individual animals from overhead imaging in an open herd setting, i.e. where new additions to the herd are identified without re-training. We propose the use of SoftMax-based reciprocal triplet loss to address the identification problem and evaluate the techniques in detail against fixed herd paradigms. We find that deep metric learning systems show strong performance even when many cattle unseen during system training are to be identified and re-identified, achieving 98.2% accuracy when trained on just half of the population. This work paves the way for facilitating the non-intrusive monitoring of cattle applicable to precision farming and surveillance for automated productivity, health and welfare monitoring, and to veterinary research such as behavioural analysis, disease outbreak tracing, and more. Key parts of the source code, network weights and underpinning datasets are available publicly.
    Comment: 37 pages, 14 figures, 2 tables; Submitted to Computers and Electronics in Agriculture; Source code and network weights available at https://github.com/CWOA/MetricLearningIdentification; OpenCows2020 dataset available at https://doi.org/10.5523/bris.10m32xl88x2b61zlkkgz3fml1
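
    A minimal PyTorch sketch of a SoftMax-based reciprocal triplet loss in the spirit of the loss named above; the squared Euclidean distance, the weighting factor lam, and the index-based triplet selection are assumptions rather than the authors' exact formulation.

```python
# Hedged sketch: cross-entropy (SoftMax) term combined with a reciprocal triplet term.
import torch
import torch.nn.functional as F

def reciprocal_triplet_loss(anchor, positive, negative, eps=1e-8):
    """Pull anchors towards positives and push them from negatives, without a fixed margin."""
    d_ap = (anchor - positive).pow(2).sum(dim=1)   # anchor-positive distance
    d_an = (anchor - negative).pow(2).sum(dim=1)   # anchor-negative distance
    return (d_ap + 1.0 / (d_an + eps)).mean()

def softmax_reciprocal_triplet_loss(embeddings, logits, labels,
                                    anchor_idx, pos_idx, neg_idx, lam=0.01):
    """Classification loss on logits plus a weighted reciprocal triplet term on embeddings."""
    ce = F.cross_entropy(logits, labels)
    rt = reciprocal_triplet_loss(embeddings[anchor_idx],
                                 embeddings[pos_idx],
                                 embeddings[neg_idx])
    return ce + lam * rt
```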

    Machine-learning-based calving prediction from activity, lying, and ruminating behaviors in dairy cattle

    The objective of this study was to use automated activity, lying, and rumination monitors to characterize prepartum behavior and predict calving in dairy cattle. Data were collected from 20 primiparous and 33 multiparous Holstein dairy cattle from September 2011 to May 2013 at the University of Kentucky Coldstream Dairy. The HR Tag (SCR Engineers Ltd., Netanya, Israel) automatically collected neck activity and rumination data in 2-h increments. The IceQube (IceRobotics Ltd., South Queensferry, United Kingdom) automatically collected number of steps, lying time, standing time, number of transitions from standing to lying (lying bouts), and total motion, summed in 15-min increments. IceQube data were summed in 2-h increments to match HR Tag data. All behavioral data were collected for 14 d before the predicted calving date. Retrospective data analysis was performed using mixed linear models to examine behavioral changes by day in the 14 d before calving. Bihourly behavioral differences from baseline values over the 14 d before calving were also evaluated using mixed linear models. Changes in daily rumination time, total motion, lying time, and lying bouts occurred in the 14 d before calving. In the bihourly analysis, extreme values for all behaviors occurred in the final 24 h, indicating that the monitored behaviors may be useful in calving prediction. To determine whether technologies were useful at predicting calving, random forest, linear discriminant analysis, and neural network machine-learning techniques were constructed and implemented using R version 3.1.0 (R Foundation for Statistical Computing, Vienna, Austria). These methods were used on variables from each technology and all combined variables from both technologies. A neural network analysis that combined variables from both technologies at the daily level yielded 100.0% sensitivity and 86.8% specificity. A neural network analysis that combined variables from both technologies in bihourly increments was used to identify 2-h periods in the 8 h before calving with 82.8% sensitivity and 80.4% specificity. Changes in behavior and machine-learning alerts indicate that commercially marketed behavioral monitors may have calving prediction potential.
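
    The original analysis was carried out in R 3.1.0; the sketch below is a hypothetical Python analogue of the daily-level neural network evaluation, showing how sensitivity and specificity can be computed from combined monitor variables. The feature matrix, labels, and hyperparameters are illustrative assumptions.

```python
# Hedged sketch: train a small neural network on combined HR Tag + IceQube daily
# features and report sensitivity/specificity for "calving within 24 h" labels.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

def evaluate_calving_model(X, y):
    """X: one row of behavioral features per cow-day; y: 1 if calving occurred within 24 h."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
    clf.fit(X_tr, y_tr)
    tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
    sensitivity = tp / (tp + fn)   # proportion of calvings correctly flagged
    specificity = tn / (tn + fp)   # proportion of non-calving days correctly passed
    return clf, sensitivity, specificity
```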

    Smart Computing and Sensing Technologies for Animal Welfare: A Systematic Review

    Animals play a profoundly important and intricate role in our lives today. Dogs have been human companions for thousands of years, but they now work closely with us to assist the disabled, and in combat and search and rescue situations. Farm animals are a critical part of the global food supply chain, and there is increasing consumer interest in organically fed and humanely raised livestock, and how it impacts our health and environmental footprint. Wild animals are threatened with extinction by human-induced factors, and shrinking and compromised habitat. This review sets out to systematically survey the existing literature on smart computing and sensing technologies for domestic, farm and wild animal welfare. We use the notion of animal welfare in broad terms, to review the technologies for assessing whether animals are healthy, free of pain and suffering, and also positively stimulated in their environment. The notion of smart computing and sensing is likewise used in broad terms, to refer to computing and sensing systems that are not isolated but interconnected with communication networks, and capable of remote data collection, processing, exchange and analysis. We review smart technologies for domestic animals, indoor and outdoor animal farming, as well as animals in the wild and zoos. The findings of this review are expected to motivate future research and contribute to data, information and communication management as well as policy for animal welfare.

    Lessons Learned from the Canadian Cattle Industry: National Animal Identification and The Mad Cow

    Canada implemented a national cattle identification system, led and developed by the industry. Initially a voluntary program beginning in July 2001, it became mandatory in July 2002 and achieved 92-95 percent compliance by that fall. The costs to develop and initiate the system were low; animals are tagged before leaving the farm of origin and the tags are read when the animal dies or is exported. The national identification system did not prevent the single case of bovine spongiform encephalopathy (BSE), or Mad Cow Disease, found in the spring of 2003, but it did help speed the investigation and lend confidence to it. While the identification system was the focus of the study, the team also reports on how markets and an industry behave in a crisis.

    Continuous real-time cow identification by reading ear tags from live-stream video

    In precision dairy farming there is a need for continuous and real-time availability of data on cows and systems. Data collection using sensors is becoming more common, and it can be difficult to connect sensor measurements to the identification of the individual cow that was measured. Cows can be identified by RFID tags, but ear tags with identification numbers are more widely used. Here we describe a system that makes the ear tag identification of the cow continuously available from live-stream video so that this information can be added to other data streams that are collected in real time. An ear tag reading model was implemented by retraining an existing model, and tested for accuracy in reading the digits on images of cows' ear tags obtained from two dairy farms. The ear tag reading model was then combined with a video setup in a milking robot on a dairy farm, where the identification by the milking robot was considered ground truth. The system reports ear tag numbers obtained from live-stream video in real time. Retraining the model on a small set of 750 ear tag images increased the digit-level accuracy on the test set to 87%, compared with 80% accuracy obtained with the starting model trained on images of house numbers only. The ear tag numbers reported by real-time analysis of live-stream video identified the right cow 93% of the time. Precision and sensitivity were lower, at 65% and 41% respectively, meaning that 41% of all cow visits to the milking robot were detected with the correct cow's ear tag number. Further improvement in sensitivity needs to be investigated, but when ear tag numbers are reported they are correct 93% of the time, which is a promising starting point for future system improvements.
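
    A minimal sketch of the real-time reading loop described above, using OpenCV for frame capture; the stream URL and the detect_ear_tags and read_digits helpers are hypothetical stand-ins for the retrained detection and digit-reading models mentioned in the abstract.

```python
# Hedged sketch: read frames from a live stream and pass ear-tag crops to a digit reader.
import cv2

def stream_ear_tag_ids(stream_url, detect_ear_tags, read_digits):
    """Yield ear tag numbers read from a live video stream, frame by frame."""
    cap = cv2.VideoCapture(stream_url)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break                       # stream ended or a frame was dropped
            for x, y, w, h in detect_ear_tags(frame):
                crop = frame[y:y + h, x:x + w]
                digits = read_digits(crop)  # e.g. a model retrained on 750 tag images
                if digits:
                    yield digits
    finally:
        cap.release()
```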