
    Audio recordings dataset of grazing jaw movements in dairy cattle.

    This dataset comprises audio recordings and corresponding labels of ingestive jaw movements performed during grazing by dairy cattle. Using a wireless microphone, we recorded sounds of three Holstein dairy cows grazing short and tall alfalfa and short and tall fescue. Two experts in grazing behavior identified and labeled the start, end, and type of each jaw movement: bite, chew, and chew-bite (a compound movement). For each segment of raw audio corresponding to a jaw movement, we computed four well-known features: amplitude, duration, zero crossings, and envelope symmetry. These features are included in the dataset and can be used as inputs to build automated methods for classifying ingestive jaw movements. Cows' grazing behavior can be monitored and characterized by identifying and analyzing these masticatory events.
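The dataset description does not give exact formulas for the four features, but they could be computed along these lines. The definitions below, in particular the envelope-symmetry ratio, are plausible assumptions for illustration, not the dataset authors' exact ones:

```python
import numpy as np

def jaw_movement_features(segment, fs):
    """Compute the four features named in the dataset description.

    Assumed definitions (the published ones may differ):
    amplitude = peak absolute value, duration in seconds,
    zero crossings = sign changes, symmetry = fraction of envelope
    energy up to and including the envelope peak (0.5 = symmetric).
    """
    segment = np.asarray(segment, dtype=float)
    amplitude = np.max(np.abs(segment))
    duration = len(segment) / fs
    zero_crossings = int(np.sum(segment[:-1] * segment[1:] < 0))
    envelope = np.abs(segment)
    peak = int(np.argmax(envelope))
    before = np.sum(envelope[: peak + 1] ** 2)
    after = np.sum(envelope[peak:] ** 2)
    symmetry = before / (before + after)
    return amplitude, duration, zero_crossings, symmetry

# Synthetic 0.2 s, 50 Hz tone as a stand-in for one jaw-movement segment.
fs = 1000
t = np.arange(0, 0.2, 1 / fs)
amp, dur, zc, sym = jaw_movement_features(np.sin(2 * np.pi * 50 * t), fs)
```

Feature vectors of this form can then be fed to any classifier of bite / chew / chew-bite events.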

    Classifying Ingestive Behavior of Dairy Cows via Automatic Sound Recognition

    Determining ingestive behaviors of dairy cows is critical to evaluate their productivity and health status. The objectives of this research were to (1) develop the relationship between forage species/heights and sound characteristics of three different ingestive behaviors (bites, chews, and chew-bites); (2) comparatively evaluate three deep learning models and optimization strategies for classifying the three behaviors; and (3) examine the ability of deep learning modeling to classify the three ingestive behaviors under various forage characteristics. The results show that the amplitude and duration of the bite, chew, and chew-bite sounds were mostly larger for tall forages (tall fescue and alfalfa) than for their short counterparts. The long short-term memory network using a filtered dataset with balanced duration and imbalanced audio files offered better performance than its counterparts. The best classification performance was over 0.93, and the difference between the best and poorest performance was 0.4–0.5 across forage species and heights. In conclusion, the deep learning technique could classify the dairy cow ingestive behaviors but was unable to differentiate between them under some forage characteristics using acoustic signals. Thus, while the developed tool is useful to support precision dairy cow management, it requires further improvement.
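The paper's LSTM architecture is not specified here; as a minimal illustration of the recurrence such a model applies to each frame of acoustic features, a single NumPy LSTM step might look like this (all weights and dimensions below are made up for the sketch):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step. W: (4H, D), U: (4H, H), b: (4H,).
    Gate order in the stacked weights: input, forget, candidate, output."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[0:H])          # input gate
    f = sigmoid(z[H:2 * H])      # forget gate
    g = np.tanh(z[2 * H:3 * H])  # cell candidate
    o = sigmoid(z[3 * H:4 * H])  # output gate
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(0)
D, H, T = 4, 8, 20               # feature dim, hidden size, sequence length
W = rng.normal(size=(4 * H, D)) * 0.1
U = rng.normal(size=(4 * H, H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(T, D)):   # run the recurrence over a sequence
    h, c = lstm_step(x, h, c, W, U, b)
```

In a real classifier, the final hidden state `h` would feed a softmax layer over the bite / chew / chew-bite classes.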

    Utilization of information and communication technologies to monitor grazing behaviour in sheep

    This thesis is a contribution to the study of the feeding behaviour of grazing sheep, and its general goal was to evaluate the effectiveness of a tri-axial accelerometer based sensor in discriminating the main activities of sheep at pasture, quantifying the number of bites, and estimating intake per bite. Based on the literature, feed intake at pasture is a difficult parameter to measure by direct observation; for this reason, automated systems for monitoring the activities of free-ranging animals have become increasingly important and common. Among these systems, tri-axial accelerometers have shown good precision and accuracy in the classification of behavioural activities of herbivores, but they do not yet seem able to discriminate jaw movements, which are of great importance for evaluating animal grazing strategies in different pastures and for estimating daily herbage intake. Thus, the main objective of this research was to develop and test a tri-axial accelerometer based sensor (BEHARUM) for the study of the feeding behaviour of sheep and for the estimation of the bite rate (number of bites per minute of grazing) on the basis of acceleration variables. The thesis is organized in 4 main chapters. Chapter 1. This introductory section reports a literature review on the importance of studying the feeding behaviour of ruminants and on the measuring techniques developed over the years for its detection, with specific emphasis on accelerometer based sensors, which have shown good precision and accuracy in the classification of behavioural activities of herbivores. Chapter 2. This chapter describes the results of short tests performed in grazing conditions to discriminate three behavioural activities of sheep (grazing, ruminating and resting) on the basis of acceleration data collected with the BEHARUM device. The multivariate statistical analysis correctly assigned 93.0% of minutes to behavioural activities. Chapter 3. 
This part evaluates the effectiveness of the BEHARUM in discriminating between the main behaviours (grazing, ruminating and other activities) of sheep at pasture and in identifying the epoch setting (5, 10, 30, 60, 120, 180 and 300 s) with the best performance. Results show that a discriminant analysis can accurately classify important behaviours such as grazing, ruminating and other activities in sheep at pasture, with a better performance in classifying grazing behaviour than ruminating and other activities for all epochs; the most accurate classification in terms of accuracy and Cohen's k coefficient was achieved with the 30 s epoch length. Chapter 4. This section illustrates the results of a study that aimed to derive a model to predict sheep behavioural variables such as number of bites, bite mass, intake and intake rate on the basis of variables calculated from acceleration data recorded by the BEHARUM. The experiment was carried out using micro-swards of Italian ryegrass (Lolium multiflorum L.), alfalfa (Medicago sativa L.), oat (Avena sativa L.), chicory (Cichorium intybus L.) and a mixture (Italian ryegrass and alfalfa). The sheep were allowed to graze the micro-swards for 6 minutes, and the results show that the BEHARUM can estimate with high to moderate precision (r2 = 0.86 and RMSEP = 3%) the number of bites and the herbage intake of sheep during short-term grazing of Mediterranean forages. Finally, the dissertation ends with a summary of the main implications and findings, and a general discussion and conclusions.
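The epoch-based processing described in Chapter 3 amounts to splitting the acceleration stream into fixed windows and summarizing each one. A minimal sketch, assuming non-overlapping epochs and simple mean/standard-deviation features (the thesis likely uses a richer feature set):

```python
import numpy as np

def epoch_features(acc, fs, epoch_s):
    """Split a tri-axial stream (N, 3) into non-overlapping epochs and
    compute per-axis mean and standard deviation (6 features per epoch)."""
    n = int(fs * epoch_s)                      # samples per epoch
    n_epochs = acc.shape[0] // n
    acc = acc[: n_epochs * n].reshape(n_epochs, n, 3)
    return np.concatenate([acc.mean(axis=1), acc.std(axis=1)], axis=1)

fs = 16                                        # Hz, an assumed sampling rate
acc = np.random.default_rng(1).normal(size=(fs * 300, 3))  # 5 min of data
feats = epoch_features(acc, fs, 30)            # 30 s epochs, the best setting
```

Each row of `feats` would then be one observation for the discriminant analysis comparing grazing, ruminating and other activities.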

    An online method for estimating grazing and rumination bouts using acoustic signals in grazing cattle

    The growth of the world population expected for the next decade will increase the demand for products derived from cattle (i.e., milk and meat). In this sense, precision livestock farming proposes to optimize livestock production using information and communication technologies for monitoring animals. Although there are several methodologies for monitoring foraging behavior, the acoustic method has proven successful in previous studies. However, there is no online acoustic method for the recognition of rumination and grazing bouts that can be implemented in a low-cost device. In this study, an online algorithm called bottom-up foraging activity recognizer (BUFAR) is proposed. The method is based on the recognition of jaw movements from sound, which are then analyzed in groups to recognize rumination and grazing bouts. Two variants of the activity recognizer were explored, based on a multilayer perceptron (BUFAR-MLP) and a decision tree (BUFAR-DT). These variants were evaluated and compared under the same conditions with a known method for offline analysis. Compared to the former method, the proposed method showed superior results in the estimation of grazing and rumination bouts. The MLP variant showed the best results, reaching F1-scores higher than 0.75 for both activities, and outperformed a commercial rumination-time estimation system. A great advantage of BUFAR is its low computational cost, which is about 50 times lower than that of the former method. The good performance and low computational cost make BUFAR a highly feasible method for real-time execution in a low-cost embedded monitoring system, enabling the development of a portable device for online monitoring of the foraging behavior of ruminants.
    Authors: Chelotti, Jose Omar; Vanrell, Sebastián Rodrigo; Martínez Rau, Luciano Sebastián; Galli, Julio Ricardo; Planisich, Alejandra; Utsumi, Santiago A.; Milone, Diego Humberto; Giovanini, Leonardo Luis; Rufiner, Hugo Leonardo (CONICET, Instituto de Investigación en Señales, Sistemas e Inteligencia Computacional, Universidad Nacional del Litoral; Universidad Nacional de Rosario; Michigan State University).
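A bottom-up recognizer in the spirit of BUFAR first detects jaw movements (JMs) from sound and then classifies groups of them into activity bouts. The sketch below substitutes a simple threshold rule for the published MLP/decision-tree classifiers; the thresholds and grouping window are illustrative assumptions, not values from the paper:

```python
def bouts_from_jm(jm_per_min, chew_frac, rate_thr=40.0, chew_thr=0.8):
    """Bottom-up sketch: label each minute from jaw-movement statistics,
    then merge consecutive equal labels into bouts.

    Illustrative rules: rumination = high JM rate dominated by chews;
    grazing = high JM rate with mixed bites/chews; otherwise 'other'.
    """
    labels = []
    for rate, cf in zip(jm_per_min, chew_frac):
        if rate < rate_thr:
            labels.append("other")
        elif cf >= chew_thr:
            labels.append("rumination")
        else:
            labels.append("grazing")
    bouts = []                               # (label, start_min, end_min)
    for i, lab in enumerate(labels):
        if bouts and bouts[-1][0] == lab:
            bouts[-1] = (lab, bouts[-1][1], i + 1)   # extend current bout
        else:
            bouts.append((lab, i, i + 1))            # open a new bout
    return bouts

rates = [65, 70, 68, 5, 3, 60, 62]               # JMs per minute (synthetic)
chewf = [0.5, 0.6, 0.55, 0.9, 0.9, 0.95, 0.9]    # chew fraction per minute
bouts = bouts_from_jm(rates, chewf)
```

Because each minute is summarized by a couple of scalars before classification, a scheme like this stays cheap enough for an embedded device, which is the property the paper emphasizes.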

    Classification and Analysis of Multiple Cattle Unitary Behaviors and Movements Based on Machine Learning Methods.

    Peer reviewed. The behavior of livestock on farms is the primary indicator of animal welfare, health status, and social interactions. The objective of this study was to propose a framework based on inertial measurement unit (IMU) data from 10 dairy cows to classify unitary behaviors such as feeding, standing, lying, ruminating-standing, ruminating-lying, and walking, and to identify movements during unitary behaviors. Classification performance was investigated for three machine learning algorithms (K-nearest neighbors (KNN), random forest (RF), and extreme gradient boosting (XGBoost)) in four time windows (5, 10, 30, and 60 s). Furthermore, feed tossing, rolling biting, and chewing in the correctly classified feeding segments were analyzed using the magnitude of the acceleration. The results revealed that XGBoost had the highest performance in the 60 s time window, with an average F1 score of 94% for the six unitary behavior classes. The F1 scores for movements were 78% (feed tossing), 87% (rolling biting), and 87% (chewing). This framework offers the possibility to explore more detailed movements based on the unitary behavior classification.
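The windowing-plus-classifier pipeline used here can be sketched with scikit-learn's RandomForestClassifier on synthetic IMU-like data; the two-class setup, features, and window length below are simplified stand-ins for the study's six behaviors and tuned models:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)

def window_features(acc, n):
    """Per-window mean, std, and mean magnitude from a (N, 3) IMU stream."""
    k = acc.shape[0] // n
    w = acc[: k * n].reshape(k, n, 3)
    mag = np.linalg.norm(w, axis=2).mean(axis=1, keepdims=True)
    return np.hstack([w.mean(axis=1), w.std(axis=1), mag])

# Synthetic stand-in for labelled IMU data: two behaviours distinguished
# by signal energy (real data would come from the collar-mounted IMU).
lying = rng.normal(0.0, 0.1, size=(6000, 3))
walking = rng.normal(0.0, 1.0, size=(6000, 3))
X = np.vstack([window_features(lying, 60), window_features(walking, 60)])
y = np.array([0] * 100 + [1] * 100)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
acc = clf.score(X, y)   # training accuracy on this separable toy data
```

Longer windows (here 60 samples per window) trade temporal resolution for more stable features, which matches the study's finding that the 60 s window performed best.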

    Embedded neural network for real-time animal behavior classification

    Recent biological studies have focused on understanding animal interactions and welfare. To help biologists obtain information on animal behavior, resources like wireless sensor networks are needed. Moreover, large amounts of collected data have to be processed off-line in order to classify different behaviors. Recent research projects have focused on designing monitoring systems capable of measuring animal parameters in order to recognize and monitor their gaits or behaviors. However, network unreliability and high power consumption have limited their applicability. In this work, we present an animal behavior recognition, classification, and monitoring system based on a wireless sensor network and a smart collar device equipped with inertial sensors and an embedded multilayer-perceptron-based feed-forward neural network, which classifies the different gaits or behaviors from the collected information. In similar works, classification mechanisms are implemented in a server (or base station). The main novelty of this work is the full implementation of a reconfigurable neural network embedded in the animal's collar, which allows real-time behavior classification and enables local storage in SD memory. Moreover, this approach reduces the amount of data transmitted to the base station (and its periodicity), significantly improving battery life. The system has been simulated and tested in a real scenario for three different horse gaits, using different heuristics and sensors to improve the accuracy of behavior recognition, achieving a maximum of 81%. Funding: Junta de Andalucía P12-TIC-130.
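On-collar classification of this kind reduces to running a small fixed-weight feed-forward network on each feature vector and transmitting only the resulting label. A minimal sketch with made-up weights and dimensions (an embedded implementation would use the trained parameters and quite possibly fixed-point arithmetic):

```python
import numpy as np

def mlp_forward(x, W1, b1, W2, b2):
    """Feed-forward pass of a small MLP as it might run on the collar:
    one hidden layer with tanh, softmax output over behaviour classes."""
    h = np.tanh(W1 @ x + b1)
    z = W2 @ h + b2
    e = np.exp(z - z.max())                  # numerically stable softmax
    return e / e.sum()

rng = np.random.default_rng(3)
n_in, n_hidden, n_classes = 6, 10, 3         # e.g. three horse gaits
W1 = rng.normal(size=(n_hidden, n_in)) * 0.5
b1 = np.zeros(n_hidden)
W2 = rng.normal(size=(n_classes, n_hidden)) * 0.5
b2 = np.zeros(n_classes)

p = mlp_forward(rng.normal(size=n_in), W1, b1, W2, b2)
label = int(np.argmax(p))     # only this label need be sent or logged
```

Sending one class label per window instead of raw inertial samples is what yields the data-reduction and battery-life gains the abstract describes.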

    Livestock vocalisation classification in farm soundscapes

    Livestock vocalisations have been shown to contain information related to animal welfare and behaviour. Automated sound detection has the potential to facilitate a continuous acoustic monitoring system for use in a range of Precision Livestock Farming (PLF) applications. There are few examples of automated livestock vocalisation classification algorithms, and we have found none capable of being easily adapted and applied to different species' vocalisations. In this work, a multi-purpose livestock vocalisation classification algorithm is presented, utilising audio-specific feature extraction techniques and machine learning models. To test the multi-purpose nature of the algorithm, three separate data sets were created targeting livestock-related vocalisations, namely sheep, cattle, and Maremma sheepdogs. Audio data were extracted from continuous recordings conducted on-site at three different operational farming enterprises, reflecting the conditions of real deployment. A comparison of Mel-Frequency Cepstral Coefficients (MFCCs) and Discrete Wavelet Transform (DWT) based features was conducted. Classification was performed using a Support Vector Machine (SVM) model. High accuracy was achieved for all data sets (sheep: 99.29%, cattle: 95.78%, dogs: 99.67%). Classification performance alone was insufficient to determine the most suitable feature extraction method for each data set. Computational timing results revealed the DWT-based features to be markedly faster to produce (a 14.81–15.38% decrease in execution time). These results represent a highly accurate livestock vocalisation classification algorithm, which forms the foundation for an automated livestock vocalisation detection system.
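DWT-based features of the kind compared above can be illustrated with a plain Haar transform: per-level detail-coefficient energies form a compact spectral descriptor. This is a simplified stand-in; the paper's actual wavelet and feature set are not specified here:

```python
import numpy as np

def haar_dwt_energies(x, levels):
    """Per-level detail-coefficient energies from a Haar DWT.

    Each level splits the signal into a half-rate approximation and a
    detail band; the detail energies summarise the spectrum coarsely.
    """
    x = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        x = x[: len(x) // 2 * 2]                 # even length for pairing
        approx = (x[0::2] + x[1::2]) / np.sqrt(2)
        detail = (x[0::2] - x[1::2]) / np.sqrt(2)
        energies.append(float(np.sum(detail ** 2)))
        x = approx                               # recurse on the low band
    return energies

fs = 8000
t = np.arange(0, 1.0, 1 / fs)
call = np.sin(2 * np.pi * 400 * t)   # synthetic low-frequency "call"
feats = haar_dwt_energies(call, levels=4)
```

Because the transform is a few additions and subtractions per sample, it is cheap to compute, which is consistent with the execution-time advantage the paper reports for DWT-based features.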

    Automated detection of lameness in sheep using machine learning approaches: novel insights into behavioural differences among lame and non-lame sheep

    Lameness in sheep is the biggest cause of concern regarding poor health and welfare among sheep-producing countries. Best practice for managing lameness relies on rapid treatment, yet there are no objective measures for lameness detection. Accelerometers and gyroscopes have been widely used in human activity studies, and their use is becoming increasingly common in livestock. In this study, we used 23 datasets (10 non-lame and 13 lame sheep) from an accelerometer- and gyroscope-based ear sensor with a sampling frequency of 16 Hz to develop and compare algorithms that can detect lameness within three different activities (walking, standing, and lying). We show for the first time that features extracted from accelerometer and gyroscope signals can differentiate between lame and non-lame sheep while standing, walking, and lying. The random forest algorithm performed best, classifying lameness with an accuracy of 84.91% within lying, 81.15% within standing, and 76.83% within walking, and overall correctly classified over 80% of sheep within activities. Both accelerometer- and gyroscope-based features ranked among the top 10 features for classification. Our results suggest that these novel behavioural differences between lame and non-lame sheep across all three activities could be used to develop an automated system for lameness detection.

    Feature Extraction and Random Forest to Identify Sheep Behavior from Accelerometer Data

    Sensor technologies play an essential part in the agricultural community and in many other scientific and commercial communities. Accelerometer signals and machine learning techniques can be used to identify and observe behaviours of animals without the need for exhaustive human observation, which is labour-intensive and time-consuming. This study employed the random forest algorithm to identify grazing, walking, scratching, and inactivity (standing, resting) in 8 Hebridean ewes located in Shotwick, Cheshire, in the UK. We gathered accelerometer data from a sensor device fitted on the collar of each animal. The selection of the algorithm was based on previous research in which random forest achieved the best results among benchmark techniques. Therefore, in this study, more focus was given to feature engineering to improve prediction performance. Seventeen features from the time and frequency domains were calculated from the accelerometer measurements and the magnitude of the acceleration. Feature elimination removed highly correlated features, so that only nine of the seventeen were retained. The algorithm achieved an overall accuracy of 99.43% and a kappa value of 98.66%. The accuracy for grazing, walking, scratching, and inactive was 99.08%, 99.13%, 99.90%, and 99.85%, respectively. The overall results show a significant improvement over previous methods and studies for all mutually exclusive behaviours. These results are promising, and the technique could be further tested for real-time activity recognition.
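The correlation-based feature elimination step can be sketched as a greedy pass that drops any feature highly correlated with one already kept. The threshold, ordering, and feature names below are illustrative assumptions, not the study's exact procedure:

```python
import numpy as np

def drop_correlated(X, names, threshold=0.9):
    """Greedy elimination: keep a feature only if its absolute Pearson
    correlation with every already-kept feature is below the threshold."""
    corr = np.abs(np.corrcoef(X, rowvar=False))
    keep = []
    for j in range(X.shape[1]):
        if all(corr[j, k] <= threshold for k in keep):
            keep.append(j)
    return X[:, keep], [names[k] for k in keep]

rng = np.random.default_rng(4)
n = 500
base = rng.normal(size=n)
X = np.column_stack([
    base,                                  # hypothetical "mean_x" feature
    base * 2 + rng.normal(0, 0.01, n),     # near-duplicate, should be dropped
    rng.normal(size=n),                    # independent "std_y" feature
])
X_sel, kept = drop_correlated(X, ["mean_x", "mean_x_scaled", "std_y"])
```

Pruning redundant features in this way shrinks the model's input without sacrificing information, consistent with the study retaining nine of its seventeen features.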