    Multi-Pig Part Detection and Association with a Fully-Convolutional Network

    Computer vision systems have the potential to provide automated, non-invasive monitoring of livestock animals; however, the lack of public datasets with well-defined targets and evaluation metrics presents a significant challenge for researchers. Consequently, existing solutions often focus on achieving task-specific objectives using relatively small, private datasets. This work introduces a new dataset and method for instance-level detection of multiple pigs in group-housed environments. The method uses a single fully-convolutional neural network to detect the location and orientation of each animal, where both body part locations and pairwise associations are represented in the image space. Accompanying this method is a new dataset containing 2000 annotated images with 24,842 individually annotated pigs from 17 different locations. The proposed method achieves over 99% precision and over 96% recall when detecting pigs in environments previously seen by the network during training. To evaluate the robustness of the trained network, it is also tested on environments and lighting conditions unseen in the training set, where it achieves 91% precision and 67% recall. The dataset is publicly available for download.
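
    A minimal sketch of the two-headed fully-convolutional design described above, with one image-space head for body-part confidence maps and one for pairwise part associations. The layer sizes, part count, and link count are illustrative assumptions, not the authors' architecture.

    ```python
    # Sketch of a fully-convolutional detector with two image-space heads:
    # part-location heatmaps and pairwise association maps (assumed layout).
    import torch
    import torch.nn as nn

    class PartAssociationFCN(nn.Module):
        def __init__(self, num_parts=3, num_links=2):
            super().__init__()
            # Shared convolutional backbone (downsamples by a factor of 4).
            self.backbone = nn.Sequential(
                nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            )
            # One confidence channel per body part (e.g., head, mid, tail).
            self.part_head = nn.Conv2d(64, num_parts, 1)
            # One 2D association vector per part pair, stored as 2 channels.
            self.assoc_head = nn.Conv2d(64, num_links * 2, 1)

        def forward(self, x):
            feats = self.backbone(x)
            return self.part_head(feats), self.assoc_head(feats)

    parts, assoc = PartAssociationFCN()(torch.randn(1, 3, 256, 256))
    print(parts.shape, assoc.shape)  # (1, 3, 64, 64) and (1, 4, 64, 64)
    ```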

    Analysis of a 3D Scanning System Based on Kinect Technology for Agricultural Machinery Components

    This study analyses a 3D scanning system for agricultural machinery components based on Kinect technology. The Kinect is expected to be able to scan and capture 3D images of agricultural machine components, and the analysis assesses its scanning accuracy for these components. The 3D scanner uses the Kinect as its main sensor. The Kinect comprises an infrared sensor, a depth-image CMOS sensor, and a colour-image CMOS sensor; all of them scan the object in 3D, and the data are processed by the Kinect's processor and then passed to software on the data-processing unit, which produces a 3D scan file. Tests on components of various sizes showed that components smaller than 7.5 cm in diameter and 10 cm in height could not be scanned, while components larger than 10 cm in diameter and 15 cm in height could be scanned with reasonably good accuracy.
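
    As a concrete illustration of the 3D scanning step, the sketch below back-projects a Kinect-style depth frame into a point cloud using the standard pinhole-camera formulas; the intrinsic parameters are placeholder values, not calibrated Kinect constants.

    ```python
    # Hedged sketch: depth frame (metres) -> 3D point cloud via pinhole
    # back-projection. fx, fy, cx, cy are assumed, uncalibrated intrinsics.
    import numpy as np

    def depth_to_point_cloud(depth_m, fx=580.0, fy=580.0, cx=320.0, cy=240.0):
        h, w = depth_m.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        z = depth_m
        x = (u - cx) * z / fx              # back-project column index
        y = (v - cy) * z / fy              # back-project row index
        pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
        return pts[pts[:, 2] > 0]          # drop pixels with no depth reading

    cloud = depth_to_point_cloud(np.full((480, 640), 1.2))
    print(cloud.shape)  # (307200, 3)
    ```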

    Recording behaviour of indoor-housed farm animals automatically using machine vision technology: a systematic review

    Large-scale phenotyping of animal behaviour traits is time consuming and has led to increased demand for technologies that can automate these procedures. Automated tracking of animals has been successful in controlled laboratory settings, but recording from animals in large groups in highly variable farm settings presents challenges. The aim of this review is to provide a systematic overview of the advances that have occurred in automated, high-throughput image detection of farm animal behavioural traits with welfare and production implications. Peer-reviewed publications written in English were reviewed systematically following Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. After identification, screening, and assessment for eligibility, 108 publications met these specifications and were included for qualitative synthesis. Data collected from the papers included camera specifications, housing conditions, group size, algorithm details, procedures, and results. Most studies utilized standard digital colour video cameras for data collection, with increasing use of 3D cameras in papers published after 2013. Papers including pigs (across production stages) were the most common (n = 63). The most common behaviours recorded included activity level, area occupancy, aggression, gait scores, resource use, and posture. Our review revealed many overlaps in the methods applied to analysing behaviour, and most studies started from scratch instead of building upon previous work. Training and validation sample sizes were generally small (mean ± s.d. groups = 3.8 ± 5.8), and data collection and testing took place in relatively controlled environments. To advance our ability to automatically phenotype behaviour, future research should build upon existing knowledge and validate technology under commercial settings, and publications should explicitly describe recording conditions in detail to allow studies to be reproduced.

    Evaluation of a novel computer vision-based livestock monitoring system to identify and track specific behaviors of individual nursery pigs within a group-housed environment

    Animal behavior is indicative of health status, and changes in behavior can indicate health issues (i.e., illness, stress, or injury). Currently, human observation (HO) is the only method for detecting behavior changes that may indicate problems in group-housed pigs. While HO is effective, it has limitations: it is time consuming, it can alter the natural behaviors being observed, and it cannot be maintained continuously. To address these limitations, a computer vision platform (NUtrack) was developed to identify (ID) and continuously monitor specific behaviors of group-housed pigs on an individual basis. The objectives of this study were to evaluate the capabilities of the NUtrack system and to evaluate changes in behavior patterns of group-housed nursery pigs over time. The NUtrack system was installed above four nursery pens to monitor the behavior of 28 newly weaned pigs during a 42-d nursery period. Pigs were stratified by sex and litter and randomly assigned to one of two pens (14 pigs/pen) for the first 22 d. On day 23, pigs were split into four pens (7 pigs/pen). To evaluate the NUtrack system's capabilities, 800 video frames containing 11,200 individual observations were randomly selected across the nursery period. Each frame was visually evaluated to verify the NUtrack system's accuracy for ID and classification of behavior. The NUtrack system achieved an overall ID accuracy of 95.6%. ID accuracy was 93.5% during the first 22 d and increased (P < 0.001) to 98.2% for the final 20 d. Of the ID errors, 72.2% were due to mislabeled ID and 27.8% were due to loss of ID. The NUtrack system classified lying, standing, walking, at the feeder (ATF), and at the waterer (ATW) behaviors accurately at rates of 98.7%, 89.7%, 88.5%, 95.6%, and 79.9%, respectively. Behavior data indicated that the time budget for lying, standing, and walking in nursery pigs was 77.7% ± 1.6%, 8.5% ± 1.1%, and 2.9% ± 0.4%, respectively. In addition, behavior data indicated that nursery pigs spent 9.9% ± 1.7% and 1.0% ± 0.3% of their time ATF and ATW, respectively. Results suggest that the NUtrack system can detect, identify, maintain ID of, and classify specific behavior of group-housed nursery pigs for the duration of the 42-d nursery period. Overall, results suggest that, with continued research, the NUtrack system may provide a viable real-time precision livestock tool with the ability to assist producers in monitoring behaviors and potential changes in the behavior of group-housed pigs.
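
    The time budgets reported above are simple frame-count proportions; the sketch below shows one way such per-pig budgets could be computed from frame-level behaviour labels. The data layout is an assumption, since the abstract does not describe the NUtrack system's output format.

    ```python
    # Hypothetical (pig_id, behaviour) records, one per sampled video frame.
    from collections import Counter

    frames = [(1, "lying"), (1, "lying"), (1, "standing"),
              (2, "lying"), (2, "walking"), (2, "at_feeder")]

    def time_budget(frames, pig_id):
        """Percentage of observed frames spent in each behaviour."""
        labels = [b for p, b in frames if p == pig_id]
        return {b: 100.0 * n / len(labels) for b, n in Counter(labels).items()}

    print(time_budget(frames, 1))  # {'lying': 66.66..., 'standing': 33.33...}
    ```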

    Evaluation of low-cost depth cameras for agricultural applications

    Low-cost depth cameras have been used in many agricultural applications, with reported advantages of low cost, reliability, and speed of measurement. However, some problems have also been reported and appear to be technology-related, so understanding the limitations of each type of depth camera technology could provide a basis for technology selection and for the development of research involving its use. The cameras use one, or a combination of two, of the three available technologies: structured light, time-of-flight (ToF), and stereoscopy. The objectives were to evaluate these different technologies for depth sensing, including the accuracy and repeatability of distance data and of measurements at different positions within the image, and the cameras' usefulness in indoor and outdoor settings. The cameras were then tested in a swine facility and in a corn field. Five different cameras were used: (1) Microsoft Kinect v.1, (2) Microsoft Kinect v.2, (3) Intel® RealSense™ Depth Camera D435, (4) ZED Stereo Camera (StereoLabs), and (5) CamBoard Pico Flexx (PMD Technologies). Results indicate that there were significant camera-to-camera differences for the ZED Stereo Camera and the Kinect v.1 (p < 0.05). All cameras showed an increase in standard deviation as the distance between camera and object increased; however, the Intel RealSense camera had a larger increase. Time-of-flight cameras had the smallest error between different sizes of objects, but had non-readable zones at the corners of the images. The results indicate that ToF is the best technology for indoor applications and stereoscopy the best technology for outdoor applications.
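
    The accuracy and repeatability comparison described above reduces to two per-distance statistics: the mean offset from the known target distance (bias) and the spread of repeated readings (standard deviation). A small sketch with made-up readings:

    ```python
    # Illustrative depth readings (metres) of a flat target at known distances.
    import numpy as np

    readings = {1.0: [0.99, 1.01, 1.00, 0.98],
                2.0: [2.03, 1.97, 2.05, 2.02]}

    for true_d, vals in readings.items():
        vals = np.asarray(vals)
        bias = vals.mean() - true_d        # accuracy: systematic offset
        spread = vals.std(ddof=1)          # repeatability: sample std dev
        print(f"{true_d:.1f} m: bias = {bias:+.3f} m, sd = {spread:.3f} m")
    ```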

    Ground and Aerial Robots for Agricultural Production: Opportunities and Challenges

    Crop and animal production techniques have changed significantly over the last century. In the early 1900s, animal power was replaced by tractor power, which resulted in tremendous improvements in field productivity and subsequently laid the foundation for mechanized agriculture. While precision agriculture has enabled site-specific management of crop inputs for improved yields and quality, precision livestock farming has boosted efficiencies in the animal and dairy industries. By 2020, highly automated systems were being employed in crop and animal agriculture to increase input efficiency and agricultural output with reduced adverse impact on the environment. Ground and aerial robots combined with artificial intelligence (AI) techniques have the potential to tackle the rising food, fiber, and fuel demands of a rapidly growing population slated to reach around 10 billion by the year 2050. This Issue Paper presents the opportunities provided by ground and aerial robots for improved crop and animal production, and the challenges that could potentially limit their progress and adoption. A summary of enabling factors that could drive the deployment and adoption of robots in agriculture is also presented, along with insights into the training needs of the workforce who will be involved in next-generation agriculture.

    Seven Years after the Manifesto: Literature Review and Research Directions for Technologies in Animal Computer Interaction

    As technologies diversify and become embedded in everyday life, both the technologies we expose animals to and the new technologies being developed for animals within the field of Animal Computer Interaction (ACI) are increasing. As we approach seven years since the ACI manifesto, which grounded the field within Human-Computer Interaction and Computer Science, this thematic literature review examines the technologies developed for (non-human) animals. The technologies analysed include tangible and physical interfaces, haptic and wearable devices, olfactory interfaces, screen technology, and tracking systems. The discussion explores what exactly ACI is while questioning what it means to be animal, by considering the impact of, and feedback loop between, machine and animal interactivity. The findings of this review are expected to form a grounding foundation for ACI technologies, informing future research in animal computing and suggesting future areas for exploration.

    Ensuring Agricultural Sustainability through Remote Sensing in the Era of Agriculture 5.0

    This work was supported by the projects "VIRTUOUS", funded by the European Union's Horizon 2020 Project H2020-MSCA-RISE-2019 (Ref. 872181); "SUSTAINABLE", funded by the European Union's Horizon 2020 Project H2020-MSCA-RISE-2020 (Ref. 101007702); and the "Project of Excellence" from Junta de Andalucia 2020 (Ref. P18-H0-4700). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

    Timely and reliable information about crop management, production, and yield is considered of great utility by stakeholders (e.g., national and international authorities, farmers, commercial units, etc.) to ensure food safety and security. By 2050, according to Food and Agriculture Organization (FAO) estimates, around 70% more production of agricultural products will be needed to fulfil the demands of the world population. Likewise, to meet the Sustainable Development Goals (SDGs), especially the second goal of “zero hunger”, potential technologies like remote sensing (RS) need to be efficiently integrated into agriculture. The application of RS is indispensable today for highly productive and sustainable agriculture. The present study therefore draws a general overview of RS technology, with a special focus on its principal platforms, i.e., satellites and remotely piloted aircraft (RPAs), and the sensors used, in relation to the 5th industrial revolution. RS technology has found applications in agriculture through satellite imagery since 1957, and was later enriched by the incorporation of RPAs, which are further pushing the boundaries of proficiency through upgraded sensors capable of higher spectral, spatial, and temporal resolutions. More prominently, wireless sensor technologies (WST) have streamlined real-time information acquisition and the programming of respective measures. Improved algorithms and sensors can not only add significant value to crop data acquisition but can also devise simulations of yield, harvesting and irrigation periods, meteorological data, etc., by making use of cloud computing. RS technology generates huge sets of data that necessitate the incorporation of artificial intelligence (AI) and big data to extract useful products, thereby augmenting the adeptness and efficiency of agriculture to ensure its sustainability. These technologies have made it possible to orient current research towards the estimation of plant physiological traits rather than structural parameters. Futuristic approaches for benefiting from these cutting-edge technologies are discussed in this study. This study can be helpful for researchers, academics, and young students aspiring to play a role in the achievement of sustainable agriculture.
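
    As a small, concrete example of the kind of spectral product these platforms deliver, the sketch below computes the widely used NDVI, (NIR - Red) / (NIR + Red), per pixel from two co-registered reflectance bands; the band values are synthetic.

    ```python
    # NDVI from synthetic red and near-infrared reflectance bands.
    import numpy as np

    red = np.array([[0.10, 0.20], [0.30, 0.05]])
    nir = np.array([[0.60, 0.55], [0.35, 0.50]])

    ndvi = (nir - red) / (nir + red + 1e-9)   # epsilon avoids divide-by-zero
    print(ndvi.round(2))  # values near +1 suggest dense green vegetation
    ```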