4,083 research outputs found

    Real-time Semantic Segmentation of Crop and Weed for Precision Agriculture Robots Leveraging Background Knowledge in CNNs

    Precision farming robots, which aim to reduce the amount of herbicides that need to be applied in the fields, must be able to identify crops and weeds in real time in order to trigger weeding actions. In this paper, we address the problem of CNN-based semantic segmentation of crop fields, separating sugar beet plants, weeds, and background solely based on RGB data. We propose a CNN that exploits existing vegetation indexes and provides a classification in real time. Furthermore, it can be effectively re-trained for so far unseen fields with a comparably small amount of training data. We implemented and thoroughly evaluated our system on a real agricultural robot operating in different fields in Germany and Switzerland. The results show that our system generalizes well, can operate at around 20 Hz, and is suitable for online operation in the fields.
    Comment: Accepted for publication at IEEE International Conference on Robotics and Automation 2018 (ICRA 2018)
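    The abstract does not specify which vegetation indexes the network uses, so the sketch below is only a hedged illustration of the general idea: computing a classic RGB-only index such as Excess Green (ExG) and feeding it to a segmentation CNN as an extra input channel. The function names and the choice of ExG are assumptions, not the authors' implementation.

```python
import numpy as np

def excess_green(rgb):
    """Excess Green (ExG) vegetation index from an RGB image.

    rgb: float array of shape (H, W, 3) scaled to [0, 1].
    Returns an (H, W) map; higher values indicate more vegetation.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    total = r + g + b + 1e-6                # avoid division by zero on dark pixels
    r_n, g_n, b_n = r / total, g / total, b / total
    return 2.0 * g_n - r_n - b_n

def stack_index_channel(rgb):
    """Append the ExG map as a fourth input channel for a segmentation network."""
    exg = excess_green(rgb)
    return np.concatenate([rgb, exg[..., None]], axis=-1)  # (H, W, 4)

# Example: a random image stands in for a field photo.
dummy = np.random.rand(480, 640, 3).astype(np.float32)
print(stack_index_channel(dummy).shape)  # (480, 640, 4)
```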

    Vision-based weed identification with farm robots

    Robots in agriculture offer new opportunities for real-time weed identification and quick removal operations. Weed identification and control remains one of the most challenging tasks in agriculture, particularly in organic farming practices. Because of environmental impacts and food quality concerns, the use of chemicals for controlling weeds and diseases is decreasing, and the cost of herbicides and their field application must be optimized. As an alternative, a smart weed identification technique followed by mechanical and thermal weed control can fulfill organic farmers' expectations. The smart identification technique works on the concepts of 'shape matching' and 'active shape modeling' of plant and weed leaves. The automated weed detection and control system consists of three major tools: i) an eXcite multispectral camera, ii) the LTI image processing library and iii) the Hortibot robotic vehicle. The components are combined in a Linux environment on the PC associated with the eXcite camera. Laboratory experiments on active shape matching have shown promising results, which will be further refined to develop the automated weed detection system. The camera unit will be mounted at the front of the Hortibot robot and the mechanical weed remover at the rear. The system will then be upgraded for intensive commercial applications in maize and other row crops.
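    The abstract describes the identification step only at the level of leaf 'shape matching'. The hedged sketch below illustrates one common way to realise such a primitive, comparing leaf contours via Hu moments; OpenCV 4 is used as a stand-in for the LTI library, and the helper names are hypothetical.

```python
import cv2
import numpy as np

def leaf_contour(mask):
    """Largest contour in a binary leaf mask (uint8 image with values 0/255)."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea)

def shape_distance(mask_a, mask_b):
    """Hu-moment based shape dissimilarity; smaller means more similar shapes."""
    return cv2.matchShapes(leaf_contour(mask_a), leaf_contour(mask_b),
                           cv2.CONTOURS_MATCH_I1, 0.0)

# Toy example: an elongated 'crop leaf' template versus a round 'weed leaf'.
crop = np.zeros((200, 200), np.uint8)
cv2.ellipse(crop, (100, 100), (80, 25), 0, 0, 360, 255, -1)
weed = np.zeros((200, 200), np.uint8)
cv2.circle(weed, (100, 100), 60, 255, -1)
print(shape_distance(crop, crop))  # ~0.0, identical shapes
print(shape_distance(crop, weed))  # larger value, different shapes
```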

    Remote Sensing for Site-Specific Crop Management: Evaluating the Potential of Digital Multi-Spectral Imagery for Monitoring Crop Variability and Weeds within Paddocks

    This paper analyses the potential and limitations of airborne remote sensing systems for detecting crop growth variability and weed infestation within paddocks at specified capture times. The detection of areas of crop growth variability can help farmers become aware of regions within their paddock where they may be experiencing above or below average yields due to changes in soil or management conditions. In addition, the early detection of weed infestation within cereal crops is crucial for lessening its impact on the final yield. Transect sampling within a canola paddock of a broad-acre agricultural property in the South West of Western Australia was conducted synchronously with the capture of 1 m spatial resolution digital multi-spectral imagery (DMSI). The four individual bands (blue, green, red and near-infrared) of the DMSI were correlated with leaf area index (LAI) and weed density counts collected in the paddock. Statistical analyses show that the LAI of canola had strong negative correlations with the blue (-0.93) and red (-0.89) bands and a strong positive correlation with the near-infrared band (0.82). The strong correlations between the canola LAI and selected bands of the DMSI indicate that this may be a suitable technique for monitoring canola variability to derive information layers that can be used in creating meaningful "within-field" management units. Likewise, DMSI could be used as a non-invasive tool for in-season crop monitoring. The correlation analysis with weed density (e.g. self-sown wheat, ryegrass and clover) yielded only one weak negative correlation, with the red band (-0.38). The less successful detection of weeds is attributed to the low weed density within the paddock (mean 34 plants m⁻²) and the indistinct spectral difference from canola at the early imagery capture time required by farmers for effective variable-rate applications of herbicides.
    Keywords: LAI, remote sensing, crop density, vegetation indices, weed mapping, Crop Production/Industries
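    As a hedged illustration of the band-versus-LAI correlation analysis behind the figures quoted above, the snippet below computes Pearson coefficients with SciPy. The sample values are invented for demonstration and are not the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-plot transect samples: mean band reflectance and measured LAI.
bands = {
    "blue": np.array([0.08, 0.07, 0.09, 0.06, 0.10]),
    "red":  np.array([0.10, 0.09, 0.11, 0.08, 0.12]),
    "nir":  np.array([0.45, 0.50, 0.40, 0.55, 0.38]),
}
lai = np.array([2.1, 2.6, 1.8, 3.0, 1.5])

# Pearson r between each DMSI band and LAI, the statistic reported in the abstract.
for name, reflectance in bands.items():
    r, p = pearsonr(reflectance, lai)
    print(f"{name}: r = {r:+.2f} (p = {p:.3f})")
```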

    Unmanned Aerial Vehicles (UAVs) in environmental biology: A Review

    Acquiring information about the environment is a key step in every study in the field of environmental biology, at all levels from an individual species to community and biome. However, obtaining this information is frequently difficult because of, for example, phenological timing, the spatial distribution of a species or the limited accessibility of a particular area for field survey. Moreover, conventional remote sensing technology, which enables observation of the Earth's surface and is currently very common in environmental research, has many limitations, such as insufficient spatial, spectral and temporal resolution and a high cost of data acquisition. Since the 1990s, researchers have been exploring the potential of different types of unmanned aerial vehicles (UAVs) for monitoring the Earth's surface. The present study reviews recent scientific literature dealing with the use of UAVs in environmental biology. Amongst numerous papers, short communications and conference abstracts, we selected 110 original studies of how UAVs can be used in environmental biology and which organisms can be studied in this manner. Most of these studies concerned the use of UAVs to measure vegetation parameters such as crown height, volume and number of individuals (14 studies) and to quantify the spatio-temporal dynamics of vegetation changes (12 studies). UAVs were also frequently applied to count birds and mammals, especially those living in the water. Generally, the analytical part of the present study was divided into the following sections: (1) detecting, assessing and predicting threats to vegetation, (2) measuring the biophysical parameters of vegetation, (3) quantifying the dynamics of changes in plants and habitats and (4) population and behaviour studies of animals. At the end, we also synthesised all the information to show, amongst other things, the advances in environmental biology made possible by UAV applications. Considering that 33% of the studies found and included in this review were published in 2017 and 2018, the number and variety of applications of UAVs in environmental biology are expected to increase in the future.

    Automatic Model Based Dataset Generation for Fast and Accurate Crop and Weeds Detection

    Selective weeding is one of the key challenges in the field of agricultural robotics. To accomplish this task, a farm robot should be able to accurately detect plants and to distinguish between crop and weeds. Most of the promising state-of-the-art approaches make use of appearance-based models trained on large annotated datasets. Unfortunately, creating large agricultural datasets with pixel-level annotations is an extremely time-consuming task, which in practice penalizes the use of data-driven techniques. In this paper, we address this problem by proposing a novel and effective approach that aims to dramatically reduce the human intervention needed to train the detection and classification algorithms. The idea is to procedurally generate large synthetic training datasets by randomizing the key features of the target environment (i.e., crop and weed species, type of soil, light conditions). More specifically, by tuning these model parameters, and by exploiting a few real-world textures, it is possible to render a large number of realistic views of an artificial agricultural scenario with little effort. The generated data can be used directly to train the model or to supplement real-world images. We validate the proposed methodology by using a modern deep-learning-based image segmentation architecture as a testbed. We compare the classification results obtained using both real and synthetic images as training data. The reported results confirm the effectiveness and the potential of our approach.
    Comment: To appear in IEEE/RSJ IROS 201
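    The authors' pipeline renders textured views of a modelled field; the hedged 2D sketch below only mimics the underlying domain-randomization idea, producing image/label pairs procedurally with no manual annotation. Class ids, blob geometry and every parameter range are invented for illustration and are not the paper's model.

```python
import numpy as np

RNG = np.random.default_rng(0)
SOIL, CROP, WEED = 0, 1, 2          # illustrative semantic class ids

def render_scene(h=256, w=256, n_plants=12):
    """Render one synthetic top-down 'field' image and its pixel-level label mask.

    Soil is a noisy brownish background; plants are randomly placed and sized
    green blobs whose class (crop vs. weed) and colour vary per instance.
    """
    base = RNG.uniform(0.25, 0.5)   # crude lighting / soil-type randomization
    img = np.stack([np.full((h, w), base * c) for c in (1.0, 0.8, 0.6)], axis=-1)
    img += RNG.normal(0, 0.02, img.shape)
    labels = np.full((h, w), SOIL, np.uint8)

    yy, xx = np.mgrid[0:h, 0:w]
    for _ in range(n_plants):
        cls = CROP if RNG.random() < 0.5 else WEED
        cy, cx = RNG.integers(0, h), RNG.integers(0, w)
        radius = RNG.integers(6, 20)
        blob = (yy - cy) ** 2 + (xx - cx) ** 2 < radius ** 2
        green = 0.6 + 0.3 * RNG.random() if cls == CROP else 0.3 + 0.2 * RNG.random()
        img[blob] = [0.1, green, 0.1]
        labels[blob] = cls
    return np.clip(img, 0.0, 1.0), labels

# Generate a small synthetic training set of image/label pairs.
dataset = [render_scene() for _ in range(100)]
print(dataset[0][0].shape, dataset[0][1].shape)  # (256, 256, 3) (256, 256)
```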