
    The use of agricultural robots in weed management and control

    Weed management and control are essential for the production of high-yielding, high-quality crops, and advances in weed control technology have had a huge impact on agricultural productivity. Any effective weed control technology needs to be both robust and adaptable. Robust weed control technology successfully controls weeds despite variability in field conditions, while adaptable weed control technology can change its strategy as weed populations, genetics, and climatic conditions evolve. This chapter focuses on key work in the development of robotic weeders, including weed perception systems and weed control mechanisms. Following an extensive introduction, the chapter addresses the challenges of robotic weed control, focusing on perception systems, which detect and distinguish weed plants from crop plants, and on weed control mechanisms, covering both chemical and mechanical approaches. A case study of an automated weeding system is provided.

    Automated crop plant detection based on the fusion of color and depth images for robotic weed control

    Robotic weeding enables weed control near or within crop rows automatically, precisely, and effectively. A computer‐vision system was developed for detecting crop plants at different growth stages for robotic weed control. Fusion of color images and depth images was investigated as a means of enhancing the detection accuracy of crop plants under conditions of high weed population. In‐field images of broccoli and lettuce were acquired 3–27 days after transplanting with a Kinect v2 sensor. The image processing pipeline included data preprocessing, vegetation pixel segmentation, plant extraction, feature extraction, feature‐based localization refinement, and crop plant classification. For the detection of broccoli and lettuce, the color‐depth fusion algorithm produced high true‐positive detection rates (91.7% and 90.8%, respectively) and low average false discovery rates (1.1% and 4.0%, respectively). Mean absolute localization errors of the crop plant stems were 26.8 mm for broccoli and 7.4 mm for lettuce. The fusion of color and depth proved beneficial for segmenting crop plants from the background, improving the average segmentation success rates from 87.2% (depth‐based) and 76.4% (color‐based) to 96.6% for broccoli, and from 74.2% (depth‐based) and 81.2% (color‐based) to 92.4% for lettuce. The fusion‐based algorithm had reduced performance in detecting crop plants at early growth stages.
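    As a rough illustration of the color-depth fusion step described in this abstract, the sketch below combines an Excess Green color mask with a depth band mask using NumPy. The function name, threshold values, and the use of the Excess Green index are assumptions for illustration only; the paper's full pipeline (preprocessing, plant extraction, feature-based localization refinement, and classification) is considerably more involved.

```python
import numpy as np

def fuse_color_depth_masks(rgb, depth, depth_min=0.5, depth_max=1.2, exg_thresh=0.05):
    """Hypothetical color-depth fusion for vegetation segmentation.

    rgb   : HxWx3 float array with values in [0, 1]
    depth : HxW float array of distances in metres (0 where invalid)
    All threshold values here are illustrative placeholders, not the paper's.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    total = r + g + b + 1e-6                            # avoid division by zero
    exg = 2 * g / total - r / total - b / total         # Excess Green index
    color_mask = exg > exg_thresh                       # greenish pixels

    depth_mask = (depth > depth_min) & (depth < depth_max)  # plausible canopy range

    # Keep a pixel only when both cues agree: this suppresses non-green clutter
    # at crop height and green material outside the expected depth band.
    return color_mask & depth_mask
```

    A mask of this kind would feed the subsequent plant extraction and classification stages; the paper reports that combining the two cues raised segmentation success rates well above either cue alone.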

    Crop recognition under weedy conditions based on 3D imaging for robotic weed control

    A 3D time‐of‐flight camera was applied to develop a crop plant recognition system for broccoli and green bean plants under weedy conditions. The developed system overcame previously unsolved problems caused by occluded canopies and illumination variation. An efficient noise filter was developed to remove sparse noise points from the 3D point cloud. Both 2D and 3D features, including the gradients of the amplitude and depth images, surface curvature, amplitude percentile index, normal direction, and neighbor point count in 3D space, were extracted and found effective for recognizing these two types of plants. Separate segmentation algorithms were developed for the broccoli and green bean plants in accordance with their 3D geometry and 2D amplitude characteristics. Under the experimental conditions, in which the crops were heavily infested by various types of weed plants, detection rates over 88.3% and 91.2% were achieved for broccoli and green bean plant leaves, respectively. Additionally, the crop plants were segmented out with nearly complete shape, and the algorithms were computationally optimized, resulting in an image processing speed of over 30 frames per second.
    This is the peer-reviewed version of the following article: Li, Ji, and Lie Tang. "Crop recognition under weedy conditions based on 3D imaging for robotic weed control." Journal of Field Robotics 35, no. 4 (2018): 596-611, which has been published in final form at DOI: 10.1002/rob.21763. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving.
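    The sparse-noise filtering mentioned in this abstract can be approximated by a neighbor-count test on the point cloud. The sketch below uses SciPy's k-d tree; the radius and minimum-neighbor values, as well as the function name, are illustrative assumptions rather than the parameters used in the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_sparse_points(points, radius=0.01, min_neighbors=5):
    """Drop isolated points with too few neighbors within `radius` (metres).

    points : Nx3 array of XYZ coordinates from the time-of-flight camera.
    `radius` and `min_neighbors` are illustrative, not the paper's settings.
    """
    tree = cKDTree(points)
    # For each point, collect the indices of all points within `radius`
    # (the query includes the point itself, so subtract one below).
    neighbor_lists = tree.query_ball_point(points, r=radius)
    counts = np.array([len(n) - 1 for n in neighbor_lists])
    return points[counts >= min_neighbors]
```

    Points surviving such a test would then be passed to the per-crop segmentation and feature extraction steps described in the abstract.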