    Monitorización 3D de cultivos y cartografía de malas hierbas mediante vehículos aéreos no tripulados para un uso sostenible de fitosanitarios

    Get PDF
    In this doctoral thesis, UAV imagery was used to address the sustainability of plant-protection product application by generating maps that enable site-specific treatment. Two different and complementary approaches were developed to achieve this goal: 1) reducing early post-emergence herbicide application by designing treatments targeted at weed-infested zones in several herbaceous crops; and 2) three-dimensional characterization (architecture and volume) of woody crops for the design of site-specific plant-protection treatments aimed at their canopy. To address site-specific herbicide control, the configuration and technical specifications of a UAV and its on-board sensors were studied for early weed detection, contributing to the generation of maps for site-specific control in three herbaceous crops: maize, wheat, and sunflower. Next, the most accurate spectral indices for discriminating bare soil from vegetation (crop and weeds) were evaluated on UAV images taken over these crops at an early growth stage. To automate this discrimination, a threshold-calculation method was implemented in an OBIA environment. Finally, an automatic and robust OBIA methodology was developed for discriminating crop, bare soil, and weeds in the three crops studied, and the influence of different UAV image-acquisition parameters (overlap, sensor type, flight altitude, and flight scheduling, among others) on its performance was evaluated.
In addition, to facilitate the design of plant-protection treatments tailored to the needs of woody crops, an automatic and robust OBIA methodology was developed for the three-dimensional characterization (architecture and volume) of woody crops, using images and digital surface models generated from UAV imagery. The influence of different image-acquisition parameters (overlap, sensor type, flight altitude) on the performance of the designed OBIA algorithm was likewise evaluated
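The thesis describes evaluating spectral indices to separate bare soil from vegetation in early-season UAV imagery, with the threshold computed automatically inside an OBIA environment. A minimal sketch of that idea, assuming the Excess Green index and Otsu's thresholding method (the specific index and threshold algorithm are assumptions here, not taken from the abstract):

```python
import numpy as np

def excess_green(rgb):
    """Excess Green index ExG = 2g - r - b on chromatic coordinates."""
    rgb = rgb.astype(float)
    s = rgb.sum(axis=-1)
    s[s == 0] = 1.0  # avoid division by zero on black pixels
    r, g, b = (rgb[..., i] / s for i in range(3))
    return 2 * g - r - b

def otsu_threshold(values, bins=256):
    """Otsu's method: pick the threshold maximizing between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    hist = hist.astype(float)
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(hist)            # weight of the "below threshold" class
    w1 = w0[-1] - w0                # weight of the "above threshold" class
    m0 = np.cumsum(hist * centers)  # cumulative first moment
    mu0 = np.where(w0 > 0, m0 / np.maximum(w0, 1), 0)
    mu1 = np.where(w1 > 0, (m0[-1] - m0) / np.maximum(w1, 1), 0)
    between = w0 * w1 * (mu0 - mu1) ** 2
    return centers[np.argmax(between)]

# Vegetation mask: pixels whose ExG exceeds the automatic threshold.
```

A pixel-wise sketch only; the actual methodology operates on OBIA objects rather than raw pixels.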

    Selecting patterns and features for between- and within-crop-row weed mapping using UAV-imagery

    Get PDF
    This paper approaches the problem of weed mapping for precision agriculture, using imagery provided by Unmanned Aerial Vehicles (UAVs) over sunflower and maize crops. Precision agriculture applied to weed control is mainly based on the design of early post-emergence site-specific control treatments according to weed coverage, where one of the most important challenges is the spectral similarity of crop and weed pixels in early growth stages. Our work tackles this problem in the context of object-based image analysis (OBIA) by means of supervised machine learning methods combined with pattern and feature selection techniques, devising a strategy for reducing user intervention in the system without compromising accuracy. This work first proposes a method for choosing a set of training patterns via clustering techniques, so as to consider a representative set of the whole field data spectrum for the classification method. Furthermore, a feature selection method is used to obtain the best discriminating features from a set of statistics and measures of different nature. Results from this research show that the proposed method for pattern selection is suitable and leads to the construction of robust data sets. The exploitation of different statistical, spatial, and texture metrics represents a new avenue with huge potential for between- and within-crop-row weed mapping via UAV imagery and shows good synergy when complemented with OBIA. Finally, some measures (especially those linked to vegetation indices) are of great influence for weed mapping in both sunflower and maize crops.
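The training-pattern selection described above can be sketched as a clustering pass over the field samples, keeping the sample nearest each cluster centroid as a representative pattern. A minimal numpy sketch, assuming plain k-means (the abstract does not name the clustering algorithm, so k-means and the parameters below are assumptions):

```python
import numpy as np

def select_training_patterns(X, k, iters=20, seed=0):
    """Pick k representative training patterns: run k-means on the
    feature matrix X (n_samples x n_features) and return the index of
    the sample closest to each centroid."""
    X = np.asarray(X, dtype=float)
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):  # Lloyd iterations
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            members = X[labels == j]
            if len(members):  # guard against empty clusters
                centroids[j] = members.mean(axis=0)
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return np.unique(d.argmin(axis=0))  # nearest sample per centroid
```

The selected indices then seed the supervised classifier with a spread of the field's spectral variability rather than hand-picked samples.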

    Mid to Late Season Weed Detection in Soybean Production Fields Using Unmanned Aerial Vehicle and Machine Learning

    Get PDF
    Mid-to-late season weeds are those that escape early-season herbicide applications or emerge late in the season. They might not affect crop yield, but if left uncontrolled, they produce a large number of seeds, causing problems in subsequent years. In this study, high-resolution aerial imagery of mid-season weeds in soybean fields was captured using an unmanned aerial vehicle (UAV), and the performance of two automated weed detection approaches, patch-based classification and object detection, was studied for site-specific weed management. For the patch-based classification approach, several conventional machine learning models trained on Haralick texture features were compared with a MobileNet v2-based convolutional neural network (CNN) for classification performance. The results showed that the CNN model had the best classification performance on individual patches. Two image slicing approaches, patches with and without overlap, were tested, and slicing with overlap was found to improve weed detection at the cost of higher inference time. For the object detection approach, two models with different network architectures, Faster R-CNN and SSD, were evaluated and compared. Faster R-CNN had better overall weed detection performance than SSD with similar inference time, and it also had better detection performance and shorter inference time than the patch-based CNN with overlapping image slicing. The influence of spatial resolution on weed detection accuracy was investigated by simulating UAV imagery captured at different altitudes; Faster R-CNN achieved similar performance at a lower spatial resolution. The inference time of Faster R-CNN was evaluated on a regular laptop.
The results showed the potential of on-farm, near real-time weed detection in soybean production fields by capturing UAV imagery with less overlap and processing it with a pre-trained deep learning model, such as Faster R-CNN, on regular laptops and mobile devices. Advisor: Yeyin Sh
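The two image-slicing approaches compared above differ only in the stride between patches. A minimal sketch of that slicing step, where an overlap of 0.0 reproduces non-overlapping patches (patch size and overlap fraction are illustrative values, not taken from the thesis):

```python
import numpy as np

def slice_patches(image, size, overlap=0.0):
    """Slice an image into square patches of side `size`.

    `overlap` in [0, 1) is the fraction of each patch shared with its
    neighbor; 0.0 gives non-overlapping slicing.  Returns a list of
    ((row, col) top-left offset, patch) pairs.
    """
    step = max(1, int(size * (1 - overlap)))
    h, w = image.shape[:2]
    patches = []
    for y in range(0, h - size + 1, step):
        for x in range(0, w - size + 1, step):
            patches.append(((y, x), image[y:y + size, x:x + size]))
    return patches
```

Overlapping slicing yields more patches per image, which is why the abstract reports better detection but higher inference time for that variant.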


    Robots in Agriculture: State of Art and Practical Experiences

    Get PDF
    The presence of robots in agriculture has grown significantly in recent years, overcoming some of the challenges and complications of this field. This chapter aims to provide a complete and recent state of the art on the application of robots in agriculture. The work addresses this topic from two perspectives. On the one hand, it covers the disciplines leading the automation of agriculture, such as precision agriculture and greenhouse farming, and collects proposals for automating tasks like planting and harvesting, environmental monitoring, and crop inspection and treatment. On the other hand, it compiles and analyses the robots proposed to accomplish these tasks, e.g. manipulators, ground vehicles, and aerial robots. Additionally, the chapter reports in more detail on some practical experiences applying robot teams to crop inspection and treatment in outdoor agriculture, as well as to environmental monitoring in greenhouse farming.

    Site-Specific Weed Management Using Remote Sensing

    Get PDF

    Precision Weed Management Based on UAS Image Streams, Machine Learning, and PWM Sprayers

    Get PDF
    Weed populations in agricultural production fields are often scattered and unevenly distributed; however, herbicides are broadcast evenly across fields. Although effective, in the case of post-emergent herbicides, far more pesticide is used than necessary. A novel weed detection and control workflow targeting Palmer amaranth in soybean (Glycine max) fields was evaluated. High-spatial-resolution (0.4 cm) unmanned aircraft system (UAS) image streams were collected, annotated, and used to train 16 object detection convolutional neural networks (CNNs; RetinaNet, Faster R-CNN, Single Shot Detector, and YOLO v3), each trained on imagery with 0.4, 0.6, 0.8, and 1.2 cm spatial resolutions. Models were evaluated on imagery from four production fields containing approximately 7,800 weeds. The highest performing model was Faster R-CNN trained on 0.4 cm imagery (precision = 0.86, recall = 0.98, and F1-score = 0.91). A site-specific workflow leveraging the highest performing trained CNN models was evaluated in replicated field trials. Weed control (%) was compared between a broadcast treatment and the proposed site-specific workflow, which was applied using a pulse-width modulated (PWM) sprayer. Results indicate no statistical (p < .05) difference in weed control measured one (M = 96.22%, SD = 3.90 and M = 90.10%, SD = 9.96), two (M = 95.15%, SD = 5.34 and M = 89.64%, SD = 8.58), and three weeks (M = 88.55%, SD = 11.07 and M = 81.78%, SD = 13.05) after application between broadcast and site-specific treatments, respectively. Furthermore, there was a significant (p < 0.05) 48% mean reduction in applied area (m²) between broadcast and site-specific treatments across both years. Equivalent post-application efficacy can be achieved with significant reductions in herbicide use if weeds are targeted through site-specific applications. Site-specific weed maps can be generated and executed using accessible technologies like UAS, open-source CNNs, and PWM sprayers.
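A site-specific workflow like the one above ultimately turns detection boxes into a sprayed/skipped map for the PWM sprayer, from which the applied-area reduction follows directly. A minimal sketch of that rasterization step; the grid-cell size (per-nozzle footprint) and the box format are assumptions for illustration, not parameters from the study:

```python
import numpy as np

def spray_map(detections, field_shape, cell):
    """Rasterize weed detection boxes onto a grid of spray cells.

    detections: iterable of (x0, y0, x1, y1) boxes in meters.
    field_shape: (height, width) of the field in meters.
    cell: side of one spray cell in meters (assumed nozzle footprint).
    A cell is marked True (sprayed) if any box touches it.
    """
    rows = int(np.ceil(field_shape[0] / cell))
    cols = int(np.ceil(field_shape[1] / cell))
    grid = np.zeros((rows, cols), dtype=bool)
    for x0, y0, x1, y1 in detections:
        r0, r1 = int(y0 // cell), int(np.ceil(y1 / cell))
        c0, c1 = int(x0 // cell), int(np.ceil(x1 / cell))
        grid[r0:r1, c0:c1] = True
    return grid

def area_reduction(grid):
    """Fraction of field area skipped relative to a broadcast pass."""
    return 1.0 - grid.mean()
```

With sparse detections, `area_reduction` is large, which is the mechanism behind the reported 48% mean reduction in applied area versus broadcast treatment.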
