    Evaluation of laser range-finder mapping for agricultural spraying vehicles

    In this paper, we present a new application of laser range-finder sensing to agricultural spraying vehicles. The current generation of spraying vehicles uses automatic controllers to maintain the height of the sprayer booms above the crop. However, these control systems are typically based on ultrasonic sensors mounted on the booms, which limits the accuracy of the measurements and the responsiveness of the controller to changes in the terrain, resulting in a sub-optimal spraying process. To overcome these limitations, we propose to use a laser scanner, attached to the front of the sprayer's cabin, to scan the ground surface ahead of the vehicle and to build a scrolling 3D map of the terrain. We evaluate the proposed solution in a series of field tests, demonstrating that the approach provides a more detailed and accurate representation of the environment than the current sonar-based solution, which can lead to the development of more efficient boom control systems.
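
    As a rough illustration of the scrolling terrain map described in this abstract, the Python sketch below accumulates ground points from successive laser scans into a grid that scrolls as the vehicle advances. It is not the authors' implementation; the grid resolution, extent, class name and scan geometry are illustrative assumptions.

```python
import numpy as np

CELL = 0.10                 # grid resolution in metres (assumed)
GRID_X, GRID_Y = 200, 100   # cells ahead of / across the vehicle (assumed)

class ScrollingTerrainMap:
    """Toy scrolling height map built from laser range-finder ground points."""

    def __init__(self):
        self.height = np.full((GRID_X, GRID_Y), np.nan)  # unknown cells are NaN

    def scroll(self, dx_cells):
        """Shift the map backwards as the vehicle advances by dx_cells cells."""
        if dx_cells <= 0:
            return
        self.height = np.roll(self.height, -dx_cells, axis=0)
        self.height[-dx_cells:, :] = np.nan              # newly exposed, unobserved ground

    def insert_scan(self, xs, ys, zs):
        """Insert ground points (vehicle frame, metres) from one laser scan."""
        i = (xs / CELL).astype(int)                      # cells ahead of the vehicle
        j = (ys / CELL).astype(int) + GRID_Y // 2        # cells left/right of centre
        ok = (i >= 0) & (i < GRID_X) & (j >= 0) & (j < GRID_Y)
        self.height[i[ok], j[ok]] = zs[ok]               # keep the latest height per cell
```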

    Automatic detection of crop rows in maize fields with high weed pressure

    This paper proposes a new method for crop row detection in images of maize fields with high weed pressure. The vision system is designed to be installed on board a mobile agricultural vehicle and is therefore subject to gyrations, vibrations and other undesired movements. The images are captured in perspective and are affected by these undesired effects. The image processing consists of three main stages: image segmentation, double thresholding based on Otsu's method, and crop row detection. Image segmentation applies a vegetation index, double thresholding separates weeds from crops, and crop row detection fits lines by least-squares linear regression. Crop and weed separation proves effective, and the crop row detection compares favorably against the classical approach based on the Hough transform. Both gain effectiveness and accuracy thanks to the double thresholding, which constitutes the main contribution of the paper.
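
    A minimal Python sketch of the three-stage pipeline summarised above, assuming the excess-green (ExG) index for segmentation and evenly spaced vertical row bands; the function name, band layout and second-threshold interpretation (crops greener than weeds) are illustrative assumptions, not the authors' code.

```python
import numpy as np
from skimage.filters import threshold_otsu

def detect_rows(rgb, n_rows=4):
    """rgb: (h, w, 3) uint8 image; returns one fitted line (slope, intercept) per band."""
    r, g, b = [rgb[..., k].astype(float) / 255.0 for k in range(3)]
    exg = 2 * g - r - b                       # vegetation index (ExG, assumed)

    t1 = threshold_otsu(exg)                  # first threshold: vegetation vs soil
    veg = exg > t1
    t2 = threshold_otsu(exg[veg])             # second threshold: crop vs weeds
    crop = exg > t2                           # crop pixels assumed greener/denser

    lines = []
    bands = np.array_split(np.arange(rgb.shape[1]), n_rows)  # one row per band (assumed)
    for cols in bands:
        ys, xs = np.nonzero(crop[:, cols])    # crop pixels inside this band
        if len(xs) > 1:
            slope, intercept = np.polyfit(ys, xs + cols[0], 1)  # fit x = slope*y + intercept
            lines.append((slope, intercept))
    return lines
```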

    3D Map and DGPS Validation for a Vineyard Autonomous Navigation System

    An autonomous DGPS navigation system must use an accurate three-dimensional (3D) digital map. However, it is crucial to validate this map using data collected in the field. One possible way to validate the map is to employ a vehicle driven by an expert and check that the recorded trajectory lies within the boundaries of the navigation paths. This care is essential, especially when the terrain is highly uneven and small differences in position may correspond to large vertical deviations. A small navigation error can result in a serious fall, which may damage or even destroy the vehicle. In the Douro Demarcated Region, in northern Portugal, the vineyard is planted on narrow terraces built on steep hills along the winding Douro River. This paper presents the results of a dynamic trajectory survey obtained from a real navigation procedure, carried out by an expert driving an instrumented tractor during the spraying of the vineyard. The results were obtained using a DGPS (accuracy = 2 cm) and compared with an existing Digital Elevation Model (DEM) of the vineyard, with an average accuracy of 10 cm, previously created by the authors' work group. The results are shown in an interface developed in C# with OpenGL facilities, which enables viewing of the 3D vineyard details. The results confirm the validity of the methodology previously adopted for map extraction and the respective equipment selection. The trajectory of the tractor, including some maneuvers, is drawn within the inner and outer edges of each terrace or path in the vineyard. The interface can also be used as an important tool in path planning, to automatically extract the topology of the vineyard and to select the best path for carrying out vineyard tasks.
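
    As a rough illustration of the kind of check described above, the sketch below compares DGPS trajectory heights against a gridded DEM; it is not the authors' C#/OpenGL tool, and the regular DEM grid, shared coordinate frame and 10 cm tolerance are assumptions made for the example.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def trajectory_vs_dem(dem_z, dem_x, dem_y, track_xyz, tol=0.10):
    """dem_z: (ny, nx) elevations; dem_x, dem_y: grid axes; track_xyz: (n, 3) DGPS fixes."""
    dem = RegularGridInterpolator((dem_y, dem_x), dem_z)
    expected = dem(track_xyz[:, [1, 0]])          # DEM elevation under each fix (y, x order)
    dz = track_xyz[:, 2] - expected               # vertical deviation per fix
    return dz, np.abs(dz) <= tol                  # deviations and a pass/fail mask
```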

    An Autonomous Guided Field Inspection Vehicle for 3D Woody Crops Monitoring

    This paper presents a novel approach for crop monitoring and 3D reconstruction. A mobile platform, based on a commercial electric vehicle, was developed and equipped with different on-board sensors for crop monitoring. The acceleration, braking and steering systems of the vehicle were automated, and fuzzy control systems were implemented to achieve autonomous navigation. A low-cost RGB-D sensor, the Microsoft Kinect v2, and a reflex camera were installed on board the platform for the creation of 3D crop maps. The modelling of the field was fully automatic, based on algorithms for 3D reconstruction of large areas such as a complete crop row. Important information, such as the canopy volume, can be estimated from a 3D model of the crop; for that purpose, the alpha-shape algorithm was proposed. The ongoing developments presented in this paper constitute a promising tool for better crop management, increasing crop profitability while reducing agrochemical inputs and environmental impact.

    This work was financed by the Spanish Ministerio de Economía y Competitividad (AGL2014-52465-C4-3-R) and by the Spanish Agencia Estatal de Investigación (AEI) and Fondo Europeo de Desarrollo Regional (FEDER) (AGL2017-83325-C4-1-R and AGL2017-83325-C4-3-R). Karla Cantuña thanks the Cotopaxi Technical University for the remuneration granted through a service commission. The authors also wish to acknowledge the ongoing technical support of Damián Rodríguez.
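
    As a rough illustration of the alpha-shape canopy-volume idea mentioned in the abstract above, the Python sketch below keeps Delaunay tetrahedra of the crop point cloud whose circumradius is below alpha and sums their volumes. It is not the authors' implementation; the alpha value and function name are illustrative assumptions and would need tuning per crop.

```python
import numpy as np
from scipy.spatial import Delaunay

def canopy_volume(points, alpha=0.15):
    """points: (n, 3) canopy point cloud in metres; alpha in metres (assumed value)."""
    points = np.asarray(points, dtype=float)
    tet = Delaunay(points)
    total = 0.0
    for simplex in tet.simplices:
        p = points[simplex]                        # the 4 vertices of one tetrahedron
        a = p[0]
        m = 2.0 * (p[1:] - a)                      # 3x3 system giving the circumcentre
        rhs = np.sum(p[1:] ** 2 - a ** 2, axis=1)
        try:
            centre = np.linalg.solve(m, rhs)
        except np.linalg.LinAlgError:
            continue                               # degenerate (nearly flat) tetrahedron
        if np.linalg.norm(centre - a) < alpha:     # alpha test on the circumradius
            total += abs(np.linalg.det(p[1:] - a)) / 6.0
    return total
```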