265 research outputs found

    Orchard mapping and mobile robot localisation using on-board camera and laser scanner data fusion

    Agricultural mobile robots have great potential to effectively implement different agricultural tasks. They can save human labour costs, avoid the need for people to perform risky operations and increase productivity. Automation and advanced sensing technologies can provide up-to-date information that helps farmers in orchard management. Data collected from on-board sensors on a mobile robot provide information that can help the farmer detect tree or fruit diseases or damage, measure tree canopy volume and monitor fruit development. In orchards, trees are natural landmarks providing suitable cues for mobile robot localisation and navigation, as trees are nominally planted in straight and parallel rows. This thesis presents a novel tree trunk detection algorithm that detects trees and discriminates between trees and non-tree objects in the orchard using camera and 2D laser scanner data fusion. A local orchard map of the individual trees was developed, allowing the mobile robot to navigate to a specific tree in the orchard to perform a specific task such as tree inspection. Furthermore, this thesis presents a localisation algorithm that does not rely on GPS positions and depends only on the on-board sensors of the mobile robot, without adding any artificial landmarks, reflective tapes or tags to the trees. The novel tree trunk detection algorithm combined features extracted from a low-cost camera's images and 2D laser scanner data to increase the robustness of the detection. The developed algorithm used a new method to detect the edge points and determine the width of the tree trunks and non-tree objects from the laser scan data. Then a projection of the edge points from the laser scanner coordinates to the image plane was implemented to construct a region of interest with the required features for tree trunk colour and edge detection. The camera images were used to verify the colour and the parallel edges of the tree trunks and non-tree objects.
The algorithm automatically adjusted the colour detection parameters after each test, which was shown to increase the detection accuracy. The orchard map was constructed based on tree trunk detection and consisted of the 2D positions of the individual trees and non-tree objects. The map of the individual trees was used as an a priori map for mobile robot localisation. A data fusion algorithm based on an Extended Kalman filter was used for pose estimation of the mobile robot along different paths (midway between rows, close to the rows and moving around trees in the row) and different turns (semi-circle and right-angle turns) required for tree inspection tasks. The 2D positions of the individual trees were used in the correction step of the Extended Kalman filter to enhance localisation accuracy. Experimental tests were conducted in a simulated environment and a real orchard to evaluate the performance of the developed algorithms. The tree trunk detection algorithm was evaluated under two broad illumination conditions (sunny and cloudy). The algorithm was able to detect the tree trunks (regular and thin tree trunks) and discriminate between trees and non-tree objects with a detection accuracy of 97%, showing that the fusion of vision and 2D laser scanner technologies produced robust tree trunk detection. The mapping method successfully localised all the trees and non-tree objects of the tested tree rows in the orchard environment. The mapping results indicated that the constructed map can be reliably used for mobile robot localisation and navigation. The localisation algorithm was evaluated against logged RTK-GPS positions for different paths and headland turns. The averages of the RMS position errors in the x and y coordinates and in Euclidean distance were 0.08 m, 0.07 m and 0.103 m respectively, whilst the average RMS heading error was 3.32°.
These results were considered acceptable while driving along the rows and when executing headland turns for the target application of autonomous mobile robot navigation and tree inspection tasks in orchards.
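
The landmark-based correction step described in this abstract can be illustrated with a short sketch. The snippet below applies one Extended Kalman filter correction to a 2D robot pose using a range-bearing observation of a tree whose position comes from the orchard map; the function name, state layout and noise values are illustrative assumptions, not the thesis implementation.

```python
import numpy as np

def ekf_correct(mu, Sigma, z, tree_xy, R):
    """One EKF correction step using a detected tree trunk as a landmark.

    mu      : (3,) robot pose estimate [x, y, theta]
    Sigma   : (3,3) pose covariance
    z       : (2,) measured [range, bearing] to the tree (e.g. from the laser scanner)
    tree_xy : (2,) known 2D tree position from the a priori orchard map
    R       : (2,2) measurement noise covariance
    """
    dx, dy = tree_xy[0] - mu[0], tree_xy[1] - mu[1]
    q = dx**2 + dy**2
    r = np.sqrt(q)
    # Expected range-bearing measurement from the current pose estimate
    z_hat = np.array([r, np.arctan2(dy, dx) - mu[2]])
    # Jacobian of the measurement model w.r.t. the pose
    H = np.array([[-dx / r, -dy / r,  0.0],
                  [ dy / q, -dx / q, -1.0]])
    # Innovation, with the bearing wrapped to [-pi, pi]
    nu = z - z_hat
    nu[1] = (nu[1] + np.pi) % (2 * np.pi) - np.pi
    S = H @ Sigma @ H.T + R
    K = Sigma @ H.T @ np.linalg.inv(S)
    mu_new = mu + K @ nu
    Sigma_new = (np.eye(3) - K @ H) @ Sigma
    return mu_new, Sigma_new
```

Each detected tree in the a priori map would trigger one such correction, shrinking the pose covariance as the robot drives along the row.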

    A LiDAR-Inertial SLAM Tightly-Coupled with Dropout-Tolerant GNSS Fusion for Autonomous Mine Service Vehicles

    Multi-modal sensor integration has become a crucial prerequisite for real-world navigation systems. Recent studies have reported successful deployment of such systems in many fields. However, navigation in mine scenes remains challenging due to satellite signal dropouts, degraded perception, and observation degeneracy. To solve this problem, we propose a LiDAR-inertial odometry method in this paper, utilizing both a Kalman filter and graph optimization. The front-end consists of multiple LiDAR-inertial odometries running in parallel, where the laser points, IMU, and wheel odometer information are tightly fused in an error-state Kalman filter. Instead of the commonly used feature points, we employ surface elements (surfels) for registration. The back-end constructs a pose graph and jointly optimizes the pose estimation results from the inertial, LiDAR odometry, and global navigation satellite system (GNSS) measurements. Since the vehicle operates for a long time inside the tunnel, the large accumulated drift may not be fully corrected by the GNSS measurements. We hereby leverage a loop-closure-based re-initialization process to achieve full alignment. In addition, the system robustness is improved through handling of data loss, stream consistency, and estimation error. The experimental results show that our system has a good tolerance to long-period degeneracy through the cooperation of different LiDARs and surfel registration, achieving meter-level accuracy even over tens of minutes of operation during GNSS dropouts.
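
As a rough illustration of the error-state Kalman filtering used in such front-ends, the sketch below propagates a 1D nominal state with inertial input and corrects it with a position measurement (standing in for LiDAR registration); all names and noise values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

class ErrorStateKF1D:
    """Minimal 1D error-state Kalman filter sketch (position + velocity).

    The nominal state is propagated by dead-reckoning with the raw
    inertial input; the filter estimates only the small error around it
    and injects the correction back into the nominal state, as in
    tightly coupled LiDAR-inertial front-ends.
    """
    def __init__(self, q=1e-3, r=1e-2):
        self.x_nom = np.zeros(2)      # nominal [position, velocity]
        self.P = np.eye(2) * 0.1      # error-state covariance
        self.Q = np.eye(2) * q        # process noise (illustrative)
        self.R = r                    # position measurement noise (illustrative)

    def predict(self, accel, dt):
        # Propagate the nominal state with the inertial input
        self.x_nom[0] += self.x_nom[1] * dt + 0.5 * accel * dt**2
        self.x_nom[1] += accel * dt
        F = np.array([[1.0, dt], [0.0, 1.0]])   # error-state transition
        self.P = F @ self.P @ F.T + self.Q

    def correct(self, pos_meas):
        # Position measurement, e.g. from scan registration
        H = np.array([[1.0, 0.0]])
        S = H @ self.P @ H.T + self.R
        K = (self.P @ H.T) / S
        dx = K.flatten() * (pos_meas - self.x_nom[0])
        self.x_nom += dx              # inject the estimated error, then reset it
        self.P = (np.eye(2) - K @ H) @ self.P
```

The same structure extends to the full 3D case (attitude, bias and extrinsic errors), which is where the error-state formulation pays off.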

    Real-time performance-focused on localisation techniques for autonomous vehicle: a review


    Characterization of a mobile mapping system for seamless navigation

    Mobile Mapping Systems (MMS) are multi-sensor technologies based on a SLAM procedure, which provides accurate 3D measurement and mapping of the environment as well as trajectory estimation for autonomous navigation. The major limits of these algorithms are the inconsistency of navigation and mapping over time and the georeferencing of the products. These issues are particularly relevant for pose estimation regardless of the environment, as in seamless navigation. This paper is a preliminary analysis of a proposed multi-sensor platform integrated as an indoor/outdoor seamless positioning system. In particular, the work is devoted to analyzing the performance of the MMS in terms of positioning accuracy and to evaluating its improvement with the integration of GNSS and UWB technology. The results show that, if the GNSS and UWB signals are not degraded, by assigning the correct weights to their observations in the Stencil estimation algorithm it is possible to improve the accuracy of the MMS navigation solution as well as the global consistency of the final point cloud. This improvement is about 7 cm for the planimetric coordinates and 34 cm in elevation with respect to the use of the Stencil system alone. (Di Pietra, V.; Grasso, N.; Piras, M.; Dabove, P.)
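
The idea of weighting GNSS and UWB observations by their expected accuracy can be sketched minimally as inverse-variance fusion; the function and the variance values in the usage example are illustrative assumptions, not the Stencil algorithm itself.

```python
import numpy as np

def fuse_positions(obs, sigmas):
    """Inverse-variance weighted fusion of position observations.

    A minimal sketch of weighting heterogeneous fixes (e.g. GNSS and
    UWB) by their expected accuracy before feeding them to a trajectory
    estimator.

    obs    : (n, 3) array of [x, y, z] position observations
    sigmas : (n,) per-observation standard deviations (metres)
    """
    w = 1.0 / np.asarray(sigmas, dtype=float) ** 2   # inverse-variance weights
    w = w / w.sum()                                  # normalise so weights sum to 1
    return (np.asarray(obs, dtype=float) * w[:, None]).sum(axis=0)
```

With equal sigmas this reduces to a plain average; a more precise sensor (smaller sigma) pulls the fused position toward its observation, which is the behaviour the paper exploits when the GNSS and UWB signals are not degraded.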

    Proceedings of the 4th field robot event 2006, Stuttgart/Hohenheim, Germany, 23-24th June 2006

    A very extensive report of the 4th Field Robot Event, held on 23 and 24 June 2006 in Stuttgart/Hohenheim.

    SEAL: Simultaneous Exploration and Localization in Multi-Robot Systems

    The availability of accurate localization is critical for multi-robot exploration strategies; noisy or inconsistent localization causes failure to meet exploration objectives. We aim to achieve high localization accuracy with the contemporary exploration map belief, and vice versa, without needing global localization information. This paper proposes a novel simultaneous exploration and localization (SEAL) approach, which uses Gaussian Process (GP)-based information fusion for maximum exploration while performing communication graph optimization for relative localization. These two cross-dependent objectives are integrated through the Rao-Blackwellization technique. Distributed linearized convex hull optimization is used to select the next-best unexplored region for distributed exploration. SEAL outperformed cutting-edge methods in exploration and localization performance in extensive ROS-Gazebo simulations, illustrating the practicality of the approach in real-world applications. Comment: Accepted to IROS 202
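
The GP-based map belief driving exploration can be sketched with plain GP regression: the posterior variance is high in unexplored regions, and the most uncertain candidate is chosen as the next goal. The kernel, noise values and function names below are illustrative assumptions, not the SEAL implementation.

```python
import numpy as np

def gp_posterior(X_train, y_train, X_query, length=1.0, noise=1e-3):
    """GP regression posterior mean and variance with an RBF kernel."""
    def rbf(A, B):
        # Squared Euclidean distances between all pairs of rows
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length**2)
    K = rbf(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf(X_train, X_query)
    Kss = rbf(X_query, X_query)
    Kinv = np.linalg.inv(K)
    mean = Ks.T @ Kinv @ y_train
    cov = Kss - Ks.T @ Kinv @ Ks
    return mean, np.diag(cov)

def next_best_point(X_train, y_train, candidates):
    """Pick the candidate where the map belief is most uncertain."""
    _, var = gp_posterior(X_train, y_train, candidates)
    return candidates[np.argmax(var)]
```

In a multi-robot setting each robot would evaluate such an information measure only over its own candidate region, with the relative-localization graph keeping the individual beliefs consistent.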

    Multi-sensor guidance of an agricultural robot

    The increasing demand for high-density soil data, and the high labor cost associated with manual methods, have encouraged the development of autonomous alternatives. In this study, a mobile robot named ‘AgTracker’ was developed as a platform for an autonomous soil sampling machine. The robot, equipped with a low-accuracy GPS, a LIDAR scanner and an electronic compass, visited human-defined locations through its auto-navigation system. This system also had a user-friendly interface, which enabled operators to set waypoints by clicking on Google Maps®. Locations could also be remotely monitored in real time through this interface. An XBee wireless network was built to make remote waypoint set-up and monitoring possible. The robot was tested on the campus of the University of Illinois at Urbana-Champaign, where it successfully visited waypoints one by one in most cases. The robot's localization errors, defined as the distances between its true visited locations and the set waypoints, were evaluated; an average error within 0.2 m was achieved.
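
The waypoint-error evaluation described above amounts to computing the Euclidean distance between each true visited location and its set waypoint; a minimal sketch (array names are illustrative):

```python
import numpy as np

def localisation_errors(visited, waypoints):
    """Per-waypoint Euclidean error, plus its mean and RMS.

    visited   : (n, 2) true visited positions
    waypoints : (n, 2) corresponding set waypoints
    """
    d = np.linalg.norm(np.asarray(visited) - np.asarray(waypoints), axis=1)
    return d, d.mean(), np.sqrt((d ** 2).mean())
```

The mean of `d` is the "average error" figure reported in the abstract; the RMS penalises occasional large misses more strongly, which is why both are worth reporting.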