
    Trajectory tracking in the presence of obstacles using the limit cycle navigation method

    This paper proposes a system for effecting trajectory tracking in combination with obstacle avoidance in mobile robotic systems. In robotics research, these two situations are typically treated as separate problems. This work approaches the problem by integrating classical trajectory-following control schemes with Kim et al.’s Limit Cycle Navigation method for obstacle avoidance. The use of Artificial Potential Function methods for obstacle avoidance is purposely avoided so as to prevent the well-known local-minima problems associated with such schemes. The paper also addresses the problem of non-global obstacle sensing and proposes modifications to Kim et al.’s method for handling multiple, overlapping obstacles under local sensing conditions.
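
    Below is a minimal, illustrative sketch (not the authors’ code) of the limit-cycle vector field associated with Kim et al.’s method, assuming a circular obstacle at the origin with avoidance radius r; trajectories of the field spiral onto that circle, which is what gives the robot a smooth detour around the obstacle.

    import numpy as np

    def limit_cycle_field(p, r, clockwise=True):
        """Limit-cycle vector field around an obstacle centred at the origin.

        p : (x, y) position relative to the obstacle centre
        r : radius of the limit cycle (obstacle radius plus a clearance margin)
        """
        x, y = p
        s = r**2 - x**2 - y**2            # > 0 inside the circle, < 0 outside
        sign = 1.0 if clockwise else -1.0
        dx = sign * y + x * s             # trajectories converge onto x^2 + y^2 = r^2
        dy = -sign * x + y * s
        return np.array([dx, dy])

    # Euler-integrate a short avoidance path starting outside the obstacle.
    p = np.array([2.0, 0.5])
    for _ in range(200):
        v = limit_cycle_field(p, r=1.0)
        p = p + 0.01 * v / (np.linalg.norm(v) + 1e-9)   # unit-speed step
    print(np.round(p, 3))   # the point should end up close to the unit circle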

    Trajectory tracking of a differentially driven wheeled mobile robot in the presence of obstacles

    A trajectory-following and obstacle avoidance mechanism for a mobile robot is presented for situations where the robot has to follow a specific target trajectory, but the task might not be completely achievable due to obstacles in the way, which the robot must avoid. After avoiding an obstacle, the robot should catch up with the target trajectory. In the proposed system, this objective is reached by combining a nonlinear control method with an Artificial Potential Function method, leading to trajectory tracking control with obstacle avoidance capabilities.
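
    A rough sketch of how a standard repulsive artificial potential can be blended with a simple proportional tracking term and mapped to differential-drive (unicycle) commands; the potential form, the gains, and the blending are illustrative assumptions and not the paper’s nonlinear control law.

    import numpy as np

    def repulsive_force(p, obstacle, d0=1.0, eta=0.5):
        """Gradient of a standard repulsive artificial potential (zero beyond range d0)."""
        diff = p - obstacle
        d = np.linalg.norm(diff)
        if d >= d0 or d < 1e-6:
            return np.zeros(2)
        return eta * (1.0 / d - 1.0 / d0) * (1.0 / d**2) * (diff / d)

    def unicycle_command(p, theta, target, obstacle, k_v=0.8, k_w=2.0):
        """Blend trajectory tracking with APF avoidance, then map to (v, omega)."""
        desired = k_v * (target - p) + repulsive_force(p, obstacle)
        heading = np.arctan2(desired[1], desired[0])
        err = np.arctan2(np.sin(heading - theta), np.cos(heading - theta))
        v = np.linalg.norm(desired) * np.cos(err)     # slow down when misaligned
        w = k_w * err
        return v, w

    print(unicycle_command(np.array([0.0, 0.0]), 0.0,
                           target=np.array([2.0, 1.0]),
                           obstacle=np.array([0.5, 0.2])))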

    Multipath detection from GNSS observables using gated recurrent unit

    One of the most used Position, Navigation, and Timing (PNT) technologies of the 21st century is Global Navigation Satellite Systems (GNSS). GNSS signals are affected by urban canyons that limit Line-Of-Sight (LOS) reception and increase position ambiguity. Smart cities are expected to adopt autonomous Unmanned Aerial Vehicle (UAV) operations for critical missions such as the transportation of organs, which is time-sensitive. Therefore, techniques to mitigate Non-Line-Of-Sight (NLOS) interference are required for improved positioning accuracy. This paper proposes a Gated Recurrent Unit (GRU)-based multipath detection algorithm that uses pseudorange, ephemerides, Doppler shift, Carrier-To-Noise Ratio (C/N0), and elevation data from each satellite to determine whether multipath is present. Signals from satellites classified as multipath are then flagged and ignored in the Position, Velocity, and Timing (PVT) calculations until they are deemed LOS again. The classification algorithm is developed and tested using a Spirent GSS7000 simulator to generate the GNSS Radio Frequency (RF) signals, and OKTAL-SE Sim3D is used to simulate the urban canyon environments in which the signals propagate from satellite to receiver. The RF signals are then transmitted to a Ublox F9P GNSS receiver that can receive GPS and GLONASS signals, which are processed to output PVT information. The collected data is used to train the GRU to classify received signals as multipath or not. In the performance evaluation, the GRU outperforms decision tree, K-Nearest Neighbor (KNN), and Support Vector Machine (SVM) classifiers. Furthermore, comparing the GRU with the SVM, a 50% increase in accuracy is observed, with a 95% error of 0.85 m for the GRU compared to 1.78 m for the SVM.
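
    A minimal PyTorch sketch of the kind of GRU binary classifier described above; the feature ordering, window length, and hidden size are assumptions made for illustration rather than the paper’s configuration.

    import torch
    import torch.nn as nn

    class MultipathGRU(nn.Module):
        """Binary LOS / multipath classifier over a window of per-satellite features."""
        def __init__(self, n_features=5, hidden=32):
            super().__init__()
            self.gru = nn.GRU(n_features, hidden, batch_first=True)
            self.head = nn.Linear(hidden, 1)

        def forward(self, x):                  # x: (batch, time, features)
            _, h = self.gru(x)                 # h: (layers, batch, hidden), last hidden state
            return torch.sigmoid(self.head(h[-1]))   # probability that the signal is multipath

    # Toy batch: 8 sequences of 10 epochs with 5 features each
    # (e.g. pseudorange, Doppler shift, C/N0, elevation, ephemeris-derived term).
    model = MultipathGRU()
    x = torch.randn(8, 10, 5)
    print(model(x).shape)                      # torch.Size([8, 1])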

    An INS/GNSS fusion architecture in GNSS denied environment using gated recurrent unit

    One of the most used Position, Navigation and Timing (PNT) technologies of the 21st century is Global Navigation Satellite Systems (GNSS). GNSS signals are affected by urban canyons that limit line-of-sight and reduce satellite availability to receivers. Smart cities are expected to adopt autonomous Unmanned Aerial Vehicle (UAV) operations for critical missions such as the transportation of organs, which is time-sensitive. Therefore, higher accuracy for position and velocity information is required. This paper investigates the use of Gated Recurrent Units (GRU) as a technique that can memorize previous information and, in conjunction with the inputs (consisting of attitude, change in attitude, and change in velocity), reduce position and velocity error when GNSS is not available. The fusion approach is developed and tested using Spirent’s SimGEN GSS7000 hardware simulator, which simulates GNSS signals, and Spirent’s SimSENSOR software, which simulates accelerometer and gyroscope stochastic and deterministic errors. GNSS outages are varied randomly between 1 and 20 seconds to affect the predicted position and velocity. The collected data is used to train the GRU to predict the position and velocity error measured by the Inertial Measurement Unit (IMU). In the performance evaluation, a 60% reduction in Root Mean Squared Error (RMSE) is observed compared to Recurrent Neural Networks (RNN). Comparing the 95th percentile errors of the Inertial Navigation System (INS), RNN, and GRU, an 80% reduction is observed between the INS and the RNN. Furthermore, a 35% drop in the 95th percentile error is observed between the RNN and the GRU.
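
    A PyTorch sketch of a GRU regressor mapping the stated inputs (attitude, change in attitude, change in velocity) to position and velocity error corrections during an outage; the layer sizes and the exact input/output layout are assumptions.

    import torch
    import torch.nn as nn

    class InsErrorGRU(nn.Module):
        """Regresses INS position/velocity error from attitude and delta terms."""
        def __init__(self, n_features=9, hidden=64, n_outputs=6):
            super().__init__()
            self.gru = nn.GRU(n_features, hidden, batch_first=True)
            self.head = nn.Linear(hidden, n_outputs)   # 3 position + 3 velocity error terms

        def forward(self, x):              # x: (batch, time, features)
            out, _ = self.gru(x)
            return self.head(out[:, -1])   # error estimate at the last epoch of the window

    # 9 inputs per epoch: attitude (3), change in attitude (3), change in velocity (3).
    model = InsErrorGRU()
    window = torch.randn(4, 20, 9)
    correction = model(window)             # would be subtracted from the INS solution
    print(correction.shape)                # torch.Size([4, 6])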

    A hybrid deep learning approach for robust multi-sensor GNSS/INS/VO fusion in urban canyons

    This paper addresses the significant challenges of robust autonomous navigation for Unmanned Aerial Vehicles (UAVs) within densely populated environments. The focus is on enhancing the performance of Position, Navigation, and Timing (PNT), as specified by the International Civil Aviation Organization, in terms of accuracy, integrity, continuity, and availability. The novel contribution is a Robust Multi-Sensor Fusion Architecture (RMSFA) that utilizes a Bayesian-LSTM machine learning algorithm, fusing GNSS, INS, and monocular visual odometry. Unlike existing solutions that rely on sensor redundancies or methods such as Receiver Autonomous Integrity Monitoring (RAIM), which have limitations in performance or adaptability to erroneous signals, the proposed system offers improvements in both positioning accuracy and integrity. Furthermore, the GNSS data is preprocessed to remove Non-Line-Of-Sight (NLOS) data to improve positioning accuracy, and INS data errors are corrected using a GRU-based error-correction architecture to improve INS positioning and reduce drift. The addition of these post-processing steps reduced the 95th percentile horizontal error by 97.4% and 71.5%, respectively. A CNN-LSTM architecture is used to obtain a Visual Odometry (VO) solution from the camera sensor. The fusion performance of the Bayesian-LSTM architecture was then compared to a GNSS/IMU/VO EKF-GRU architecture; the comparison showed a 95th percentile error improvement in the horizontal direction of 30.1% for the Bayesian-LSTM. The architecture was tested in a realistic simulated environment utilizing Unreal Engine and AirSim for UAV simulation, a Spirent GNSS7000 simulator for Hardware-in-the-Loop (HIL) simulation, and OKTAL-SE Sim3D to mimic the effects of multipath on GNSS signals. Overall, this work represents a step toward improving the safety and effectiveness of drone navigation by providing a more robust navigation system suitable for safety-critical situations, without the stated disadvantages of the previously mentioned literature.
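
    One common way to realise a Bayesian LSTM is Monte Carlo dropout, sketched below in PyTorch; the abstract does not state which Bayesian approximation is used, so the construction, the stacked input layout, and the sizes here are assumptions.

    import torch
    import torch.nn as nn

    class BayesianLSTMFusion(nn.Module):
        """LSTM fusion head with Monte Carlo dropout as an approximate Bayesian layer."""
        def __init__(self, n_inputs=9, hidden=64, p_drop=0.2):
            super().__init__()
            self.lstm = nn.LSTM(n_inputs, hidden, batch_first=True)
            self.drop = nn.Dropout(p_drop)
            self.head = nn.Linear(hidden, 3)       # fused 3-D position

        def forward(self, x):                       # x: (batch, time, features)
            out, _ = self.lstm(x)
            return self.head(self.drop(out[:, -1]))

    def mc_predict(model, x, n_samples=50):
        """Keep dropout active at inference time to sample a predictive distribution."""
        model.train()                               # enables dropout layers
        with torch.no_grad():
            samples = torch.stack([model(x) for _ in range(n_samples)])
        return samples.mean(0), samples.std(0)      # mean fix and per-axis spread

    model = BayesianLSTMFusion()
    x = torch.randn(2, 15, 9)       # e.g. GNSS, INS and VO position estimates stacked per epoch
    mean, sigma = mc_predict(model, x)
    print(mean.shape, sigma.shape)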

    Localisation-safe reinforcement learning for mapless navigation

    Most reinforcement learning (RL)-based works for mapless point-goal navigation tasks assume the availability of the robot's ground-truth poses, which is unrealistic for real-world applications. In this work, we remove such an assumption and deploy observation-based localisation algorithms, such as Lidar-based or visual odometry, for robot self-pose estimation. These algorithms, despite having widely achieved promising performance and being robust to various harsh environments, may fail to track robot locations in many scenarios where the observations perceived along robot trajectories are insufficient or ambiguous. Hence, using such localisation algorithms introduces new, unstudied problems for mapless navigation tasks. This work proposes a new RL-based algorithm with which robots learn to navigate in a way that prevents localisation failures and avoids getting trapped in local-minimum regions. This ability is learned by deploying two techniques suggested in this work: a reward metric that punishes behaviours resulting in localisation failures, and a reconfigured state representation consisting of the current observation and history trajectory information, which transfers the problem from a partially observable Markov decision process (POMDP) to a Markov decision process (MDP) model and avoids local minima.
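
    A toy sketch of the two stated ingredients: a reward term that punishes behaviours causing localisation failures, and a state built from the current observation plus a short trajectory history; the reward constants, history length, and feature layout are invented for illustration.

    import numpy as np
    from collections import deque

    def step_reward(dist_to_goal, prev_dist, localised, collided,
                    k_progress=1.0, r_loc_fail=-10.0, r_collision=-10.0, r_goal=20.0):
        """Shaped reward: progress toward the goal, heavy punishment for losing localisation."""
        if not localised:
            return r_loc_fail      # localiser (e.g. scan matching) reports a tracking failure
        if collided:
            return r_collision
        if dist_to_goal < 0.3:
            return r_goal
        return k_progress * (prev_dist - dist_to_goal)

    class HistoryState:
        """Stack the current observation with recent poses/actions so the policy sees its trajectory."""
        def __init__(self, history_len=8):
            self.buf = deque(maxlen=history_len)

        def build(self, scan, est_pose, last_action):
            self.buf.append(np.concatenate([est_pose, last_action]))
            hist = np.concatenate(list(self.buf))
            pad = np.zeros(self.buf.maxlen * 5 - hist.size)   # pose (3) + action (2) per step
            return np.concatenate([scan, hist, pad])

    state = HistoryState()
    obs = state.build(scan=np.random.rand(24),
                      est_pose=np.array([1.0, 2.0, 0.1]),
                      last_action=np.array([0.4, 0.0]))
    print(obs.shape)    # fixed-size state: 24 scan beams + 8 x 5 history slots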

    Uncertainty-based sensor fusion architecture using Bayesian-LSTM neural network

    Uncertainty-based sensor management for positioning is an essential component of safe drone operations inside urban environments with large urban canyons. These canyons significantly restrict the Line-Of-Sight signal conditions required for accurate positioning using Global Navigation Satellite Systems (GNSS). Therefore, sensor fusion solutions need to be in place which can take advantage of alternative Positioning, Navigation and Timing (PNT) sensors, such as accelerometers or gyroscopes, to complement GNSS information. Recent state-of-the-art research has focused on Machine Learning (ML) techniques, such as Support Vector Machines (SVM), that utilize statistical learning to provide an output for a given input. However, understanding the uncertainty of the predictions made by Deep Learning (DL) models can help improve the integrity of fusion systems. Therefore, there is a need for a DL model that can also provide uncertainty-related information as output. This paper proposes a Bayesian-LSTM Neural Network (BLSTMNN) that is used to fuse GNSS and Inertial Measurement Unit (IMU) data. Furthermore, a Protection Level (PL) is estimated from the uncertainty distribution given by the system. To test the algorithm, Hardware-In-the-Loop (HIL) simulation has been performed, utilizing Spirent’s GSS7000 simulator and OKTAL-SE Sim3D to simulate GNSS propagation in urban canyons, with SimSENSOR used to simulate the accelerometer and gyroscope. Results show that the Bayesian-LSTM provides the best fusion performance compared to GNSS alone and to GNSS/IMU fusion using an EKF or an SVM. Furthermore, regarding the uncertainty estimates, the proposed algorithm can estimate the positioning boundaries correctly, with an error rate of 0.4% and an accuracy of 99.6%.
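
    A sketch of how a protection level could be derived from position samples drawn from such a Bayesian model; the Gaussian-quantile construction and the integrity-risk allocation below are assumptions, not the paper’s PL formulation.

    import numpy as np
    from statistics import NormalDist

    def protection_level(samples, integrity_risk=1e-5):
        """Horizontal protection level from Monte Carlo position samples.

        samples : (n, 2) array of sampled east/north positions from the Bayesian model.
        The PL is taken as the mean horizontal error plus k standard deviations, where k
        is the two-sided Gaussian quantile for the allocated integrity risk.
        """
        centre = samples.mean(axis=0)
        radial = np.linalg.norm(samples - centre, axis=1)
        k = NormalDist().inv_cdf(1.0 - integrity_risk / 2.0)
        return radial.mean() + k * radial.std(ddof=1)

    rng = np.random.default_rng(0)
    mc_fixes = rng.normal(loc=[10.0, 5.0], scale=0.4, size=(200, 2))  # stand-in for sampled fixes
    print(round(protection_level(mc_fixes), 2), "m")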

    GNSS/INS/VO fusion using gated recurrent unit in GNSS denied environments

    Urban air mobility is a growing market that will bring new ways to travel and to deliver items over urban and suburban areas at relatively low altitudes. To guarantee safe and robust navigation, Unmanned Aerial Vehicles should be able to overcome all navigational constraints. This paper analyzes a novel sensor fusion framework aimed at obtaining stable flight in a degraded GNSS environment. The sensor fusion framework combines data coming from a GNSS receiver, an IMU, and an optical camera under a loosely coupled scheme. A Federated Filter approach is implemented with the integration of two GRU blocks. The first GRU is used to increase the accuracy of the INS over time, giving as output a more reliable position that is fused with the position information coming from the GNSS receiver and the developed Visual Odometry algorithm. A master GRU block is then used to select the best position information. The data is collected using a hardware-in-the-loop setup based on AirSim, Pixhawk, and Spirent GSS7000 hardware. As validation, the framework is tested on a virtual UAV performing a delivery mission on the Cranfield University campus. Results show that the developed fusion framework can be used for short GNSS outages.
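
    A PyTorch sketch of a master GRU block sitting over the three loosely coupled position sources; for simplicity it blends the sources with learned softmax weights rather than hard-selecting the best one, and all shapes and sizes are assumptions.

    import torch
    import torch.nn as nn

    class MasterGRUSelector(nn.Module):
        """Master block: weighs GNSS, GRU-corrected INS, and VO positions at each epoch."""
        def __init__(self, n_sources=3, hidden=32):
            super().__init__()
            # Input per epoch: a 3-D position from each source, flattened.
            self.gru = nn.GRU(n_sources * 3, hidden, batch_first=True)
            self.head = nn.Linear(hidden, n_sources)

        def forward(self, positions):                      # (batch, time, sources, 3)
            b, t, s, _ = positions.shape
            out, _ = self.gru(positions.reshape(b, t, s * 3))
            w = torch.softmax(self.head(out), dim=-1)      # per-epoch source weights
            return (w.unsqueeze(-1) * positions).sum(dim=2)   # fused position track

    fusion = MasterGRUSelector()
    tracks = torch.randn(1, 50, 3, 3)    # 50 epochs of GNSS / corrected-INS / VO positions
    print(fusion(tracks).shape)          # torch.Size([1, 50, 3])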