192 research outputs found

    A loose-coupled fusion of inertial and UWB assisted by a decision-making algorithm for localization of emergency responders

    Combining different technologies is gaining significant popularity among researchers and industry for the development of indoor positioning systems (IPSs). These hybrid IPSs emerge as a robust solution for indoor localization, as the drawbacks of each technology can be mitigated or even eliminated by complementary technologies. However, fusing position estimates from different technologies is still very challenging and, therefore, a hot research topic. In this work, we propose fusing ultrawideband (UWB) position estimates with the estimates provided by a pedestrian dead reckoning (PDR) system using a Kalman filter. To improve the IPS accuracy, a decision-making algorithm was developed that assesses the usability of UWB measurements based on the identification of non-line-of-sight (NLOS) conditions. Three different data fusion algorithms are tested, based on three different time-of-arrival positioning algorithms, and experimental results show a localization accuracy below 1.5 m at the 99th percentile.
    This work has been partially supported by FCT – Fundação para a Ciência e Tecnologia within the Project Scope: UID/CEC/00319/2019 and Project UID/CTM/00264/2019 of 2C2T - Centro de Ciência e Tecnologia Têxtil, funded by National Funds through FCT/MCTES. The work of A. G. Ferreira and D. Fernandes was supported by the FCT under Grant SFRH/BD/91477/2012 and Grant SFRH/BD/92082/2012.
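    The fusion scheme described above can be sketched as a small position-only Kalman filter: PDR displacements drive the prediction step and UWB fixes drive the update step. In this sketch the NLOS decision is simplified to an innovation (Mahalanobis) gate; the class name, noise values, and gate threshold are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

# Minimal 2-D position Kalman filter fusing PDR displacements (prediction)
# with UWB position fixes (update). The NLOS decision-making is reduced here
# to a simple innovation gate; all parameters are illustrative.

class PdrUwbFusion:
    def __init__(self, x0, P0=1.0, q=0.05, r=0.3):
        self.x = np.asarray(x0, dtype=float)   # [x, y] position estimate
        self.P = np.eye(2) * P0                # estimate covariance
        self.Q = np.eye(2) * q                 # PDR process noise
        self.R = np.eye(2) * r                 # UWB measurement noise

    def predict(self, pdr_step):
        # PDR provides a relative displacement per detected step
        self.x = self.x + np.asarray(pdr_step, dtype=float)
        self.P = self.P + self.Q

    def update(self, uwb_pos, gate=3.0):
        # Reject suspected NLOS fixes whose innovation exceeds the gate
        z = np.asarray(uwb_pos, dtype=float)
        S = self.P + self.R                       # innovation covariance
        nu = z - self.x                           # innovation
        d2 = float(nu @ np.linalg.solve(S, nu))   # squared Mahalanobis distance
        if d2 > gate ** 2:
            return False                          # treated as NLOS, skipped
        K = self.P @ np.linalg.inv(S)             # Kalman gain
        self.x = self.x + K @ nu
        self.P = (np.eye(2) - K) @ self.P
        return True

f = PdrUwbFusion([0.0, 0.0])
f.predict([1.0, 0.0])     # one PDR step east
f.update([1.1, 0.05])     # consistent UWB fix, accepted
```

    A fix far from the prediction (e.g. a multipath-corrupted UWB reading) fails the gate and is discarded, which mimics the role of the NLOS decision-making step.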

    All Source Sensor Integration Using an Extended Kalman Filter

    The global positioning system (GPS) has become a ubiquitous source for navigation in the modern age, especially since the removal of selective availability at the beginning of this century. The utility of GPS is unmatched; however, GPS is not available in all environments. Heavy reliance on GPS for navigation makes the warfighter increasingly vulnerable as modern warfare continues to evolve. This research provides a method for incorporating measurements from a wide variety of sensors to mitigate GPS dependence. The result is the integration of sensor sets that encompass those examined in recent literature as well as some custom navigation devices. A full-state extended Kalman filter is developed and implemented, accommodating the requirements of the varied sensor sets and scenarios. Some 19 types of sensors are used in multiple quantities, including inertial measurement units, single cameras and stereo pairs, 2D and 3D laser scanners, altimeters, 3-axis magnetometers, heading sensors, inclinometers, a stop sign sensor, an odometer, a step sensor, a ranging device, a signal-of-opportunity sensor, global navigation satellite system sensors, an air data computer, and radio frequency identification devices. Simulation data for all sensors was generated to test filter performance. Additionally, real data was collected and processed from an aircraft, ground vehicles, and a pedestrian. Measurement equations are developed to relate sensor measurements to the navigation states. Each sensor measurement is incorporated into the filter using the Kalman filter measurement update equations. Measurement types are segregated based on whether they observe instantaneous or accumulated state information. Accumulated state measurements are incorporated using delayed-state update equations. All other measurements are incorporated using the numerically robust UD update equations.
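    The key idea above is that heterogeneous sensors fold into one filter through per-sensor measurement equations. A minimal sketch of the common linear measurement update is shown below, with an illustrative 3-state position vector and two hypothetical sensors (an altimeter and a 2-D position fix); the thesis's delayed-state and UD forms are more involved and are not reproduced here.

```python
import numpy as np

# Generic Kalman measurement update: each sensor contributes only its own
# measurement matrix H and noise covariance R. States and sensors below are
# illustrative stand-ins for the full navigation state.

def kalman_update(x, P, z, H, R):
    """Standard Kalman update: returns posterior state and covariance."""
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ (z - H @ x)             # state correction
    P = (np.eye(len(x)) - K @ H) @ P    # covariance correction
    return x, P

# Toy state: [north, east, down] position
x = np.zeros(3)
P = np.eye(3) * 10.0

# An altimeter observes only the down component (as negative altitude)
H_alt = np.array([[0.0, 0.0, 1.0]])
x, P = kalman_update(x, P, np.array([-120.0]), H_alt, np.array([[4.0]]))

# A 2-D position fix observes the north and east components
H_fix = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
x, P = kalman_update(x, P, np.array([5.0, 3.0]), H_fix, np.eye(2) * 2.0)
```

    Adding a new sensor type then amounts to writing its H (or, for an EKF, linearizing its measurement equation) rather than redesigning the filter.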

    Influence of complex environments on LiDAR-Based robot navigation

    To ensure safe and efficient navigation, mobile robots heavily rely on their ability to use on-board sensors. One such sensor, increasingly used for robot navigation, is the Light Detection And Ranging (LiDAR) sensor. Although recent research has shown improvements in LiDAR-based navigation, dealing with complex unstructured environments or difficult weather conditions remains problematic. In this thesis, we present an analysis of the influence of such challenging conditions on LiDAR-based navigation. Our first contribution is to evaluate how LiDARs are affected by snowflakes during snowstorms. To this end, we create a novel dataset by acquiring data during six snowfalls using four sensors simultaneously. Based on a statistical analysis of this dataset, we characterized the sensitivity of each device and showed that sensor measurements can be modelled in a probabilistic manner. We also showed that falling snow has little impact beyond a range of 10 m. Our second contribution is to evaluate the impact of the complex three-dimensional structures present in forests on the performance of a LiDAR-based place recognition algorithm. We acquired data in a structured outdoor environment and in a forest, which allowed evaluating the impact of the environment on place recognition performance. Our hypothesis was that the closer two scans are acquired from each other, the higher the belief that the scans originate from the same place will be, but modulated by the level of complexity of the environment. Our experiments confirmed that forests, with their intricate network of branches and foliage, produce more outliers and cause recognition performance to decrease more quickly with distance than in the structured outdoor environment. 
Our conclusion is that falling snow conditions and forest environments negatively impact LiDAR-based navigation performance, which should be considered when developing robust navigation algorithms.
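    The finding that falling snow mainly affects returns within about 10 m suggests a simple preprocessing mitigation, sketched below: gate out near-range, low-intensity points before feeding the cloud to a navigation pipeline. The thresholds and the intensity heuristic are assumptions for illustration, not the thesis's probabilistic model.

```python
import numpy as np

# Crude point-cloud filter motivated by the 10 m finding above: snowflake
# returns concentrate at short range and tend to be dim. Thresholds are
# illustrative assumptions, not values from the thesis.

def filter_snow(points, intensities, near_range=10.0, min_intensity=0.2):
    """points: (N, 3) array of xyz; intensities: (N,) array in [0, 1]."""
    ranges = np.linalg.norm(points, axis=1)
    # Keep far points unconditionally; keep near points only if bright enough
    keep = (ranges > near_range) | (intensities >= min_intensity)
    return points[keep], intensities[keep]

pts = np.array([[2.0, 0.0, 0.0],    # near and dim: likely a snowflake
                [3.0, 1.0, 0.0],    # near but bright: kept
                [15.0, 2.0, 1.0]])  # beyond 10 m: kept
inten = np.array([0.05, 0.6, 0.1])
kept, _ = filter_snow(pts, inten)
```

    A probabilistic model of the kind the thesis proposes would replace this hard gate with a per-point likelihood of being a weather artefact.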

    Compact Environment Modelling from Unconstrained Camera Platforms

    Mobile robotic systems need to perceive their surroundings in order to act independently. In this work, a perception framework is developed that interprets the data of a binocular camera in order to transform it into a compact, expressive model of the environment. This model enables a mobile system to move in a targeted way and to interact with its surroundings. It is shown how the developed methods also provide a solid basis for technical assistive aids for visually impaired people.

    Challenges and solutions for autonomous ground robot scene understanding and navigation in unstructured outdoor environments: A review

    The capabilities of autonomous mobile robotic systems have been steadily improving due to recent advancements in computer science, engineering, and related disciplines such as cognitive science. In controlled environments, robots have achieved relatively high levels of autonomy. In more unstructured environments, however, the development of fully autonomous mobile robots remains challenging due to the complexity of understanding these environments. Many autonomous mobile robots use classical, learning-based or hybrid approaches for navigation. More recent learning-based methods may replace the complete navigation pipeline or selected stages of the classical approach. For effective deployment, autonomous robots must understand their external environments at a sophisticated level according to their intended applications. Therefore, in addition to robot perception, scene analysis and higher-level scene understanding (e.g., traversable/non-traversable, rough or smooth terrain, etc.) are required for autonomous robot navigation in unstructured outdoor environments. This paper provides a comprehensive review and critical analysis of these methods in the context of their applications to the problems of robot perception and scene understanding in unstructured environments and the related problems of localisation, environment mapping and path planning. State-of-the-art sensor fusion methods and multimodal scene understanding approaches are also discussed and evaluated within this context. The paper concludes with an in-depth discussion regarding the current state of the autonomous ground robot navigation challenge in unstructured outdoor environments and the most promising future research directions to overcome these challenges.

    Efficient Graph-based Computation and Analytics

    With the data explosion in many domains, such as social media, big code repositories, the Internet of Things (IoT), and inertial sensors, only 32% of the data available to academia and industry is put to work, and the remaining 68% goes unleveraged. Moreover, people face an increasing number of obstacles concerning complex analytics on data of this sheer size, including: 1) How to perform dynamic graph analytics in a parallel and robust manner within a reasonable time? 2) How to conduct performance optimizations on a property graph representing the semantics of code, data, and runtime systems for big data applications? 3) How to innovate neural graph approaches (i.e., Transformers) to solve realistic research problems, such as automated program repair and inertial navigation? To tackle these problems, I present two efforts along this road: efficient graph-based computation and intelligent graph analytics. Specifically, I first propose two theory-based dynamic graph models to characterize temporal trends in large social media networks, then implement and optimize them atop Apache Spark GraphX to improve their performance. In addition, I investigate a semantics-aware optimization framework, consisting of offline static analysis and online dynamic analysis on a property graph representing the skeleton of a data-intensive application, to interactively and semi-automatically assist programmers in scrutinizing the performance problems camouflaged in the source code. In the design of intelligent graph-based algorithms, I develop novel neural graph-based approaches with multi-task learning techniques to repair a broad range of programming bugs automatically, and also to improve the accuracy of pedestrian navigation systems using only sensor data from Inertial Measurement Units (IMUs, i.e., accelerometer, gyroscope, and magnetometer). 
In this dissertation, I elaborate on the definitions of these research problems and leverage the knowledge of graph computation, program analysis, and deep learning techniques to seek solutions to them, followed by comprehensive comparisons with state-of-the-art baselines and discussions of future research.

    A unified vision and inertial navigation system for planetary hoppers

    Thesis: S.M., Massachusetts Institute of Technology, Department of Aeronautics and Astronautics, 2012. Cataloged from PDF version of thesis. Includes bibliographical references (pages 139-146).
    In recent years, considerable attention has been paid to hopping as a novel mode of planetary exploration. Hopping vehicles provide advantages over traditional surface exploration vehicles, such as wheeled rovers, by enabling in-situ measurements in otherwise inaccessible terrain. However, significant development over previously demonstrated vehicle navigation technologies is required to overcome the inherent challenges involved in navigating a hopping vehicle, especially in adverse terrain. While hoppers are in many ways similar to traditional landers and surface explorers, they incorporate additional, unique motions that must be accounted for beyond those of conventional planetary landing and surface navigation systems. This thesis describes a unified vision and inertial navigation system for propulsive planetary hoppers and provides demonstration of this technology. An architecture for a navigation system specific to the motions and mission profiles of hoppers is presented, incorporating unified inertial and terrain-relative navigation solutions. A modular sensor testbed, including a stereo vision package and inertial measurement unit, was developed to act as a proof-of-concept for this navigation system architecture. The system is shown to be capable of real-time output of an accurate navigation state estimate for motions and trajectories similar to those of planetary hoppers.
    by Theodore J. Steiner, III. S.M.

    Indoor Localisation of Scooters from Ubiquitous Cost-Effective Sensors: Combining Wi-Fi, Smartphone and Wheel Encoders

    Indoor localisation of people and objects has been a focus of research for several decades because of its value to many applications. Accuracy has always been a challenge because of the uncertainty of the employed sensors: several technologies have been proposed and researched, yet accuracy still represents an issue. Today, several sensor technologies can be found in indoor environments, some of which are economical and powerful, such as Wi-Fi. Meanwhile, smartphones are typically present indoors, carried by people moving about within rooms and buildings. Furthermore, vehicles such as mobility scooters, which may be equipped with low-cost sensors such as wheel encoders, can also be present indoors to support people with mobility impairments. This thesis investigates the localisation of mobility scooters operating indoors. This is a specific topic, as most of today's indoor localisation systems are designed for pedestrians, and accurate indoor localisation of scooters is challenging because of their type of motion and specific behaviour. The thesis focuses on improving localisation accuracy for mobility scooters using already available indoor sensors. It proposes the combined use of Wi-Fi, smartphone IMU and wheel encoders, which represents a cost-effective, energy-efficient solution. A method has been devised and a system has been developed, which has been tested in different environment settings. The outcome of the experiments is presented and carefully analysed in the thesis. The outcome of several trials demonstrates the potential of the proposed solutions to reduce positional errors significantly when compared to the state of the art in the same area. The proposed combination demonstrated an error range of 0.35 m - 1.35 m, which can be acceptable in several applications, such as those related to assisted living. 
As the proposed system capitalises on the use of ubiquitous technologies, it opens up the potential for quick uptake by the market, therefore being of great benefit to the target audience.
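    The wheel encoders mentioned above contribute a relative-motion estimate. A common way to derive it for a two-wheeled platform is differential-drive odometry, sketched below; the tick resolution, wheel radius, and track width are illustrative values, not the scooter's actual parameters.

```python
import math

# Differential-drive odometry from wheel-encoder ticks, the kind of low-cost
# relative measurement the thesis combines with Wi-Fi and smartphone IMU
# data. All physical constants below are illustrative.

TICKS_PER_REV = 512
WHEEL_RADIUS = 0.1   # metres
TRACK_WIDTH = 0.5    # metres between the two drive wheels

def odometry_step(pose, left_ticks, right_ticks):
    """Advance a (x, y, heading) pose from one encoder sample per wheel."""
    x, y, theta = pose
    dl = 2 * math.pi * WHEEL_RADIUS * left_ticks / TICKS_PER_REV
    dr = 2 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
    d = (dl + dr) / 2.0                  # forward distance travelled
    dtheta = (dr - dl) / TRACK_WIDTH     # heading change
    # Integrate at the midpoint heading for better accuracy over a step
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    return (x, y, theta + dtheta)

pose = (0.0, 0.0, 0.0)
pose = odometry_step(pose, 512, 512)   # one revolution per wheel: straight ahead
```

    Encoder odometry drifts without bound, which is why the thesis anchors it with absolute Wi-Fi fixes; the fusion trades the smoothness of odometry against the drift-free but noisy Wi-Fi positions.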

    Novel Methods for Personal Indoor Positioning

    Currently, people are used to getting accurate GNSS-based positioning services. However, in indoor environments, GNSS cannot provide accuracy and availability comparable to open outdoor environments. Therefore, alternatives to GNSS are needed for indoor positioning. In this thesis, methods for pedestrian indoor positioning are proposed. With these novel methods, the mobile unit performs all the required positioning measurements and no dedicated positioning infrastructure is required. This thesis proposes novel radio map configuration methods for WLAN fingerprinting based on received signal strength measurements. These methods, with different model parameters, were studied in field tests to identify the best models with reasonable positioning accuracy and moderate memory requirements. A histogram-based WLAN fingerprinting model is proposed to aid IMU-based pedestrian dead reckoning (PDR), obtained using a gyro and a 3-axis accelerometer, both based on MEMS technology. The sensor data is used to detect the steps taken by a person on foot and to estimate the step length and the heading change during each step. For the aiding of the PDR with WLAN positioning, this thesis proposes two different configurations of complementary extended Kalman filters. The field tests show that these configurations produce equivalent position estimates. Two particle filters are proposed to implement the map-aided PDR: one filter uses only the PDR and map information, while the other also uses the WLAN positioning. Based on the field tests, map aiding improves the positioning accuracy more than WLAN positioning. Novel map checking algorithms based on the sequential re-selection of obstacle lines are proposed to decrease the computation time required by the indoor map matching. To present the map information, both unstructured and structured obstacle maps are used. The feasibility of the proposed particle filter algorithms for real-time navigation was demonstrated in field tests.
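    The WLAN fingerprinting idea above can be illustrated in its simplest form: match a live received-signal-strength (RSS) vector against a pre-surveyed radio map by nearest neighbour in signal space. The radio-map entries, access-point names, and RSS values below are made up for illustration; the thesis's histogram-based models are considerably richer than this sketch.

```python
import math

# Nearest-neighbour WLAN fingerprinting over a toy radio map. Keys are survey
# positions (x, y) in metres; values map access-point IDs to mean RSS in dBm.
# All entries are illustrative.

radio_map = {
    (0.0, 0.0): {"ap1": -40, "ap2": -70, "ap3": -80},
    (5.0, 0.0): {"ap1": -55, "ap2": -50, "ap3": -75},
    (5.0, 5.0): {"ap1": -70, "ap2": -45, "ap3": -60},
}

def locate(rss):
    """Return the radio-map position whose fingerprint best matches rss."""
    def dist(fp):
        shared = set(fp) & set(rss)            # compare only APs seen in both
        if not shared:
            return float("inf")
        return math.sqrt(sum((fp[ap] - rss[ap]) ** 2 for ap in shared))
    return min(radio_map, key=lambda pos: dist(radio_map[pos]))

print(locate({"ap1": -52, "ap2": -48, "ap3": -72}))  # → (5.0, 0.0)
```

    In the thesis's setting, such a WLAN estimate is one measurement among several: it aids the step-and-heading PDR through an extended Kalman filter or a particle filter rather than serving as the position output on its own.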