367 research outputs found

    Map matching by using inertial sensors: literature review

    This literature review aims to clarify what is known about map matching using inertial sensors and what the requirements are for map matching, inertial sensors, their placement, and possible complementary positioning technology. The target is to develop a wearable location system that can position itself within a complex construction environment automatically with the aid of an accurate building model. The wearable location system should run on a tablet computer executing an augmented reality (AR) solution capable of tracking and visualizing 3D-CAD models in the real environment. The wearable location system is needed to support the AR system in initializing the accurate camera pose calculation and in automatically finding the right location in the 3D-CAD model. One type of sensor that does seem applicable to people tracking is the inertial measurement unit (IMU). IMU sensors in aerospace applications, based on laser gyroscopes, are large but provide very accurate position estimation with limited drift. Small and light units, such as those based on Micro-Electro-Mechanical Systems (MEMS) sensors, are becoming very popular, but they have a significant bias and therefore suffer from large drifts and require a calibration method such as map matching. The system requires very little fixed infrastructure; the monetary cost is proportional to the number of users rather than to the coverage area, as is the case for traditional absolute indoor location systems.
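
    A minimal sketch (not taken from the review; the corridor layout, bias value, and step length are illustrative assumptions) of why an uncalibrated MEMS gyro bias causes a growing position drift, and how a simple map-matching step that snaps the dead-reckoned position onto the nearest corridor segment of a building model can bound that drift:

```python
import numpy as np

def dead_reckon(gyro_rates, step_len=0.7, gyro_bias=0.01, dt=1.0):
    """Integrate heading-rate readings corrupted by a constant bias (rad/s)
    into a 2D step-and-heading track; the drift grows with time."""
    heading, pos, track = 0.0, np.zeros(2), []
    for rate in gyro_rates:
        heading += (rate + gyro_bias) * dt            # bias accumulates in heading
        pos = pos + step_len * np.array([np.cos(heading), np.sin(heading)])
        track.append(pos.copy())
    return np.array(track)

def snap_to_corridor(p, segments):
    """Map matching in its simplest form: project the drifted position onto
    the closest corridor segment taken from the building model."""
    best, best_d = p, np.inf
    for a, b in segments:
        a, b = np.asarray(a, float), np.asarray(b, float)
        t = np.clip(np.dot(p - a, b - a) / np.dot(b - a, b - a), 0.0, 1.0)
        q = a + t * (b - a)                            # closest point on segment
        d = np.linalg.norm(p - q)
        if d < best_d:
            best, best_d = q, d
    return best

# Illustrative corridor layout (two perpendicular hallways) and a straight walk.
corridors = [((0, 0), (50, 0)), ((50, 0), (50, 30))]
track = dead_reckon(np.zeros(40))                      # walker intends to go straight
matched = np.array([snap_to_corridor(p, corridors) for p in track])
print("raw drift (m):     ", np.linalg.norm(track[-1] - [28.0, 0.0]))
print("after matching (m):", np.linalg.norm(matched[-1] - [28.0, 0.0]))
```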

    Filtering and Tracking for Pedestrian Dead-Reckoning System.

    This thesis proposes a leader-follower system in which a robot, equipped with relatively sophisticated sensors, tracks and follows a human who is equipped with a low-fidelity odometry sensor called a Pedestrian Dead-Reckoning (PDR) device. Such a system is useful for "pack mule" applications, where the robot carries heavy loads for the human. The proposed system does not depend on GPS, which can be jammed or obstructed. This human-following capability is made possible by several novel contributions. First, we perform an in-depth analysis of our PDR system with the Unscented Kalman Filter (UKF) and models of varying complexity. We propose an extension that limits elevation errors and show that our proposed method reduces errors by 63% compared to a baseline method. We also propose a method for integrating magnetometers into the PDR framework, which automatically and opportunistically calibrates for hard/soft-iron effects and sensor misalignments. In a series of large-scale experiments, we show that this system achieves positional errors of less than 1.9% of the distance traveled. Finally, we propose methods that allow a robot to use LIDAR data to improve the accuracy of its estimate of the human's trajectory. These methods include 1) a particle filter method and 2) two multi-hypothesis maximum-likelihood approaches based on stochastic gradient descent optimization. We show that the proposed approaches are able to track human trajectories in several synthetic and real-world datasets.
    PhD, Mechanical Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/113500/1/suratkw_1.pd
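
    The hard/soft-iron calibration mentioned above can be illustrated with a deliberately simplified sketch. This is not the thesis's opportunistic method; the per-axis min/max approach, the diagonal-only soft-iron model, and all sample values are assumptions for illustration:

```python
import numpy as np

def calibrate_hard_soft_iron(mag_samples):
    """Coarse magnetometer calibration from a batch of raw readings (N x 3)
    collected while rotating the sensor through many orientations.
    Hard-iron effects appear as a constant offset of the measurement
    ellipsoid; soft-iron effects are approximated here as per-axis scaling."""
    mag = np.asarray(mag_samples, dtype=float)
    offset = (mag.max(axis=0) + mag.min(axis=0)) / 2.0   # hard-iron bias
    radii = (mag.max(axis=0) - mag.min(axis=0)) / 2.0    # per-axis radii
    scale = radii.mean() / radii                          # diagonal soft-iron correction
    return offset, scale

def apply_calibration(raw, offset, scale):
    """Map a raw reading back onto a roughly origin-centred sphere."""
    return (np.asarray(raw, float) - offset) * scale

# Example: synthetic readings with a known bias and axis squashing.
rng = np.random.default_rng(0)
field = rng.normal(size=(500, 3))
field /= np.linalg.norm(field, axis=1, keepdims=True)     # points on a unit sphere
raw = field * np.array([1.0, 0.8, 1.2]) + np.array([20.0, -5.0, 3.0])
offset, scale = calibrate_hard_soft_iron(raw)
print(offset.round(2), scale.round(2))   # offset is recovered near [20, -5, 3]
```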

    Communication-based UAV Swarm Missions

    Unmanned aerial vehicles (UAVs) have developed rapidly in recent years due to technological advances. UAV technology can be applied to a wide range of applications in surveillance, rescue, agriculture, and transport. The problems that arise in these areas can be mitigated by combining clusters of drones with several technologies. For example, when a swarm of drones is under attack, it may not be able to obtain the position feedback provided by the Global Positioning System (GPS). This poses a new challenge for the UAV swarm in fulfilling a specific mission. This thesis aims to use as few sensors as possible on the UAVs and to design the smallest possible information transfer between the UAVs in order to maintain the shape of the UAV formation in flight and follow a predetermined trajectory. The thesis presents Extended Kalman Filter methods for navigating autonomously in a GPS-denied environment. The UAV formation control and distributed communication methods are also discussed in detail.
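
    A minimal sketch of the Extended Kalman Filter predict/update cycle in this kind of GPS-denied setting. The constant-velocity model and the range-to-neighbour measurement (with the neighbour's position shared over the swarm's communication link) are assumptions for illustration, not the thesis's exact formulation:

```python
import numpy as np

def ekf_predict(x, P, dt, q=0.1):
    """Constant-velocity prediction for state x = [px, py, vx, vy]."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    Q = q * np.diag([dt**2, dt**2, dt, dt])          # simple process noise
    return F @ x, F @ P @ F.T + Q

def ekf_update_range(x, P, z, neighbour, r=0.5):
    """Update with a noisy range measurement to a neighbouring UAV."""
    d = x[:2] - neighbour
    rng = np.linalg.norm(d)
    H = np.array([[d[0] / rng, d[1] / rng, 0.0, 0.0]])   # Jacobian of ||p - n||
    S = H @ P @ H.T + r
    K = P @ H.T / S                                      # Kalman gain (scalar innovation)
    x = x + (K * (z - rng)).ravel()
    P = (np.eye(4) - K @ H) @ P
    return x, P

x, P = np.array([0.0, 0.0, 1.0, 0.0]), np.eye(4)
neighbour = np.array([10.0, 5.0])
for _ in range(20):
    x, P = ekf_predict(x, P, dt=0.1)
    z = np.linalg.norm(x[:2] - neighbour) + np.random.normal(0, 0.5)
    x, P = ekf_update_range(x, P, z, neighbour)
print(x.round(2))
```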

    Sensor Fusion for Localization of Automated Guided Vehicles

    Automated Guided Vehicles (AGVs) need to localize themselves reliably in order to perform their tasks efficiently. To that end, they rely on noisy sensor measurements that can yield erroneous location estimates if used directly. To prevent this issue, measurements from different kinds of sensors are generally used together. This thesis presents a Kalman Filter based sensor fusion approach that is able to function with asynchronous measurements from laser scanners, odometry, and Inertial Measurement Units (IMUs). The method uses general kinematic equations for state prediction that work with any type of vehicle kinematics and utilizes state augmentation to estimate gyroscope and accelerometer biases. The developed algorithm was tested with an open-source multisensor navigation dataset and in real-time experiments with an AGV. In both sets of experiments, scenarios in which the laser scanner was fully available, partially available, or not available were compared. It was found that using sensor fusion resulted in a smaller deviation from the actual trajectory than using only a laser scanner. Furthermore, in each experiment, sensor fusion decreased the localization error in the time periods where the laser was unavailable, although the amount of improvement depended on the duration of unavailability and the motion characteristics.
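
    The asynchronous-fusion pattern described above can be sketched roughly as follows: a one-dimensional toy filter whose state is augmented with an accelerometer bias, where each measurement carries its own timestamp and the filter simply predicts up to that time before applying whichever update is available. Sensor names, rates, noise values, and the 1D model are assumptions for illustration, not the thesis's actual design:

```python
import numpy as np

# State x = [position, velocity, accelerometer bias]; the accelerometer drives
# the prediction and its bias is estimated via state augmentation. "laser"
# supplies position fixes when available, "odom" supplies velocity.
H = {"laser": np.array([[1.0, 0.0, 0.0]]),    # measures position
     "odom":  np.array([[0.0, 1.0, 0.0]])}    # measures velocity
R = {"laser": 0.05, "odom": 0.2}

def predict(x, P, accel, dt, q=0.01):
    F = np.array([[1, dt, -0.5 * dt**2],
                  [0, 1,  -dt         ],
                  [0, 0,   1          ]], dtype=float)
    u = np.array([0.5 * dt**2, dt, 0.0]) * accel   # bias enters through F with a minus sign
    Q = q * np.diag([dt**3, dt, dt])
    return F @ x + u, F @ P @ F.T + Q

def update(x, P, z, sensor):
    Hs, r = H[sensor], R[sensor]
    S = Hs @ P @ Hs.T + r
    K = P @ Hs.T / S
    x = x + (K * (z - Hs @ x)).ravel()
    P = (np.eye(3) - K @ Hs) @ P
    return x, P

# Measurements arrive asynchronously with their own timestamps.
events = [(0.10, "odom", 0.55), (0.15, "laser", 0.06), (0.30, "odom", 0.61)]
x, P, t = np.zeros(3), np.eye(3), 0.0
for stamp, sensor, z in sorted(events):
    x, P = predict(x, P, accel=0.5, dt=stamp - t)   # predict to the measurement time
    x, P = update(x, P, z, sensor)                  # then apply the matching update
    t = stamp
print(x.round(3))
```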