
    Evaluating indoor positioning systems in a shopping mall: the lessons learned from the IPIN 2018 competition

    The Indoor Positioning and Indoor Navigation (IPIN) conference holds an annual competition in which indoor localization systems from different research groups worldwide are evaluated empirically. The objective of this competition is to establish a systematic evaluation methodology with rigorous metrics, both for real-time (on-site) and post-processing (off-site) situations, in a realistic environment unfamiliar to the prototype developers. For the IPIN 2018 conference, this competition was held on September 22nd, 2018, in Atlantis, a large shopping mall in Nantes (France). Four competition tracks (two on-site and two off-site) were designed. They consisted of several 1 km routes traversing several floors of the mall. Along these paths, 180 points were topographically surveyed with 10 cm accuracy to serve as ground-truth landmarks, combining theodolite measurements, differential global navigation satellite system (GNSS) and 3D scanner systems. 34 teams effectively competed. The accuracy score corresponds to the third quartile (75th percentile) of an error metric that combines the horizontal positioning error and the floor detection. The best results for the on-site tracks showed an accuracy score of 11.70 m (Track 1) and 5.50 m (Track 2), while the best results for the off-site tracks showed an accuracy score of 0.90 m (Track 3) and 1.30 m (Track 4). These results show that it is possible to obtain high-accuracy indoor positioning solutions in large, realistic environments using wearable, light-weight sensors without deploying any beacons. This paper describes the organization of the tracks, analyzes the methodology used to quantify the results, reviews the lessons learned from the competition and discusses its future.
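The combined error metric described above can be sketched in a few lines. The snippet below is a minimal illustration rather than the official IPIN evaluation code: it adds an assumed 15 m penalty per floor of mismatch to the horizontal error and reports the third quartile; the penalty constant and the sample landmarks are assumptions for illustration only.

```python
# Minimal sketch of the scoring idea described above: horizontal error plus a
# per-floor penalty, reported at the third quartile (75th percentile).
# The 15 m floor penalty and the sample data are assumptions for illustration.
import numpy as np

FLOOR_PENALTY_M = 15.0  # assumed penalty per floor of mismatch

def accuracy_score(est_xy, true_xy, est_floor, true_floor):
    """Third quartile of horizontal error plus a floor-mismatch penalty."""
    est_xy = np.asarray(est_xy, dtype=float)
    true_xy = np.asarray(true_xy, dtype=float)
    horizontal = np.linalg.norm(est_xy - true_xy, axis=1)
    floor_error = FLOOR_PENALTY_M * np.abs(np.asarray(est_floor) - np.asarray(true_floor))
    return np.percentile(horizontal + floor_error, 75)

# Example with three ground-truth landmarks, one of them with a wrong floor
score = accuracy_score(
    est_xy=[(0.0, 0.0), (10.0, 0.0), (20.0, 5.0)],
    true_xy=[(1.0, 0.0), (10.0, 2.0), (20.0, 5.0)],
    est_floor=[0, 1, 1],
    true_floor=[0, 1, 2],
)
print(f"accuracy score (Q3): {score:.2f} m")
```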

    Map matching by using inertial sensors: literature review

    This literature review aims to clarify what is known about map matching using inertial sensors and what the requirements are for map matching, inertial sensors, sensor placement and possible complementary positioning technology. The target is to develop a wearable location system that can position itself within a complex construction environment automatically with the aid of an accurate building model. The wearable location system should work on a tablet computer that runs an augmented reality (AR) solution and is capable of tracking and visualizing 3D-CAD models in the real environment. The wearable location system is needed to support the AR system in initializing the accurate camera pose calculation and in automatically finding the right location in the 3D-CAD model. One type of sensor that does seem applicable to people tracking is the inertial measurement unit (IMU). IMU sensors in aerospace applications, based on laser gyroscopes, are large but provide very accurate position estimates with limited drift. Small and light units such as those based on Micro-Electro-Mechanical Systems (MEMS) sensors are becoming very popular, but they have a significant bias and therefore suffer from large drifts and require a calibration method such as map matching. Such a system requires very little fixed infrastructure, and its monetary cost is proportional to the number of users rather than to the coverage area, as is the case for traditional absolute indoor location systems.
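To make the drift problem concrete, the sketch below integrates an assumed constant accelerometer bias twice and prints how quickly the resulting position error grows; the 100 Hz rate and 0.05 m/s² bias are illustrative values, not figures taken from the review.

```python
# Sketch of MEMS drift: a constant residual accelerometer bias, integrated twice,
# grows into a quadratic position error. Bias and sample rate are assumed values.
import numpy as np

dt = 0.01                         # assumed 100 Hz IMU
bias = 0.05                       # assumed residual accelerometer bias (m/s^2)
t = np.arange(0.0, 60.0, dt)      # one minute while actually standing still

velocity = np.cumsum(np.full_like(t, bias)) * dt   # first integration
position = np.cumsum(velocity) * dt                # second integration

print(f"position drift after 10 s: {position[1000]:.2f} m")
print(f"position drift after 60 s: {position[-1]:.2f} m")
# The error grows roughly as 0.5 * bias * t^2, which is why a relative system
# needs an external correction such as map matching against the building model.
```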

    Integrating Haptic Feedback into Mobile Location Based Services

    Haptics is a feedback technology that takes advantage of the human sense of touch by applying forces, vibrations, and/or motions to a haptic-enabled device such as a mobile phone. Historically, human-computer interaction has been visual: text and images on the screen. Haptic feedback can be an important additional method, especially in Mobile Location Based Services such as knowledge discovery, pedestrian navigation and notification systems. A knowledge discovery system called the Haptic GeoWand is a low-interaction system that allows users to query geo-tagged data around them by using a point-and-scan technique with their mobile device. Haptic Pedestrian is a navigation system for walkers. Four prototypes have been developed, classified according to the user’s guidance requirements, the user type (based on spatial skills), and overall system complexity. Haptic Transit is a notification system that provides spatial information to the users of public transport. In all these systems, haptic feedback is used to convey information about location, orientation, density and distance by use of the vibration alarm with varying frequencies and patterns to help users understand the physical environment. Trials elicited positive responses from the users, who see benefit in being provided with a “heads up” approach to mobile navigation. Results from a memory recall test show that the users of haptic feedback for navigation had better memory recall of the region traversed than the users of landmark images. Haptics integrated into a multi-modal navigation system provides more usable, less distracting and more effective interaction than conventional systems. Enhancements to the current work could include integration of contextual information, detailed large-scale user trials and the exploration of using haptics within confined indoor spaces.
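As a rough illustration of how a vibration alarm can convey distance through varying patterns, the sketch below maps assumed distance bands to on/off pulse timings; the thresholds and durations are invented for illustration and are not taken from the systems described above.

```python
# Assumed mapping from distance-to-target to a vibration pattern: shorter pauses
# and longer pulses as the user gets closer. Thresholds and timings are invented
# for illustration only.
def vibration_pattern(distance_m: float) -> list:
    """Return an [on_ms, off_ms] cadence for the vibration alarm."""
    if distance_m > 100:
        return [100, 1500]   # far away: brief pulse, long pause
    if distance_m > 25:
        return [100, 500]    # getting closer: faster cadence
    return [300, 200]        # near the target: long, insistent pulses

for d in (250, 60, 10):
    print(f"{d:3d} m -> on/off pattern {vibration_pattern(d)} ms")
```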

    Advanced Map Matching Technologies and Techniques for Pedestrian/Wheelchair Navigation

    Due to the constantly increasing technical capabilities of mobile devices (such as smartphones), pedestrian/wheelchair navigation has recently attracted a high level of interest as one of smartphones’ potential mobile applications. While vehicle navigation systems have already reached a certain level of maturity, pedestrian/wheelchair navigation services are still in their infancy. By comparison with vehicle navigation systems, a set of map matching requirements and challenges unique to pedestrian/wheelchair navigation is identified. To provide navigation assistance to pedestrians and wheelchair users, there is a need for the design and development of new map matching techniques. The main goal of this research is to investigate and develop advanced map matching technologies and techniques particular to pedestrian/wheelchair navigation services. As the first step in map matching, an adaptive candidate segment selection algorithm is developed to efficiently find candidate segments. Furthermore, to narrow down the search for the correct segment, advanced mathematical models are applied. GPS-based chain-code map matching, Hidden Markov Model (HMM) map matching, and fuzzy-logic map matching algorithms are developed to estimate the real-time location of users in pedestrian/wheelchair navigation systems/services. Nevertheless, the GPS signal is not always available in areas with high-rise buildings, and even when there is a signal, the accuracy may not be high enough for localization of pedestrians and wheelchair users on sidewalks. To overcome these shortcomings of GPS, multi-sensor integrated map matching algorithms are investigated and developed in this research. These algorithms include a movement pattern recognition algorithm, using accelerometer and compass data, and a vision-based positioning algorithm to fill in signal gaps in GPS positioning. Experiments are conducted to evaluate the developed algorithms using real field test data (GPS coordinates and other sensor data). The experimental results show that the developed algorithms and the integrated sensors, i.e., monocular visual odometry, GPS, an accelerometer, and a compass, can provide high-quality and uninterrupted localization services in pedestrian/wheelchair navigation systems/services. The map matching techniques developed in this work can be applied to various pedestrian/wheelchair navigation applications, such as tracking senior citizens and children or tourist service systems, and can be further utilized in building walking robots and automatic wheelchair navigation systems.
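Of the algorithms listed above, the HMM approach is the easiest to sketch in isolation. The snippet below is an assumed, minimal illustration of HMM map matching solved by a Viterbi pass: hidden states are sidewalk segments, emission scores fall off with the distance from a GPS fix to a segment, and transitions favour staying on or moving to a connected segment. The toy network, the Gaussian emission with sigma = 5 m, and the transition weights are illustrative choices, not the dissertation’s algorithm.

```python
# Assumed, minimal HMM map matching sketch: hidden states are sidewalk segments,
# the emission score falls off with the point-to-segment distance, and transitions
# favour staying on or moving to a connected segment. Solved with a Viterbi pass.
import numpy as np

def point_to_segment_dist(p, a, b):
    p, a, b = map(np.asarray, (p, a, b))
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def viterbi_match(fixes, segments, adjacency, sigma=5.0):
    n = len(segments)
    emit = lambda p: np.array(
        [np.exp(-0.5 * (point_to_segment_dist(p, *s) / sigma) ** 2) for s in segments])
    trans = np.full((n, n), 1e-3)               # small chance of jumping anywhere
    for i, neighbours in adjacency.items():
        trans[i, i] = 1.0                       # prefer staying on the same segment
        for j in neighbours:
            trans[i, j] = 0.5                   # moving to a connected segment
    prob, back = emit(fixes[0]), []
    for p in fixes[1:]:
        scores = prob[:, None] * trans * emit(p)[None, :]
        back.append(scores.argmax(axis=0))      # best previous segment per segment
        prob = scores.max(axis=0)
    path = [int(prob.argmax())]
    for bp in reversed(back):                   # backtrack the most likely sequence
        path.append(int(bp[path[-1]]))
    return path[::-1]

# Toy sidewalk network: two connected segments forming an L shape
segments = [((0, 0), (10, 0)), ((10, 0), (10, 10))]
adjacency = {0: [1], 1: [0]}
fixes = [(2, 1), (6, -1), (9, 2), (10.5, 6)]
print(viterbi_match(fixes, segments, adjacency))   # matched segment index per fix
```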

    Tactile Displays for Pedestrian Navigation

    Existing pedestrian navigation systems are mainly visual-based, sometimes with the addition of audio guidance. However, previous research has reported that visual-based navigation systems require a high level of cognitive effort, contributing to errors and delays. Furthermore, in many situations a person’s visual and auditory channels may be compromised due to environmental factors or may be occupied by other important tasks. Some research has suggested that the tactile sense can effectively be used in interfaces to support navigation tasks. However, many fundamental design and usability issues with pedestrian tactile navigation displays are yet to be investigated. This dissertation investigates human-computer interaction aspects associated with the design of tactile pedestrian navigation systems. More specifically, it addresses the following questions: What may be appropriate forms of wearable devices? What types of spatial information should such systems provide to pedestrians? How do people use spatial information for different navigation purposes? How can we effectively represent such information via tactile stimuli? And how do tactile navigation systems perform? A series of empirical studies was carried out to (1) investigate the effects of tactile signal properties and manipulation on the human perception of spatial data, (2) find out the effective form of wearable displays for navigation tasks, and (3) explore a number of potential tactile representation techniques for spatial data, specifically representing directions and landmarks. Questionnaires and interviews were used to gather information on the use of landmarks amongst people navigating urban environments for different purposes. Analysis of the results of these studies provided implications for the design of tactile pedestrian navigation systems, which we incorporated in a prototype. Finally, field trials were carried out to evaluate the design and address usability issues and performance-related benefits and challenges. The thesis develops an understanding of how to represent spatial information via the tactile channel and provides suggestions for the design and implementation of tactile pedestrian navigation systems. In addition, the thesis classifies the use of various types of landmarks for different navigation purposes. These contributions are developed throughout the thesis, building upon an integrated series of empirical studies.
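One common way to represent direction via tactile stimuli, sketched below under the assumption of an eight-tactor belt worn around the waist (not necessarily the form factor studied in this thesis), is to activate the motor whose sector contains the bearing to the next waypoint.

```python
# Assumed eight-tactor waist belt: activate the motor whose 45-degree sector
# contains the bearing to the next waypoint (0 degrees = straight ahead).
def tactor_for_bearing(bearing_deg: float, n_tactors: int = 8) -> int:
    """Return the index of the tactor closest to the given bearing."""
    sector = 360.0 / n_tactors
    return int(((bearing_deg % 360.0) + sector / 2) // sector) % n_tactors

for bearing in (0, 30, 90, 180, 350):
    print(f"bearing {bearing:3d} deg -> tactor {tactor_for_bearing(bearing)}")
```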

    A Conceptual Model of Exploration Wayfinding: An Integrated Theoretical Framework and Computational Methodology

    This thesis is an attempt to integrate contending cognitive approaches to modeling wayfinding behavior. The primary goal is to create a plausible model for exploration tasks within indoor environments. This conceptual model can be extended for practical applications in design, planning, and the social sciences. Using empirical evidence, a cognitive schema is designed that accounts for perceptual and behavioral preferences in pedestrian navigation. Using this schema as a guiding framework, network analysis and space syntax act as computational methods to simulate human exploration wayfinding in unfamiliar indoor environments. The conceptual model is then implemented in two ways: first, by directly updating existing agent-based modeling software; and second, by using a spatial interaction model that distributes visual attraction and movement permeability across a graph representation of building floor plans.
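As a rough sketch of the kind of graph computation involved, the snippet below scores a toy floor-plan graph with a simple integration-style measure (the inverse of mean shortest-path depth) in the spirit of space syntax; the example graph, the measure, and the use of networkx are illustrative assumptions rather than the thesis implementation.

```python
# Toy space-syntax-style measure on an assumed floor-plan graph: nodes are spaces,
# edges are direct connections, and the score is the inverse of a node's mean
# shortest-path depth to all other spaces (higher = more integrated).
import networkx as nx

plan = nx.Graph([("lobby", "corridor"), ("corridor", "room_a"),
                 ("corridor", "room_b"), ("room_b", "stair"), ("stair", "lobby")])

def integration(graph):
    n = graph.number_of_nodes()
    scores = {}
    for node in graph:
        depths = nx.single_source_shortest_path_length(graph, node)
        scores[node] = (n - 1) / sum(depths.values())   # inverse mean depth
    return scores

for space, score in sorted(integration(plan).items(), key=lambda kv: -kv[1]):
    print(f"{space:10s} integration ~ {score:.2f}")
```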