
    Evaluating indoor positioning systems in a shopping mall: the lessons learned from the IPIN 2018 competition

    The Indoor Positioning and Indoor Navigation (IPIN) conference holds an annual competition in which indoor localization systems from different research groups worldwide are evaluated empirically. The objective of this competition is to establish a systematic evaluation methodology with rigorous metrics both for real-time (on-site) and post-processing (off-site) situations, in a realistic environment unfamiliar to the prototype developers. For the IPIN 2018 conference, this competition was held on September 22nd, 2018, in Atlantis, a large shopping mall in Nantes (France). Four competition tracks (two on-site and two off-site) were designed. They consisted of several 1 km routes traversing several floors of the mall. Along these paths, 180 points were topographically surveyed with a 10 cm accuracy, to serve as ground truth landmarks, combining theodolite measurements, differential global navigation satellite system (GNSS) and 3D scanner systems. In total, 34 teams competed. The accuracy score corresponds to the third quartile (75th percentile) of an error metric that combines the horizontal positioning error and the floor detection. The best results for the on-site tracks showed an accuracy score of 11.70 m (Track 1) and 5.50 m (Track 2), while the best results for the off-site tracks showed an accuracy score of 0.90 m (Track 3) and 1.30 m (Track 4). These results showed that it is possible to obtain high-accuracy indoor positioning solutions in large, realistic environments using lightweight wearable sensors without deploying any beacon. This paper describes the organization of the tracks, analyzes the methodology used to quantify the results, reviews the lessons learned from the competition and discusses its future.
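    The accuracy score described above can be sketched in a few lines. The abstract only says the metric combines horizontal error with floor detection; the 15 m-per-misdetected-floor penalty used below is the figure from past EvAAL/IPIN editions and is an assumption here, not taken from this paper.

    ```python
    import numpy as np

    FLOOR_PENALTY_M = 15.0  # assumed per-floor penalty (EvAAL/IPIN convention)

    def accuracy_score(horizontal_err_m, floor_est, floor_true,
                       floor_penalty_m=FLOOR_PENALTY_M):
        """Third quartile (75th percentile) of the combined error metric."""
        horizontal_err_m = np.asarray(horizontal_err_m, dtype=float)
        floor_err = np.abs(np.asarray(floor_est) - np.asarray(floor_true))
        combined = horizontal_err_m + floor_penalty_m * floor_err
        return float(np.percentile(combined, 75))

    # toy example: four ground-truth points, one estimate on the wrong floor
    score = accuracy_score([1.0, 2.0, 3.0, 4.0], [0, 0, 1, 0], [0, 0, 0, 0])
    ```

    With a correct floor everywhere, the score reduces to the 75th percentile of the horizontal errors alone.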

    Indoor navigation for the visually impaired: enhancements through utilisation of the Internet of Things and deep learning

    Wayfinding and navigation are essential aspects of independent living that heavily rely on the sense of vision. Walking in a complex building requires knowing one's exact location to find a suitable path to the desired destination, avoiding obstacles and monitoring orientation and movement along the route. People who do not have access to sight-dependent information, such as that provided by signage, maps and environmental cues, can encounter challenges in achieving these tasks independently. They can rely on assistance from others or maintain their independence by using assistive technologies and the resources provided by smart environments. Several solutions have adapted technological innovations to address indoor navigation over the last few years. However, there remains a significant lack of a complete solution that meets the navigation requirements of visually impaired (VI) people. A single technology cannot resolve all the navigation difficulties faced. A hybrid solution using Internet of Things (IoT) devices and deep learning techniques to discern the patterns of an indoor environment may help VI people gain confidence to travel independently. This thesis aims to improve the independence and enhance the journey of VI people in an indoor setting with the proposed framework, using a smartphone. The thesis proposes a novel framework, Indoor-Nav, to provide a VI-friendly path to avoid obstacles and predict the user's position. The components include Orth-PATH, Blue Dot for VI People (BVIP), and a deep learning-based indoor positioning model. The work establishes a novel collision-free pathfinding algorithm, Orth-PATH, to generate a VI-friendly path via sensing a grid-based indoor space. Further, to ensure correct movement, with the use of beacons and a smartphone, BVIP monitors the movements and relative position of the moving user.
    In dark areas without external devices, the research tests the feasibility of using sensory information from a smartphone with a pre-trained regression-based deep learning model to predict the user's absolute position. The work accomplishes a diverse range of simulations and experiments to confirm the performance and effectiveness of the proposed framework and its components. The results show that Indoor-Nav is the first pathfinding framework of its kind, generating paths that reflect the needs of VI people. The approach designs a path alongside walls, avoiding obstacles, and this research benchmarks the approach with other popular pathfinding algorithms. Further, this research develops a smartphone-based application to test the trajectories of a moving user in an indoor environment.
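    The abstract does not give Orth-PATH's internals, so the following is only an illustrative sketch of one way a collision-free, wall-hugging path could be computed on a grid map: a plain Dijkstra search (the function name and cost values are invented) in which free cells adjacent to a wall are cheaper than open-space cells, so the cheapest route naturally runs alongside walls.

    ```python
    import heapq

    def wall_following_path(grid, start, goal, open_cost=3.0, wall_cost=1.0):
        """Dijkstra on a 4-connected grid (1 = wall/obstacle, 0 = free).
        Free cells adjacent to a wall cost less, so shortest paths hug walls."""
        rows, cols = len(grid), len(grid[0])
        moves = ((1, 0), (-1, 0), (0, 1), (0, -1))

        def near_wall(r, c):
            return any(0 <= r + dr < rows and 0 <= c + dc < cols
                       and grid[r + dr][c + dc] == 1 for dr, dc in moves)

        frontier = [(0.0, start)]
        came_from, dist = {start: None}, {start: 0.0}
        while frontier:
            d, (r, c) = heapq.heappop(frontier)
            if (r, c) == goal:
                break
            for dr, dc in moves:
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                    nd = d + (wall_cost if near_wall(nr, nc) else open_cost)
                    if nd < dist.get((nr, nc), float("inf")):
                        dist[(nr, nc)] = nd
                        came_from[(nr, nc)] = (r, c)
                        heapq.heappush(frontier, (nd, (nr, nc)))
        # reconstruct the path from goal back to start
        path, node = [], goal
        while node is not None:
            path.append(node)
            node = came_from[node]
        return path[::-1]
    ```

    On a map with a single obstacle, the returned route stays in cells bordering the obstacle or the outer area while never entering a wall cell; tuning `open_cost` relative to `wall_cost` controls how strongly the path is pulled toward walls.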

    An Indoor Navigation System Using a Sensor Fusion Scheme on Android Platform

    With the development of wireless communication networks, smart phones have become a necessity for people's daily lives, and they meet not only users' needs for basic functions such as sending a message or making a phone call, but also their demands for entertainment, surfing the Internet and socializing. Navigation functions are commonly used; however, navigation is usually based on GPS (Global Positioning System) in outdoor environments, whereas a number of applications need to navigate indoors. This paper presents a system to achieve highly accurate indoor navigation based on the Android platform. To do this, we design a sensor fusion scheme for our system. We divide the system into three main modules: a distance measurement module, an orientation detection module and a position update module. In the distance measurement module, we use an efficient method to estimate stride length and use the step sensor to count steps. In the orientation detection module, in order to get the optimal orientation result, we introduce a Kalman filter to de-noise the data collected from the different sensors. In the last module, we combine the data from the previous modules and calculate the current location. Results of experiments show that our system works well and has high accuracy in indoor situations.
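    As a hedged illustration of the Kalman-filter fusion step (the paper's exact state model and noise parameters are not given in this abstract), a minimal one-dimensional filter can blend a gyroscope's heading rate, used for prediction, with noisy magnetometer headings, used for correction. All names and parameter values below are invented for the example.

    ```python
    def fuse_heading(gyro_rates, mag_headings, dt=0.02,
                     q=0.01, r=4.0, theta0=0.0, p0=1.0):
        """1-D Kalman filter: predict heading (deg) with the gyroscope rate
        (deg/s), then correct with the noisy magnetometer reading (deg)."""
        theta, p = theta0, p0
        estimates = []
        for rate, z in zip(gyro_rates, mag_headings):
            # predict: integrate the gyro rate; process noise grows uncertainty
            theta += rate * dt
            p += q
            # update: blend in the magnetometer measurement via the Kalman gain
            k = p / (p + r)
            theta += k * (z - theta)
            p *= 1.0 - k
            estimates.append(theta)
        return estimates
    ```

    For a stationary user (zero gyro rate) and magnetometer readings around 90°, the estimate converges smoothly toward 90° instead of jumping with each noisy reading; the large measurement variance `r` relative to `q` is what produces the smoothing.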

    Multi sensor system for pedestrian tracking and activity recognition in indoor environments

    The widespread use of mobile devices and the rise of Global Navigation Satellite Systems (GNSS) have allowed mobile tracking applications to become very popular and valuable in outdoor environments. However, tracking pedestrians in indoor environments with Global Positioning System (GPS)-based schemes is still very challenging. Along with indoor tracking, the ability to recognize pedestrian behavior and activities can lead to considerable growth in location-based applications, including pervasive healthcare, leisure and guide services (e.g., in hospitals, museums and airports) and emergency services, among the most important ones. This paper presents a system for pedestrian tracking and activity recognition in indoor environments using exclusively common off-the-shelf sensors embedded in smartphones (accelerometer, gyroscope, magnetometer and barometer). The proposed system combines the knowledge found in biomechanical patterns of the human body while accomplishing basic activities, such as walking or climbing stairs up and down, along with identifiable signatures that certain indoor locations (such as turns or elevators) introduce on sensing data. The system was implemented and tested on Android-based mobile phones. The system detects and counts steps with an accuracy of 97% and 96.67% on flat floors and stairs, respectively; detects user changes of direction and altitude with 98.88% and 96.66% accuracy, respectively; and recognizes the proposed human activities with 95% accuracy. All modules combined lead to a total tracking accuracy of 91.06% in common indoor human displacement.
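    A common baseline for the step detection and counting described above (not necessarily the authors' exact method) is peak detection on the accelerometer magnitude: a step is a local maximum above an amplitude threshold, with a refractory period to suppress double counts. The threshold, sampling rate and interval below are illustrative values only.

    ```python
    import math

    def count_steps(acc_xyz, fs=50.0, threshold=11.0, min_interval_s=0.3):
        """Count steps as peaks in the accelerometer magnitude (m/s^2) that
        exceed `threshold` and are at least `min_interval_s` apart."""
        mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in acc_xyz]
        min_gap = int(min_interval_s * fs)  # refractory period in samples
        steps, last_peak = 0, -min_gap
        for i in range(1, len(mags) - 1):
            is_peak = (mags[i] > threshold
                       and mags[i] >= mags[i - 1] and mags[i] > mags[i + 1])
            if is_peak and i - last_peak >= min_gap:
                steps += 1
                last_peak = i
        return steps
    ```

    On a synthetic 2 Hz walking signal (gravity plus a sinusoidal vertical bounce), the counter finds one step per oscillation; on real data the threshold would need per-user calibration, which is one reason the paper's biomechanics-aware approach outperforms fixed-threshold baselines.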