
    Technologies and solutions for location-based services in smart cities: past, present, and future

    Location-based services (LBS) in smart cities have drastically altered the way cities operate, giving a new dimension to the life of citizens. LBS rely on the location of a device, with proximity estimation at their core. The applications of LBS range from social networking and marketing to vehicle-to-everything communications. In many of these applications, there is a growing need to learn the physical distance between nearby devices. This paper elaborates upon the current needs of proximity estimation in LBS and compares them against the available Localization and Proximity (LP) finding technologies (LP technologies for short). These technologies are compared in terms of accuracy and performance across several parameters, including latency, energy consumption, security, complexity, and throughput. A classification of these technologies, organized by smart city application, is then presented. Finally, we discuss some emerging LP technologies that enable proximity estimation in LBS and outline future research directions.
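    Proximity estimation in many LBS boils down to converting a radio measurement into a distance. As an illustrative sketch only (not code from the paper), the snippet below maps a received-signal-strength reading to a distance using the standard log-distance path-loss model; the 1 m reference power and the path-loss exponent are assumed calibration values.

```python
def rssi_to_distance(rssi_dbm: float,
                     tx_power_dbm: float = -59.0,
                     path_loss_exp: float = 2.0) -> float:
    """Estimate the distance (in metres) implied by an RSSI reading.

    Log-distance path-loss model: RSSI = tx_power - 10 * n * log10(d),
    where tx_power is the expected RSSI at 1 m and n is the path-loss
    exponent. Both are environment-dependent and assumed here.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))


# Example: a -75 dBm reading corresponds to roughly 6.3 m under these assumptions.
print(round(rssi_to_distance(-75.0), 1))
```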

    Joint received signal strength, angle-of-arrival, and time-of-flight positioning

    This paper presents a software positioning framework that can jointly use measured values of three parameters: the received signal strength, the angle-of-arrival, and the time-of-flight of the wireless signals. Based on experimentally determined measurement accuracies for these three parameters, results of a realistic simulation scenario are presented. It is shown that, for the given configuration, angle-of-arrival and received signal strength measurements benefit from a hybrid system that combines both. Thanks to their higher accuracy, time-of-flight systems perform significantly better and gain less added value from a combination with the other two parameters.
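    A common way to fuse such heterogeneous measurements is a weighted nonlinear least-squares estimator in which each residual is scaled by the accuracy of its measurement type. The sketch below is an illustration under assumed anchor positions and noise levels, not the paper's framework; it fuses time-of-flight ranges with angle-of-arrival bearings (received signal strength could be added analogously through a path-loss model).

```python
import numpy as np
from scipy.optimize import least_squares

# Known anchor positions in metres; values are assumed for illustration.
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])

def residuals(p, tof_ranges, aoa_bearings, sigma_r=0.3, sigma_a=np.deg2rad(5)):
    """Stack range and bearing residuals, each weighted by its measurement std.

    tof_ranges   : time-of-flight ranges to each anchor (metres)
    aoa_bearings : angles-of-arrival measured at each anchor (radians)
    """
    d = p - anchors                                   # anchor -> estimate vectors
    r_res = (np.linalg.norm(d, axis=1) - tof_ranges) / sigma_r
    a_res = np.arctan2(d[:, 1], d[:, 0]) - aoa_bearings
    a_res = np.arctan2(np.sin(a_res), np.cos(a_res)) / sigma_a  # wrap to [-pi, pi]
    return np.concatenate([r_res, a_res])

# Synthetic, noise-free measurements for a device at (4, 3), for demonstration.
true_p = np.array([4.0, 3.0])
diff = true_p - anchors
tof = np.linalg.norm(diff, axis=1)
aoa = np.arctan2(diff[:, 1], diff[:, 0])

est = least_squares(residuals, x0=np.array([5.0, 5.0]), args=(tof, aoa))
print(est.x)   # converges close to (4, 3)
```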

    Fireground location understanding by semantic linking of visual objects and building information models

    This paper presents an outline for improved localization and situational awareness in fire emergency situations, based on semantic technology and computer vision techniques. The novelty of our methodology lies in the semantic linking of video object recognition results from visual and thermal cameras with Building Information Models (BIM). The current limitations and possibilities of certain building information streams in the context of fire safety or fire incident management are addressed in this paper. Furthermore, our data management tools match higher-level semantic metadata descriptors of BIM with deep-learning-based visual object recognition and classification networks. Based on these matches, estimates of camera, object, and event positions can be generated in the BIM model, transforming it from a static source of information into a rich, dynamic data provider. Previous work has already investigated the possibility of linking BIM and low-cost point sensors for fireground understanding, but these approaches did not take into account the benefits of video analysis and recent developments in semantics and feature learning research. Finally, the strengths of the proposed approach compared to the state of the art are its (semi-)automatic workflow, its generic and modular setup, and its multi-modal strategy, which make it possible to automatically create situational awareness, improve localization, and facilitate overall fire understanding.
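    To make the semantic-linking idea concrete, here is a deliberately simplified, hypothetical sketch (not the authors' toolchain): detector labels are aligned with BIM/IFC element classes through a small mapping, so that a detection can be tied to candidate building elements whose modelled coordinates then constrain camera and event positions. A real system would express this alignment as an ontology mapping rather than a hard-coded dictionary.

```python
from dataclasses import dataclass

@dataclass
class BimElement:
    # Hypothetical, heavily simplified stand-in for a BIM/IFC element record.
    guid: str
    ifc_class: str          # e.g. "IfcDoor", "IfcFireSuppressionTerminal"
    room: str

# Hypothetical mapping from detector labels to IFC classes.
LABEL_TO_IFC = {
    "door": "IfcDoor",
    "window": "IfcWindow",
    "fire_extinguisher": "IfcFireSuppressionTerminal",
}

def link_detection(label: str, bim_elements: list[BimElement]) -> list[BimElement]:
    """Return BIM elements whose IFC class matches a visual detection label."""
    ifc_class = LABEL_TO_IFC.get(label)
    return [e for e in bim_elements if e.ifc_class == ifc_class]

elements = [
    BimElement("2N1cfE", "IfcDoor", "corridor_1"),
    BimElement("0kQz7a", "IfcFireSuppressionTerminal", "room_104"),
]
# A visual/thermal detection labelled "door" is linked to candidate door
# elements; their modelled coordinates then constrain camera and event positions.
print(link_detection("door", elements))
```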