Mobility increases localizability: A survey on wireless indoor localization using inertial sensors
Wireless indoor positioning has been extensively studied for the past two decades and has continuously attracted growing research effort in the mobile computing context. With the integration of multiple inertial sensors (e.g., accelerometer, gyroscope, and magnetometer) into today's smartphones, human-centric mobility sensing is emerging and coming into vogue. Mobility information, as a new dimension in addition to wireless signals, can benefit localization in a number of ways, since location and mobility are by nature related in the physical world. In this article, we survey this new trend of mobility enhancing smartphone-based indoor localization. Specifically, we first study how to measure human mobility: what types of sensors we can use and what types of mobility information we can acquire. Next, we discuss how mobility assists localization with respect to enhancing location accuracy, decreasing deployment cost, and enriching location context. Moreover, considering the quality and cost of smartphone built-in sensors, handling measurement errors is essential and accordingly investigated. Combining existing work and our own working experience, we emphasize the principles and conduct a comparative study of the mainstream technologies. Finally, we conclude this survey by addressing future research directions and opportunities in this new and largely open area.
Positioning Techniques with Smartphone Technology: Performances and Methodologies in Outdoor and Indoor Scenarios
Smartphone technology is widespread both in academia and in the commercial world. Almost everyone today has a smartphone in their pocket, used not only to call other people but also to share one's location on social networks or to plan activities. Today, with a smartphone, we can compute our position using the sensors embedded in the device, which may include accelerometers, gyroscopes, magnetometers (teslameters), proximity sensors, a barometer, and a GPS/GNSS chipset. In this chapter we analyze the state of the art of positioning with smartphone technology, considering both outdoor and indoor scenarios. Particular attention is paid to the latter, where accuracy can be improved by fusing information coming from more than one sensor. In particular, we investigate an innovative image recognition based (IRB) positioning method, particularly useful in GNSS-denied environments, taking into account the two main problems that arise when IRB positioning methods are considered: the first is battery optimization, which implies minimizing the frame rate; the second is the latency due to image processing for visual search solutions, driven by the size of the database of 3D environment images.
It's the Human that Matters: Accurate User Orientation Estimation for Mobile Computing Applications
Ubiquity of Internet-connected and sensor-equipped portable devices sparked a
new set of mobile computing applications that leverage the proliferating
sensing capabilities of smart-phones. For many of these applications, accurate
estimation of the user heading, as compared to the phone heading, is of
paramount importance. This is of special importance for many crowd-sensing
applications, where the phone can be carried in arbitrary positions and
orientations relative to the user body. Current state-of-the-art approaches
focus mainly on estimating the phone orientation, require the phone to be
placed in a particular position, require user intervention, and/or do not work
accurately indoors, which limits their ubiquitous usability in different
applications. In
this paper we present Humaine, a novel system to reliably and accurately
estimate the user orientation relative to the Earth coordinate system.
Humaine requires neither prior configuration nor user intervention and works
accurately indoors and outdoors for arbitrary cell phone positions and
orientations relative to the user body. The system applies statistical analysis
techniques to the inertial sensors widely available on today's cell phones to
estimate both the phone and user orientation. Implementation of the system on
different Android devices with 170 experiments performed at different indoor
and outdoor testbeds shows that Humaine significantly outperforms the
state-of-the-art in diverse scenarios, achieving a median accuracy, averaged
over a wide variety of phone positions, that is better than prior approaches.
The accuracy is bounded by the error in the inertial sensor readings and can be
enhanced with more accurate sensors and sensor fusion. Comment: Accepted for
publication in the 11th International Conference on Mobile and Ubiquitous
Systems: Computing, Networking and Services (Mobiquitous 2014).
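Statistical processing of raw inertial readings is the core building block of such systems. As a minimal textbook illustration (not Humaine's actual algorithm, which the abstract does not detail), the phone's tilt relative to the Earth frame can be read off a static accelerometer sample, since the accelerometer then measures only gravity:

```python
import math

def phone_tilt(ax, ay, az):
    """Estimate phone pitch and roll from one static accelerometer sample.

    ax, ay, az: acceleration in the phone frame (m/s^2). Axis conventions
    vary between platforms; this sketch assumes z points out of the screen.
    Returns (pitch, roll) in radians.
    """
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll
```

A phone lying flat and face-up (gravity entirely on the z axis) yields zero pitch and roll; tilting it shifts gravity onto the x/y axes and the angles follow.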
Driver and Passenger Identification from Smartphone Data
The objective of this paper is twofold. First, it presents a brief overview of existing driver and passenger identification or recognition approaches that rely on smartphone data. This includes listing the typically available sensory measurements and highlighting a few key practical considerations for automotive settings. Second, a simple identification method that utilises the smartphone inertial measurements and, possibly, door signals is proposed. It is based on analysing the user's behaviour during entry, namely the direction of turning, and extracting relevant salient features, which are distinctive depending on the side of entry to the vehicle. This is followed by applying a suitable classifier and decision criterion. Experimental data are shown to demonstrate the usefulness and effectiveness of the introduced probabilistic, low-complexity identification technique. This work was supported by Jaguar Land Rover under the Centre for Advanced Photonics and Electronics (CAPE) agreement.
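The turn-direction cue lends itself to a very small decision rule. The sketch below is a hypothetical illustration of the idea, not the paper's probabilistic classifier: integrate the gyroscope yaw rate over the entry window and threshold the sign of the net rotation. Which sign maps to "driver" depends on the vehicle's drive-side convention, so the mapping here is an assumption:

```python
import numpy as np

def classify_entry_side(yaw_rate, dt, threshold=0.3):
    """Classify vehicle entry side from smartphone gyroscope yaw rate.

    yaw_rate : yaw angular velocities (rad/s) recorded during entry.
    dt       : sampling interval in seconds.
    threshold: minimum net rotation (rad) for a confident call.
    Returns "driver", "passenger", or "unknown".
    """
    net_rotation = np.sum(np.asarray(yaw_rate)) * dt  # integrated yaw angle
    if net_rotation > threshold:    # net counter-clockwise body turn
        return "driver"             # assumed mapping; depends on drive side
    if net_rotation < -threshold:
        return "passenger"
    return "unknown"
```

A real system would add the salient entry features and the decision criterion the paper describes; this only shows the integrate-and-threshold core.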
From data acquisition to data fusion: a comprehensive review and a roadmap for the identification of activities of daily living using mobile devices
This paper reviews the state of the art in sensor fusion techniques applied to the sensors embedded in mobile devices, as a means to help identify the mobile device user's daily activities. Sensor data fusion techniques are used to consolidate the data collected from several sensors, increasing the reliability of the algorithms for the identification of the different activities. However, mobile devices have several constraints, e.g., low memory, low battery life and low processing power, and some data fusion techniques are not suited to this scenario. The main purpose of this paper is to present an overview of the state of the art and to identify examples of sensor data fusion techniques that can be applied to the sensors available in mobile devices, aiming to identify activities of daily living (ADLs).
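One of the cheapest consolidation steps such reviews cover, and one that fits the memory and CPU budget of a mobile device, is inverse-variance weighting of redundant sensor estimates. A minimal sketch (generic illustration, not any specific technique from the paper):

```python
def fuse_estimates(estimates):
    """Inverse-variance weighted fusion of redundant sensor estimates.

    estimates: list of (value, variance) pairs, one per sensor.
    Returns the fused value and its (reduced) variance; each reading
    is weighted by 1/variance, so noisier sensors count for less.
    """
    wsum = sum(1.0 / var for _, var in estimates)
    value = sum(v / var for v, var in estimates) / wsum
    return value, 1.0 / wsum
```

Fusing two equally noisy readings of 0.0 and 2.0 gives 1.0 with half the variance of either sensor alone.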
Providing location everywhere
Anacleto R., Figueiredo L., Novais P., Almeida A., Providing Location Everywhere, in Progress in Artificial Intelligence, Antunes L., Sofia Pinto H. (eds), Lecture Notes in Artificial Intelligence 7026, Springer-Verlag, ISBN 978-3-540-24768-2, (Proceedings of the 15th Portuguese Conference on Artificial Intelligence - EPIA 2011, Lisboa, Portugal), pp 15-28, 2011. The ability to locate an individual is an essential part of many applications, especially mobile ones. Obtaining this location in an open environment is relatively simple through GPS (Global Positioning System), but indoors, or even in dense environments, this type of location system does not provide good accuracy. There are already systems that try to overcome these limitations, but most of them need a structured environment to work. Since Inertial Navigation Systems (INS) remove the need for a structured environment, we propose an INS based on Micro-Electro-Mechanical Systems (MEMS) that is capable of computing, in real time, the position of an individual anywhere.
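A MEMS-based INS for pedestrians typically reduces to dead reckoning: detect steps from the accelerometer, estimate each step's length, and advance the position along the current heading. A minimal sketch with hypothetical helper names (not the authors' system):

```python
import math

def pedestrian_dead_reckoning(steps, start=(0.0, 0.0)):
    """Minimal pedestrian dead-reckoning position update.

    steps: iterable of (step_length_m, heading_rad) pairs, where headings
           would come from gyroscope/magnetometer fusion and step lengths
           from accelerometer step detection.
    Returns the list of successive (x, y) positions in metres.
    """
    x, y = start
    track = [(x, y)]
    for length, heading in steps:
        x += length * math.cos(heading)  # east component of the step
        y += length * math.sin(heading)  # north component of the step
        track.append((x, y))
    return track
```

Because each step's error accumulates into the next position, drift grows without bound, which is exactly why MEMS-based systems need the correction mechanisms discussed elsewhere in this listing.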
Map matching by using inertial sensors: literature review
This literature review aims to clarify what is known about map matching by
using inertial sensors and what are the requirements for map matching, inertial
sensors, placement and possible complementary position technology. The target
is to develop a wearable location system that can position itself within a complex
construction environment automatically with the aid of an accurate building model.
The wearable location system should work on a tablet computer which is running
an augmented reality (AR) solution and is capable of tracking and visualizing 3D-CAD
models in the real environment. The wearable location system is needed to support the
system in initialization of the accurate camera pose calculation and automatically
finding the right location in the 3D-CAD model. One type of sensor that does seem
applicable to people tracking is the inertial measurement unit (IMU). The IMU sensors
used in aerospace applications, based on laser gyroscopes, are large but provide very
accurate position estimates with limited drift. Small and light units such as those
based on Micro-Electro-Mechanical Systems (MEMS) sensors are becoming very popular,
but they have a significant bias, suffer from large drifts, and therefore require a
calibration method such as map matching. The system requires very little fixed
infrastructure; its monetary cost is proportional to the number of users rather than
to the coverage area, as is the case for traditional absolute indoor location systems.
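Map matching as a drift-correction step can be as simple as snapping the inertial position estimate onto the nearest element of the building model. A minimal sketch, assuming the model has been reduced to 2D corridor segments (a simplification of the 3D-CAD case the review targets):

```python
def match_to_map(point, segments):
    """Snap a drifting position estimate to the nearest map segment.

    point:    estimated (x, y) position.
    segments: corridors as ((x1, y1), (x2, y2)) line segments.
    Returns the closest point on any segment (orthogonal projection,
    clamped to the segment's endpoints).
    """
    px, py = point
    best, best_d2 = point, float("inf")
    for (x1, y1), (x2, y2) in segments:
        dx, dy = x2 - x1, y2 - y1
        seg_len2 = dx * dx + dy * dy
        # projection parameter along the segment, clamped to [0, 1]
        t = 0.0 if seg_len2 == 0 else max(
            0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / seg_len2))
        qx, qy = x1 + t * dx, y1 + t * dy
        d2 = (px - qx) ** 2 + (py - qy) ** 2
        if d2 < best_d2:
            best, best_d2 = (qx, qy), d2
    return best
```

A practical system would also use wall topology to reject impossible transitions; this shows only the nearest-segment projection at the heart of the idea.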
Estimation of User's Orientation via Wearable UWB
A user's orientation in indoor environments is an important part of her context. Orientation can be useful to understand what the user is looking at, and thus to improve the interaction between her and the surrounding environment. In this paper, we present a method based on wearable UWB-enabled devices. The position of the devices in space is used to estimate the user's orientation. We experimentally evaluated the impact of some operational parameters, such as the distance between worn devices, and of some environmental conditions, such as the position of the user in the room. Results show that the accuracy of the method suits the needs of a wide range of practical purposes.
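To make the geometry concrete: if two UWB tags are worn at known body locations (say, one on each shoulder, which is an assumption, since the abstract does not restate the exact placement), the facing direction follows from the vector between their estimated positions:

```python
import math

def user_heading(left_tag, right_tag):
    """Estimate the user's facing direction from two UWB tag positions.

    left_tag, right_tag: estimated (x, y) tag positions in metres,
    assumed worn on the left and right shoulder respectively.
    The left-to-right shoulder axis, rotated 90 degrees counter-clockwise,
    points forward; returns that heading angle in radians.
    """
    lx, ly = left_tag
    rx, ry = right_tag
    ax, ay = rx - lx, ry - ly      # shoulder axis, left -> right
    fx, fy = -ay, ax               # forward = axis rotated +90 degrees
    return math.atan2(fy, fx)
```

Note that the accuracy of this heading degrades as the inter-tag distance shrinks relative to the UWB ranging error, which is presumably why the distance between worn devices is one of the evaluated parameters.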
Robust localization with wearable sensors
Measuring physical movements of humans and understanding human behaviour is useful in a variety of areas and disciplines. Human inertial tracking is a method that can be leveraged for monitoring complex actions that emerge from interactions between human actors and their environment. An accurate estimation of motion trajectories can support new approaches to pedestrian navigation, emergency rescue, athlete management, and medicine. However, tracking with wearable inertial sensors has several problems that need to be overcome, such as the low accuracy of consumer-grade inertial measurement units (IMUs), the error accumulation problem in long-term tracking, and the artefacts generated by movements that are less common. This thesis focusses on measuring human movements with wearable head-mounted sensors to accurately estimate the physical location of a person over time. The research consisted of (i) providing an overview of the current state of research for inertial tracking with wearable sensors, (ii) investigating the performance of new tracking algorithms that combine sensor fusion and data-driven machine learning, (iii) eliminating the effect of random head motion during tracking, (iv) creating robust long-term tracking systems with a Bayesian neural network and sequential Monte Carlo method, and (v) verifying that the system can be applied with changing modes of behaviour, defined as natural transitions from walking to running and vice versa. This research introduces a new system for inertial tracking with head-mounted sensors (which can be placed in, e.g. helmets, caps, or glasses). This technology can be used for long-term positional tracking to explore complex behaviours
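As a baseline for the sensor-fusion component, a classical complementary filter illustrates how gyroscope integration and an absolute reference (e.g. magnetometer heading) are combined; the thesis's Bayesian neural network and sequential Monte Carlo methods go well beyond this, so the sketch below is only the textbook starting point:

```python
def complementary_heading(gyro_z, mag_heading, dt, alpha=0.98, init=0.0):
    """Fuse gyroscope yaw rates with magnetometer headings.

    gyro_z:      yaw angular velocities (rad/s), one per sample.
    mag_heading: absolute heading observations (rad), same length.
    dt:          sampling interval in seconds.
    alpha:       trust in the gyroscope integral (close to 1).
    The gyroscope term tracks fast head motion; the small magnetometer
    term pulls the estimate back and bounds long-term drift.
    """
    h = init
    out = []
    for w, m in zip(gyro_z, mag_heading):
        h = alpha * (h + w * dt) + (1.0 - alpha) * m
        out.append(h)
    return out
```

With a stationary head and a constant magnetometer reading, the estimate converges geometrically to the reference, which is the drift-bounding behaviour a head-mounted tracker needs.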