7,813 research outputs found

    Multi-Sensor Context-Awareness in Mobile Devices and Smart Artefacts

    Get PDF
    The use of context in mobile devices is receiving increasing attention in mobile and ubiquitous computing research. In this article we consider how to augment mobile devices with awareness of their environment and situation as context. Most work to date has been based on the integration of generic context sensors, in particular for location and visual context. We propose a different approach based on the integration of multiple diverse sensors for awareness of situational context that cannot be inferred from location, targeted at mobile device platforms that typically do not permit processing of visual context. We have investigated multi-sensor context-awareness in a series of projects and report experience from the development of a number of device prototypes. These include an awareness module for augmentation of a mobile phone, the Mediacup exemplifying context-enabled everyday artefacts, and the Smart-Its platform for aware mobile devices. The prototypes have been explored in various applications to validate the multi-sensor approach to awareness and to develop new perspectives on how embedded context-awareness can be applied in mobile and ubiquitous computing.
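    A minimal sketch of the idea of fusing several simple, non-visual sensor cues into a situational context label, in the spirit of the multi-sensor approach described above. The sensor set, thresholds, and context labels are illustrative assumptions, not the authors' awareness module.

    ```python
    # Illustrative rule-based fusion of several simple sensor cues into a coarse
    # situational context. All thresholds and labels are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class SensorSnapshot:
        accel_variance: float   # movement energy from a 3-axis accelerometer
        light_lux: float        # ambient light level
        noise_db: float         # ambient sound level
        touch: bool             # contact sensor: device is being held

    def infer_context(s: SensorSnapshot) -> str:
        """Map one snapshot of diverse low-cost sensors to a situational label."""
        if s.touch and s.accel_variance > 0.5:
            return "in hand, user moving"
        if s.touch:
            return "in hand, user stationary"
        if s.light_lux < 5 and s.accel_variance < 0.05:
            return "in pocket or bag"
        if s.noise_db > 70:
            return "on table, noisy environment"
        return "on table, quiet environment"

    if __name__ == "__main__":
        print(infer_context(SensorSnapshot(0.8, 120.0, 55.0, True)))
    ```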

    Cooperative Relative Positioning of Mobile Users by Fusing IMU Inertial and UWB Ranging Information

    Full text link
    Relative positioning between multiple mobile users is essential for many applications, such as search and rescue in disaster areas or human social interaction. An inertial measurement unit (IMU) is promising for determining the change of position over short periods of time, but it is very sensitive to error accumulation over longer runs. By equipping the mobile users with a ranging unit, e.g. ultra-wideband (UWB), it is possible to achieve accurate relative positioning through trilateration-based approaches. Compared to vision- or laser-based sensors, UWB does not need to be within line of sight and provides accurate distance estimation. However, UWB does not provide any bearing information and its communication range is limited, so UWB alone cannot determine the user location without ambiguity. In this paper, we propose an approach that combines IMU inertial and UWB ranging measurements for relative positioning between multiple mobile users without any knowledge of the infrastructure. We incorporate the UWB and IMU measurements into a probabilistic framework, which allows a group of mobile users to be positioned cooperatively and to recover from positioning failures. We have conducted extensive experiments to demonstrate the benefits of incorporating IMU inertial and UWB ranging measurements. Comment: accepted by ICRA 201
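    A minimal sketch of this kind of IMU/UWB fusion, reduced to a single user and a single peer at a known position: a 2-D extended Kalman filter dead-reckons from an IMU-derived displacement and corrects with one UWB range measurement. This is not the paper's cooperative multi-user framework; all noise parameters and the fixed peer position are assumptions.

    ```python
    # Simplified single-pair sketch: IMU displacement as motion model,
    # UWB range to one peer as the (range-only) measurement.

    import numpy as np

    def ekf_step(x, P, imu_disp, uwb_range, peer_pos, Q, R):
        """One predict/update cycle on a 2-D position estimate x with covariance P."""
        # Predict: apply IMU displacement; inflate uncertainty by process noise Q.
        x_pred = x + imu_disp
        P_pred = P + Q

        # Update: measurement model h(x) = ||x - peer_pos|| (no bearing available).
        diff = x_pred - peer_pos
        r_pred = np.linalg.norm(diff)
        H = (diff / r_pred).reshape(1, 2)        # Jacobian of the range w.r.t. position
        S = H @ P_pred @ H.T + R                 # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
        x_new = x_pred + (K * (uwb_range - r_pred)).ravel()
        P_new = (np.eye(2) - K @ H) @ P_pred
        return x_new, P_new

    if __name__ == "__main__":
        x, P = np.array([0.0, 0.0]), np.eye(2)
        Q, R = np.eye(2) * 0.05, np.array([[0.1]])
        x, P = ekf_step(x, P, imu_disp=np.array([0.9, 0.1]),
                        uwb_range=4.1, peer_pos=np.array([5.0, 0.0]), Q=Q, R=R)
        print(x)
    ```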

    From data acquisition to data fusion : a comprehensive review and a roadmap for the identification of activities of daily living using mobile devices

    Get PDF
    This paper reviews the state of the art in sensor fusion techniques applied to the sensors embedded in mobile devices, as a means to help identify the mobile device user's daily activities. Sensor data fusion techniques are used to consolidate the data collected from several sensors, increasing the reliability of the algorithms for the identification of the different activities. However, mobile devices have several constraints, e.g., low memory, low battery life and low processing power, and some data fusion techniques are not suited to this scenario. The main purpose of this paper is to present an overview of the state of the art and to identify examples of sensor data fusion techniques that can be applied to the sensors available in mobile devices with the aim of identifying activities of daily living (ADLs).
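    A small sketch of feature-level fusion, one family of techniques such a review typically covers: each sensor window is reduced to simple statistics and the per-sensor feature vectors are concatenated for a lightweight on-device classifier (not shown). The window length and feature choices are assumptions, not taken from the paper.

    ```python
    # Feature-level fusion sketch: concatenate per-sensor statistical features.

    import numpy as np

    def window_features(samples: np.ndarray) -> np.ndarray:
        """Mean, standard deviation and signal magnitude per axis of one sensor window."""
        return np.concatenate([samples.mean(axis=0),
                               samples.std(axis=0),
                               np.abs(samples).sum(axis=0)])

    def fuse_features(accel_win: np.ndarray, gyro_win: np.ndarray) -> np.ndarray:
        """Feature-level fusion: one vector combining accelerometer and gyroscope cues."""
        return np.concatenate([window_features(accel_win), window_features(gyro_win)])

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        accel = rng.normal(size=(128, 3))   # 128-sample window, 3-axis accelerometer
        gyro = rng.normal(size=(128, 3))    # matching gyroscope window
        print(fuse_features(accel, gyro).shape)   # -> (18,)
    ```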

    InContexto: Multisensor Architecture to Obtain People Context from Smartphones

    Get PDF
    The way users interact with smartphones is changing as a result of improvements in their embedded sensors. Increasingly, these devices are being employed as tools to observe individuals' habits. Smartphones provide a rich set of embedded sensors, such as an accelerometer, digital compass, gyroscope, GPS, microphone, and camera. This paper describes a distributed architecture, called inContexto, to recognize user context information using mobile phones. Moreover, it aims to infer physical actions performed by users, such as walking, running, and standing still. Sensor data were collected by an application for Android OS running on an HTC Magic handset, and the system was tested achieving about 97% accuracy when classifying five different actions, including standing still, walking, and running. This work was supported in part by Projects CICYT TIN2011-28620-C02-01, CICYT TEC2011-28626-C02-02, CAM CONTEXTS (S2009/TIC-1485), and DPS2008-07029-C02-02.
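    A toy sketch of the kind of on-device inference described above: distinguishing still, walking and running from the variance of the accelerometer magnitude over a short window. The thresholds are made-up illustrations; the paper's own classifier and its reported accuracy are not reproduced here.

    ```python
    # Threshold-based activity classification from accelerometer magnitude variance.
    # Thresholds are illustrative assumptions only.

    import numpy as np

    def classify_activity(accel_window: np.ndarray) -> str:
        """accel_window: (N, 3) array of accelerometer samples in m/s^2."""
        magnitude = np.linalg.norm(accel_window, axis=1)
        var = magnitude.var()
        if var < 0.2:
            return "still"
        if var < 4.0:
            return "walking"
        return "running"

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        walking = np.zeros((100, 3))
        walking[:, 2] = 9.8 + rng.normal(scale=1.5, size=100)  # bouncing vertical axis
        print(classify_activity(walking))   # -> "walking"
    ```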

    Multisensor-based human detection and tracking for mobile service robots

    Get PDF
    One of the fundamental issues for service robots is human-robot interaction. In order to perform such a task and provide the desired services, these robots need to detect and track people in their surroundings. In the present paper, we propose a solution for human tracking with a mobile robot that implements multisensor data fusion techniques. The system utilizes a new algorithm for laser-based leg detection using the on-board laser range finder (LRF). The approach is based on the recognition of typical leg patterns extracted from laser scans, which are shown to be very discriminative even in cluttered environments. These patterns can be used to localize both static and walking persons, even when the robot moves. Furthermore, faces are detected using the robot's camera, and this information is fused with the leg positions using a sequential implementation of the Unscented Kalman Filter. The proposed solution is feasible for service robots with a similar device configuration and has been successfully implemented on two different mobile platforms. Several experiments illustrate the effectiveness of our approach, showing that robust human tracking can be performed within complex indoor environments.
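    A simplified sketch of the sequential fusion idea: two position measurements of the same person (legs from the laser, face from the camera, both projected to the ground plane) are applied one after the other as Kalman updates on the predicted state. The real system uses an Unscented Kalman Filter and a richer motion model; this linear version only illustrates the data flow, and all covariances are assumptions.

    ```python
    # Sequential fusion of two direct position observations of one tracked person.

    import numpy as np

    def kalman_update(x, P, z, R):
        """Linear update with an identity observation of the 2-D position."""
        S = P + R
        K = P @ np.linalg.inv(S)
        x_new = x + K @ (z - x)
        P_new = (np.eye(2) - K) @ P
        return x_new, P_new

    if __name__ == "__main__":
        x, P = np.array([2.0, 1.0]), np.eye(2) * 0.5             # predicted position
        leg_z, leg_R = np.array([2.2, 1.1]), np.eye(2) * 0.04    # laser leg detection
        face_z, face_R = np.array([2.4, 0.9]), np.eye(2) * 0.25  # camera face detection
        x, P = kalman_update(x, P, leg_z, leg_R)    # sequential update #1
        x, P = kalman_update(x, P, face_z, face_R)  # sequential update #2
        print(x)
    ```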

    Miniature mobile sensor platforms for condition monitoring of structures

    Get PDF
    In this paper, a wireless, multisensor inspection system for nondestructive evaluation (NDE) of materials is described. The sensor configuration enables two inspection modes: magnetic (flux leakage and eddy current) and noncontact ultrasound. Each is designed to function in a complementary manner, maximizing the potential for detection of both surface and internal defects. Particular emphasis is placed on the generic architecture of a novel, intelligent sensor platform and its positioning on the structure under test. The sensor units are capable of wireless communication with a remote host computer, which controls manipulation and data interpretation. Results are presented in the form of automatic scans with different NDE sensors in a series of experiments on thin plate structures. To highlight the advantage of utilizing multiple inspection modalities, data fusion approaches are employed to combine data collected by complementary sensor systems. Fusion of the data is shown to demonstrate the potential for improved inspection reliability.
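    An illustrative sketch only: pixel-wise fusion of two co-registered, normalised defect-likelihood maps, one from a magnetic scan (sensitive to surface-breaking defects) and one from a noncontact ultrasonic scan (sensitive to internal defects). The simple weighted average used here stands in for the paper's fusion approaches, which are not reproduced.

    ```python
    # Pixel-wise weighted fusion of two normalised defect-likelihood maps.

    import numpy as np

    def fuse_defect_maps(magnetic: np.ndarray, ultrasonic: np.ndarray,
                         w_mag: float = 0.5) -> np.ndarray:
        """Combine two co-registered scans of the same plate into one defect map."""
        fused = w_mag * magnetic + (1.0 - w_mag) * ultrasonic
        return np.clip(fused, 0.0, 1.0)

    if __name__ == "__main__":
        rng = np.random.default_rng(2)
        mag = rng.random((64, 64)) * 0.2       # background noise in the magnetic scan
        ultra = rng.random((64, 64)) * 0.2
        ultra[30:34, 30:34] = 0.9              # internal defect visible only to ultrasound
        fused = fuse_defect_maps(mag, ultra)
        print((fused > 0.4).sum(), "pixels flagged as potential defects")
    ```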

    Evaluating indoor positioning systems in a shopping mall : the lessons learned from the IPIN 2018 competition

    Get PDF
    The Indoor Positioning and Indoor Navigation (IPIN) conference holds an annual competition in which indoor localization systems from different research groups worldwide are evaluated empirically. The objective of this competition is to establish a systematic evaluation methodology with rigorous metrics, both for real-time (on-site) and post-processing (off-site) situations, in a realistic environment unfamiliar to the prototype developers. For the IPIN 2018 conference, this competition was held on September 22nd, 2018, in Atlantis, a large shopping mall in Nantes (France). Four competition tracks (two on-site and two off-site) were designed. They consisted of several 1 km routes traversing several floors of the mall. Along these paths, 180 points were topographically surveyed with 10 cm accuracy to serve as ground-truth landmarks, combining theodolite measurements, differential global navigation satellite system (GNSS) and 3D scanner systems. A total of 34 teams competed. The accuracy score corresponds to the third quartile (75th percentile) of an error metric that combines the horizontal positioning error and the floor detection. The best results for the on-site tracks showed an accuracy score of 11.70 m (Track 1) and 5.50 m (Track 2), while the best results for the off-site tracks showed an accuracy score of 0.90 m (Track 3) and 1.30 m (Track 4). These results show that it is possible to obtain highly accurate indoor positioning solutions in large, realistic environments using wearable lightweight sensors without deploying any beacons. This paper describes the organization of the tracks, analyzes the methodology used to quantify the results, reviews the lessons learned from the competition, and discusses its future.
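    A sketch of the kind of scoring described above: the accuracy score is the third quartile (75th percentile) of a combined error that adds a fixed penalty for each wrongly detected floor to the horizontal positioning error. The 15 m per-floor penalty used here is an assumption for illustration, not necessarily the competition's exact value.

    ```python
    # 75th-percentile accuracy score combining horizontal error and floor mismatch.

    import numpy as np

    def accuracy_score(horizontal_err_m: np.ndarray,
                       est_floor: np.ndarray,
                       true_floor: np.ndarray,
                       floor_penalty_m: float = 15.0) -> float:
        """Combined error per ground-truth landmark, summarised by its third quartile."""
        combined = horizontal_err_m + floor_penalty_m * np.abs(est_floor - true_floor)
        return float(np.percentile(combined, 75))

    if __name__ == "__main__":
        h_err = np.array([1.2, 0.8, 3.5, 2.0, 0.5, 6.0])
        est_f = np.array([0, 0, 1, 1, 2, 1])
        true_f = np.array([0, 0, 1, 1, 2, 2])   # one floor missed on the last landmark
        print(accuracy_score(h_err, est_f, true_f))
    ```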

    inContexto: A Fusion Architecture to Obtain Mobile Context

    Get PDF
    Proceedings of: 14th International Conference on Information Fusion (FUSION 2011), Chicago, Illinois, USA, 5-8 July 2011. The embedded sensors provided in mobile devices are set to revolutionize the way everyday tasks are carried out. Mobile devices offer a set of embedded sensors, such as an accelerometer, digital compass, gyroscope, GPS, microphone, and camera. Another point to consider is that mobile devices are easily programmable, since the OS vendors provide APIs for accessing these sensors. This paper describes a distributed architecture, called inContexto, to recognize physical actions performed by users, such as walking, running, standing, and sitting, and also to retrieve context information from the user. Sensor data are collected by an application for Android OS running on an HTC Magic handset. This work was supported in part by Projects CICYT TIN2008-06742-C02-02/TSI, CICYT TEC2008-06732-C02-02/TEC, CAM CONTEXTS (S2009/TIC-1485) and DPS2008-07029-C02-02.
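    A hypothetical sketch of the distributed split such an architecture implies: the phone extracts lightweight features locally and ships them to a context service that performs the classification. The message fields and the classification rule are illustrative inventions, not the inContexto protocol.

    ```python
    # Phone-side packaging of locally extracted features and a toy server-side rule.

    import json

    def build_context_message(user_id: str, accel_var: float, gps_speed_mps: float) -> str:
        """Phone side: package locally extracted features for the context service."""
        return json.dumps({"user": user_id,
                           "accel_variance": accel_var,
                           "speed_mps": gps_speed_mps})

    def classify(message: str) -> str:
        """Server side: map received features to a coarse physical action."""
        m = json.loads(message)
        if m["accel_variance"] < 0.2:
            return "still"
        return "running" if m["speed_mps"] > 2.5 else "walking"

    if __name__ == "__main__":
        msg = build_context_message("user42", accel_var=1.1, gps_speed_mps=1.4)
        print(classify(msg))   # -> "walking"
    ```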