5 research outputs found

    RoutineSense: A Mobile Sensing Framework for the Reconstruction of User Routines

    Get PDF
    Modern smartphones are powerful platforms that have become part of everyday life for most people. Thanks to their sensing and computing capabilities, smartphones can unobtrusively identify simple user states (e.g., location, performed activity, etc.), enabling a plethora of applications that provide insights into the lifestyle of their users. In this paper, we introduce RoutineSense: a system for the automatic reconstruction of complex daily routines from simple user states, implemented as an incremental processing framework. This framework combines opportunistic sensing and user feedback to discover frequent and exceptional routines that can be used to segment and aggregate multiple user activities in a timeline. We use a comprehensive dataset containing rich geographic information to assess the feasibility and performance of RoutineSense, showing a near threefold improvement over the current state of the art.
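    The abstract describes routines as recurrent patterns built on top of simple sensed states. A minimal sketch of that idea follows, assuming hypothetical state fields and thresholds; it is not the paper's actual algorithm or API, only an illustration of mining frequent versus exceptional patterns from a stream of simple states.

    ```python
    from collections import Counter
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class SimpleState:
        """A simple user state as in the abstract (fields are hypothetical)."""
        hour: int        # hour of day in which the state was observed
        place: str       # e.g. "home", "office"
        activity: str    # e.g. "walking", "still"

    def mine_routines(states, min_support=5):
        """Group simple states into (hour, place, activity) patterns and keep
        the frequent ones as candidate routines; rare patterns are flagged as
        exceptional. Grouping keys and the support threshold are illustrative."""
        counts = Counter((s.hour, s.place, s.activity) for s in states)
        frequent = {p: c for p, c in counts.items() if c >= min_support}
        exceptional = {p: c for p, c in counts.items() if c < min_support}
        return frequent, exceptional

    # Usage: feed a week of sensed states and inspect the recurring patterns.
    week = [SimpleState(9, "office", "still")] * 5 + [SimpleState(9, "park", "walking")]
    routines, exceptions = mine_routines(week, min_support=3)
    ```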

    Supporting Location-Aware Services for Mobile Users with the Whereabouts Diary

    No full text
    Modern handheld devices with localization capabilities could be used to automatically create a diary of a user's whereabouts and to use it to complement the user profile in many applications. In this paper we present the Whereabouts Diary, an application/service that logs the places visited by the user and automatically labels them with descriptive semantic information. In particular, Web-retrieved data and the temporal patterns with which different places are visited can be used to automatically derive such meaningful semantic labels. We describe the general idea behind the Whereabouts Diary and discuss our implementation and associated experimental results. In addition, we illustrate several applications that can fruitfully exploit the Whereabouts Diary as a supporting service, and discuss areas for future work.
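    The labeling step described here relies on the temporal pattern of visits to a place. The sketch below shows one way such a heuristic could look, assuming hypothetical rules (night stays suggest "home", weekday office hours suggest "workplace"); it is not the paper's classifier, and the Web-data refinement mentioned in the abstract is only noted in a comment.

    ```python
    from datetime import datetime

    def label_place(visits):
        """Assign a coarse semantic label to a place from the temporal pattern
        of its visits, given as (arrival, departure) datetime pairs. The rules
        below are illustrative heuristics only."""
        night_hours = 0.0
        work_hours = 0.0
        for arrival, departure in visits:
            duration = (departure - arrival).total_seconds() / 3600.0
            if arrival.hour >= 22 or arrival.hour < 6:
                night_hours += duration
            elif arrival.weekday() < 5 and 9 <= arrival.hour < 18:
                work_hours += duration
        if night_hours > work_hours:
            return "home"
        if work_hours > 0:
            return "workplace"
        return "other"  # could be refined with Web-retrieved data (e.g. nearby POIs)

    # Usage: label a place from a logged overnight stay.
    stays = [(datetime(2024, 1, 8, 23, 0), datetime(2024, 1, 9, 7, 0))]
    print(label_place(stays))  # -> "home"
    ```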

    Complex event recognition through wearable sensors

    Get PDF
    Complex events are instrumental in understanding advanced behaviours and properties of a system. They can represent more meaningful events as compared to simple events. In this thesis we propose to use wearable sensor signals to detect complex events. These signals pertain to the user's state and therefore allow us to understand advanced characteristics about her. We propose a hierarchical approach that detects simple events from the wearable sensor data and then builds complex events on top of them. In order to address the privacy concerns that arise from the use of sensitive signals, we propose to perform all the computation on device. While this ensures the privacy of the data, it poses the problem of having limited computational resources. This problem is tackled by introducing energy-efficient approaches based on incremental algorithms. A second challenge is the multiple levels of noise in the process. A first level of noise concerns the raw signals, which are inherently imprecise (e.g. inaccuracy in GPS readings). A second level of noise, which we call semantic noise, is present among the detected simple events. Some of these simple events can disturb the detection of complex events, effectively acting as noise. We apply the hierarchical approach in two different contexts defining the two parts of our thesis. In the first part, we present a mobile system that builds a representation of the user's life. This system is based on the episodic memory model, which is responsible for the storage and recollection of past experiences. Following the hierarchical approach, the system processes raw signals to detect simple events such as places where the user stayed a certain amount of time to perform an activity, thereby building sequences of detected activities. These activities are in turn processed to detect complex events that we call routines and that represent recurrent patterns in the life of the user. In the second part of this thesis, we focus on the detection of glycemic events for type-1 diabetes patients in a non-invasive manner. Diabetic patients are not able to properly regulate their glucose, leading to periods of high and low blood sugar. We leverage signals (electrocardiogram (ECG), accelerometer, breathing rate) from a sport belt to infer such glycemic events. We propose a physiological model based on the variations of the ECG when the patient has low blood sugar, and an energy-based model that computes the current glucose level of the user based on her glucose intake, insulin intake, and glucose consumption via physical activity. For both contexts, we evaluate our systems in terms of accuracy, by assessing whether the detected routines are meaningful and whether the glycemic events are correctly detected, and in terms of mobile performance, which confirms the fitness of our approaches for mobile computation.
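    The hierarchical, incremental idea in this abstract (raw samples reduced to simple events, recurring sequences of simple events promoted to complex events, all computed on device) can be sketched as below. Class names, event labels, and thresholds are hypothetical and stand in for the thesis's actual models; the sketch only illustrates the layering and the incremental per-sample update.

    ```python
    from collections import deque

    class HierarchicalDetector:
        """Sketch of a two-layer detector: raw samples -> simple events ->
        complex events (recurrent patterns). All names and thresholds are
        illustrative, not the thesis's implementation."""

        def __init__(self, window=5, min_repeats=3):
            self.recent = deque(maxlen=window)   # sliding window of simple events
            self.pattern_counts = {}
            self.min_repeats = min_repeats

        def simple_event(self, sample):
            """First layer: map one raw (noisy) sample to a simple event label."""
            if sample["speed"] > 1.0:
                return "moving"
            return f"at:{sample['place']}"

        def update(self, sample):
            """Incremental step: process a single sample on device and return a
            complex event (a pattern seen often enough) if one is detected."""
            self.recent.append(self.simple_event(sample))
            pattern = tuple(self.recent)
            if len(pattern) == self.recent.maxlen:
                n = self.pattern_counts.get(pattern, 0) + 1
                self.pattern_counts[pattern] = n
                if n >= self.min_repeats:
                    return ("routine", pattern)
            return None
    ```

    Keeping only per-pattern counters and a bounded window is one way to stay within the limited on-device resources the abstract mentions, since each update touches a constant amount of state.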