
    Context-awareness for mobile sensing: a survey and future directions

    The evolution of smartphones, together with their increasing computational power, has empowered developers to create innovative context-aware applications that recognize user-related social and cognitive activities in any situation and at any location. Awareness of context gives a mobile device the capability of being conscious of the physical environment or situation around its user, allowing network services to respond proactively and intelligently. The key idea behind context-aware applications is to encourage users to collect, analyze and share local sensory knowledge for large-scale community use by creating a smart network. The desired network is capable of making autonomous logical decisions to actuate environmental objects and to assist individuals. However, many open challenges remain, most of which arise because the middleware services provided on mobile devices have limited resources in terms of power, memory and bandwidth. It is therefore critically important to study how these drawbacks can be analysed and resolved, and at the same time to better understand the opportunities for the research community to contribute to context-awareness. To this end, this paper surveys the literature over the period 1991-2014, from the emerging concepts to applications of context-awareness on mobile platforms, providing up-to-date research and future research directions. Moreover, it points out the challenges faced in this regard and addresses them by proposing possible solutions.

    A Public Domain Dataset for Real-Life Human Activity Recognition Using Smartphone Sensors

    [Abstract] In recent years, human activity recognition has become a hot topic within the scientific community, owing to its direct application in multiple domains such as healthcare and fitness. Additionally, the current worldwide use of smartphones makes it particularly easy to gather this kind of data from people in a non-intrusive and inexpensive way, without the need for other wearables. In this paper, we introduce our orientation-independent, placement-independent and subject-independent human activity recognition dataset. The dataset contains measurements from the accelerometer, gyroscope, magnetometer and GPS of the smartphone. Each measurement is associated with one of four registered activities: inactive, active, walking and driving. This work also proposes a support vector machine (SVM) model to perform some preliminary experiments on the dataset. Considering that this dataset was collected from smartphones in their actual use, unlike other datasets, developing a good model on such data is an open problem and a challenge for researchers; doing so would close the gap between the model and a real-life application. This research was partially funded by Xunta de Galicia/FEDER-UE (ConectaPeme, GEMA: IN852A 2018/14), MINECO-AEI/FEDER-UE (Flatcity: TIN2016-77158-C4-3-R) and Xunta de Galicia/FEDER-UE (AXUDAS PARA A CONSOLIDACION E ESTRUTURACION DE UNIDADES DE INVESTIGACION COMPETITIVAS. GRC: ED431C 2017/58 and ED431C 2018/49). Xunta de Galicia; IN852A 2018/14. Xunta de Galicia; ED431C 2017/58. Xunta de Galicia; ED431C 2018/4
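    Before a classifier such as the SVM mentioned above can be applied, raw sensor streams are typically reduced to per-window feature vectors. As a minimal sketch (the window size and the choice of features are assumptions for illustration, not details taken from the paper), one common approach computes the mean and standard deviation of the acceleration magnitude over fixed-size windows, which also makes the features independent of device orientation:

    ```python
    import math

    def magnitude(sample):
        """Euclidean norm of one (x, y, z) accelerometer sample.

        Using the magnitude rather than the raw axes makes the feature
        independent of how the phone is oriented in a pocket or hand.
        """
        x, y, z = sample
        return math.sqrt(x * x + y * y + z * z)

    def window_features(samples, window_size):
        """Split a stream of (x, y, z) samples into non-overlapping windows
        and return (mean, std) of the magnitude for each window.

        These two statistics are a common minimal feature set to feed a
        classifier such as an SVM; the window size here is illustrative.
        """
        features = []
        for start in range(0, len(samples) - window_size + 1, window_size):
            mags = [magnitude(s) for s in samples[start:start + window_size]]
            mean = sum(mags) / len(mags)
            var = sum((m - mean) ** 2 for m in mags) / len(mags)
            features.append((mean, math.sqrt(var)))
        return features

    # Example: a stationary phone reports roughly gravity (~9.81 m/s^2)
    # on one axis, so both windows show that mean and near-zero std.
    still = [(0.0, 0.0, 9.81)] * 8
    print(window_features(still, 4))
    ```

    In this sketch an "inactive" window would show low standard deviation, while "walking" would show higher variance; the resulting vectors are what a model would be trained on.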

    Transportation mode recognition fusing wearable motion, sound and vision sensors

    We present the first work that investigates the potential of improving the performance of transportation mode recognition through fusing multimodal data from wearable sensors: motion, sound and vision. We first train three independent deep neural network (DNN) classifiers, which work with the three types of sensors, respectively. We then propose two schemes that fuse the classification results from the three mono-modal classifiers. The first scheme makes an ensemble decision with fixed rules including Sum, Product, Majority Voting, and Borda Count. The second scheme is an adaptive fuser built as another classifier (including Naive Bayes, Decision Tree, Random Forest and Neural Network) that learns enhanced predictions by combining the outputs from the three mono-modal classifiers. We verify the advantage of the proposed method on the state-of-the-art Sussex-Huawei Locomotion and Transportation (SHL) dataset, recognizing eight transportation activities: Still, Walk, Run, Bike, Bus, Car, Train and Subway. We achieve F1 scores of 79.4%, 82.1% and 72.8% with the mono-modal motion, sound and vision classifiers, respectively. The F1 score is remarkably improved to 94.5% and 95.5% by the two data fusion schemes, respectively. The recognition performance can be further improved with a post-processing scheme that exploits the temporal continuity of transportation. When assessing generalization of the model to unseen data, we show that while performance is reduced, as expected, for each individual classifier, the benefits of fusion are retained, with performance improved by 15 percentage points. Beyond the actual performance increase, this work, most importantly, opens up the possibility of dynamically fusing modalities to achieve distinct power-performance trade-offs at run time.
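    The fixed-rule fusion schemes named above (Sum, Product, Majority Voting, Borda Count) can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation, and it assumes each mono-modal classifier emits a class-probability vector:

    ```python
    from collections import Counter

    def sum_rule(prob_vectors):
        """Sum class probabilities across classifiers; pick the argmax."""
        n = len(prob_vectors[0])
        totals = [sum(p[i] for p in prob_vectors) for i in range(n)]
        return max(range(n), key=totals.__getitem__)

    def product_rule(prob_vectors):
        """Multiply class probabilities; favours classes all agree on."""
        n = len(prob_vectors[0])
        prods = []
        for i in range(n):
            prod = 1.0
            for p in prob_vectors:
                prod *= p[i]
            prods.append(prod)
        return max(range(n), key=prods.__getitem__)

    def majority_vote(prob_vectors):
        """Each classifier votes for its top class; ties are broken by
        the summed probability of the tied classes."""
        votes = Counter(max(range(len(p)), key=p.__getitem__)
                        for p in prob_vectors)
        best = votes.most_common(1)[0][1]
        tied = [c for c, count in votes.items() if count == best]
        return max(tied, key=lambda c: sum(p[c] for p in prob_vectors))

    def borda_count(prob_vectors):
        """Each classifier ranks all classes; a class earns
        (n_classes - 1 - rank) points per classifier."""
        n = len(prob_vectors[0])
        scores = [0] * n
        for p in prob_vectors:
            ranking = sorted(range(n), key=p.__getitem__, reverse=True)
            for rank, cls in enumerate(ranking):
                scores[cls] += n - 1 - rank
        return max(range(n), key=scores.__getitem__)

    # Three hypothetical mono-modal classifiers over three classes:
    probs = [
        [0.7, 0.2, 0.1],  # motion
        [0.2, 0.5, 0.3],  # sound
        [0.5, 0.4, 0.1],  # vision
    ]
    print(sum_rule(probs))  # -> 0: motion and vision outweigh sound
    ```

    The adaptive second scheme would instead treat the concatenated probability vectors as features for another classifier, which is trained on held-out data rather than applying a fixed rule.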

    Wearables for independent living in older adults: Gait and falls

    Solutions are needed to satisfy the care demands of older adults who wish to live independently. Wearable technology (wearables) is one approach that offers a viable means for ubiquitous, sustainable and scalable monitoring of the health of older adults in habitual free-living environments. Gait has been presented as a relevant (bio)marker in ageing and pathological studies, with objective assessment achievable by inertial-based wearables. Commercial wearables have struggled to provide accurate analytics and have been limited by non-clinically oriented gait outcomes. Moreover, some research-grade wearables also fail to provide transparent functionality due to limitations of proprietary software. Innovation within this field is often sporadic, with large heterogeneity of wearable types and algorithms for gait outcomes leading to a lack of pragmatic use. This review provides a summary of the recent literature on gait assessment through the use of wearables, focusing on the need for an algorithm-fusion approach to measurement, culminating in the ability to better detect and classify falls. A brief presentation of wearables in one pathological group is included, identifying appropriate work for researchers in other cohorts to utilise. Suggestions for how this domain needs to progress are also summarised.

    Introducing a Human Activity Recognition Dataset Gathered on Real-Life Conditions

    Cursos e Congresos, C-155. [Abstract] Human activity recognition (HAR) has garnered significant scientific interest in recent years. The widespread use of smartphones enables convenient and cost-effective data collection, eliminating the need for additional wearables. Given that, this paper introduces a novel HAR dataset in which participants had freedom in choosing smartphone orientation and placement during activities, ensuring data variability. It also includes contributions from diverse individuals, reflecting unique smartphone usage habits. Moreover, it comprises measurements from the accelerometer, gyroscope, magnetometer and GPS, each corresponding to one of four activities: inactive, active, walking or driving. Unlike other datasets, the data in this study were collected from smartphones used in real-life scenarios. This work was funded by CITIC, which is funded by the Xunta de Galicia through the collaboration agreement between the Consellería de Cultura, Educación, Formación Profesional e Universidades and the Galician universities for the reinforcement of the research centres of the Galician University System (CIGUS); by Xunta de Galicia/FEDER-UE (ConectaPeme, GEMA: IN852A 2018/14); by MINECO-AEI/FEDER-UE (Flatcity: TIN2016-77158-C4-3-R); and by Xunta de Galicia/FEDER-UE (AXUDAS PARA A CONSOLIDACION E ESTRUTURACION DE UNIDADES DE INVESTIGACION COMPETITIVAS. GRC: ED431C 2017/58 and ED431C 2018/49). Xunta de Galicia; ED431C 2017/58. Xunta de Galicia; ED431C 2018/4