
    Leveraging Activity Recognition to Enable Protective Behavior Detection in Continuous Data

    Protective behavior exhibited by people with chronic pain (CP) during physical activities is key to understanding their physical and emotional states. Existing automatic protective behavior detection (PBD) methods rely on pre-segmentation of activities predefined by users. However, in real life, people perform activities casually. Therefore, when those activities present difficulties for people with chronic pain, technology-enabled support should be delivered continuously and automatically adapted to the activity type and the occurrence of protective behavior. Hence, to facilitate ubiquitous CP management, it becomes critical to enable accurate PBD over continuous data. In this paper, we propose to integrate human activity recognition (HAR) with PBD via a novel hierarchical HAR-PBD architecture comprising graph-convolution and long short-term memory (GC-LSTM) networks, and to alleviate class imbalance using a class-balanced focal categorical cross-entropy (CFCC) loss. Through an in-depth evaluation of the approach on a dataset of CP patients, we show that leveraging HAR, GC-LSTM networks, and the CFCC loss leads to a clear increase in PBD performance over the baseline (macro F1 score of 0.81 vs. 0.66 and precision-recall area under the curve (PR-AUC) of 0.60 vs. 0.44). We conclude by discussing possible use cases of the hierarchical architecture in CP management and beyond, as well as current limitations and ways forward. Comment: Submitted to PACM IMWUT
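The abstract names the class-balanced focal categorical cross-entropy (CFCC) loss only at a high level. A minimal sketch of such a loss, assuming the common "effective number of samples" class re-weighting combined with focal down-weighting of easy examples (the function name and the β and γ defaults are illustrative, not the paper's):

```python
import numpy as np

def cb_focal_ce(probs, labels, samples_per_class, beta=0.999, gamma=2.0):
    """Class-balanced focal categorical cross-entropy (illustrative sketch).

    probs:             (N, C) predicted class probabilities
    labels:            (N,)   integer class labels
    samples_per_class: (C,)   training-set counts used for re-weighting
    """
    # Class-balanced weight per class: (1 - beta) / (1 - beta^n_c),
    # normalized so the weights sum to the number of classes C.
    eff_num = 1.0 - np.power(beta, samples_per_class)
    weights = (1.0 - beta) / eff_num
    weights = weights / weights.sum() * len(samples_per_class)

    p_t = probs[np.arange(len(labels)), labels]   # probability of the true class
    focal = (1.0 - p_t) ** gamma                  # down-weight easy examples
    ce = -np.log(np.clip(p_t, 1e-12, None))       # categorical cross-entropy
    return float(np.mean(weights[labels] * focal * ce))
```

The re-weighting term boosts the contribution of rare classes (here, frames containing protective behavior), while the focal term concentrates the gradient on hard, misclassified examples.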

    Automatic recognition of multiple affective states in virtual rehabilitation by exploiting the dependency relationships

    The automatic recognition of multiple affective states can be enhanced if the underpinning computational models explicitly consider the interactions between the states. This work proposes a computational model that incorporates the dependencies between four states (tiredness, anxiety, pain, and engagement) known to appear in virtual rehabilitation sessions of post-stroke patients, to improve the automatic recognition of the patients' states. A dataset of five stroke patients, which includes their fingers' pressure (PRE), hand movements (MOV), and facial expressions (FAE) during ten sessions of virtual rehabilitation, was used. Our computational proposal uses the Semi-Naive Bayesian classifier (SNBC) as the base classifier in a multiresolution approach to create a multimodal model of the three sensors (PRE, MOV, and FAE) with late fusion using an SNBC (FSNB classifier). There is one FSNB classifier per state, and the classifiers are linked in a circular classifier chain (CCC) to exploit the dependency relationships between the states. The CCC achieves ROC AUC scores above 0.90 for all four states. Relationships of mutual exclusion between engagement and all the other states, and some co-occurrences between pain and anxiety, were detected for the five patients. Virtual rehabilitation platforms that incorporate the automatic recognition of multiple patient states could leverage intelligent and empathic interactions to promote adherence to rehabilitation exercises.
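The circular classifier chain can be sketched as follows. This is a simplified illustration, not the paper's implementation: a plain Gaussian naive Bayes stands in for the Semi-Naive Bayesian classifier (SNBC), prediction cycles through the chain a fixed number of passes rather than to convergence, and all class and method names are assumptions:

```python
import numpy as np

class TinyGaussianNB:
    """Minimal Gaussian naive Bayes, standing in for the paper's
    Semi-Naive Bayesian classifier (SNBC)."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.mu_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        self.var_ = np.array([X[y == c].var(axis=0) + 1e-6 for c in self.classes_])
        self.prior_ = np.array([(y == c).mean() for c in self.classes_])
        return self

    def predict(self, X):
        # Per-class log-likelihood: sum over features of log N(x_j; mu_j, var_j),
        # plus the log prior; predict the argmax class.
        ll = -0.5 * (((X[:, None, :] - self.mu_) ** 2) / self.var_
                     + np.log(2 * np.pi * self.var_)).sum(axis=2)
        return self.classes_[np.argmax(ll + np.log(self.prior_), axis=1)]

class CircularClassifierChain:
    """One binary classifier per affective state. Each classifier sees the
    raw features plus the labels of the other states; at prediction time
    the chain is cycled several passes so every state's estimate can
    condition on the others'."""

    def __init__(self, n_states, n_passes=3):
        self.n_states = n_states
        self.n_passes = n_passes
        self.models = [TinyGaussianNB() for _ in range(n_states)]

    def _augment(self, X, Y, k):
        # Features plus the labels of every state except state k.
        return np.hstack([X, np.delete(Y, k, axis=1)])

    def fit(self, X, Y):
        # Train each state's model with the true labels of the other states.
        for k in range(self.n_states):
            self.models[k].fit(self._augment(X, Y, k), Y[:, k])
        return self

    def predict(self, X):
        # Start from all-zero state estimates and cycle through the chain.
        Y = np.zeros((X.shape[0], self.n_states))
        for _ in range(self.n_passes):
            for k in range(self.n_states):
                Y[:, k] = self.models[k].predict(self._augment(X, Y, k))
        return Y
```

The circular (rather than linear) chain is what lets every state benefit from every other state's prediction, which is how dependencies such as the reported mutual exclusion between engagement and the other states can be exploited.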