Learning Bodily and Temporal Attention in Protective Movement Behavior Detection
For people with chronic pain, the assessment of protective behavior during
physical functioning is essential to understand their subjective pain-related
experiences (e.g., fear and anxiety toward pain and injury) and how they deal
with such experiences (avoidance or reliance on specific body joints), with the
ultimate goal of guiding intervention. Advances in deep learning (DL) can
enable the development of such intervention. Using the EmoPain MoCap dataset,
we investigate how attention-based DL architectures can be used to improve the
detection of protective behavior by capturing the most informative temporal and
body configurational cues characterizing specific movements and the strategies
used to perform them. We propose an end-to-end deep learning architecture named
BodyAttentionNet (BANet). BANet is designed to learn temporal and bodily parts
that are more informative for the detection of protective behavior. The approach
addresses the variety of ways people, including healthy people, execute a
movement, independently of the type of movement analyzed. Through extensive
comparison experiments with other state-of-the-art machine learning techniques
used with motion capture data, we show statistically significant improvements
achieved by using these attention mechanisms. In addition, the BANet
architecture requires far fewer parameters than the state of the art for
comparable if not higher performance.
Comment: 7 pages, 3 figures, 2 tables, code available, accepted at ACII 2019
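The abstract above describes temporal attention only at a high level; as a minimal, hypothetical sketch of the idea (pure Python, with a simple dot-product scorer standing in for BANet's learned attention layers, and toy per-frame feature vectors), not the paper's actual architecture:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def temporal_attention(frames, score_w):
    """Collapse a sequence of per-frame feature vectors into one vector
    using attention weights (here from a hypothetical dot-product scorer)."""
    scores = [sum(w * f for w, f in zip(score_w, frame)) for frame in frames]
    weights = softmax(scores)
    dim = len(frames[0])
    pooled = [sum(weights[t] * frames[t][d] for t in range(len(frames)))
              for d in range(dim)]
    return pooled, weights

# Toy example: 3 frames, 2 features each; the last frame has the largest score,
# so it receives the largest attention weight.
frames = [[0.1, 0.2], [0.0, 0.1], [1.0, 0.9]]
pooled, weights = temporal_attention(frames, score_w=[1.0, 1.0])
assert abs(sum(weights) - 1.0) < 1e-9
assert weights[2] == max(weights)  # the most "informative" frame dominates
```

The same softmax-weighting pattern applied over body joints instead of time steps gives the bodily-attention counterpart.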
Protective Behavior Detection in Chronic Pain Rehabilitation: From Data Preprocessing to Learning Model
Chronic pain (CP) rehabilitation extends beyond physiotherapist-directed clinical sessions and primarily takes place in people's everyday lives. Unfortunately, self-directed rehabilitation is difficult because patients need to deal with both their pain and the mental barriers that pain imposes on routine functional activities. Physiotherapists adjust patients' exercise plans and advice in clinical sessions based on the amount of protective behavior (i.e., a sign of anxiety about movement) displayed by the patient. The goal of such modifications is to assist patients in overcoming their fears and maintaining physical functioning. However, physiotherapists' support is absent during the self-directed rehabilitation (also called self-management) that people conduct in their daily lives.
To be effective, technology for chronic-pain self-management should be able to detect protective behavior to facilitate personalized support. Accordingly, this thesis addresses the key challenges of ubiquitous automatic protective behavior detection (PBD). Our investigation takes advantage of an available dataset (EmoPain) containing movement and muscle activity data of healthy people and people with CP engaged in typical everyday activities. First, we examine data augmentation methods and segmentation parameters using various vanilla neural networks in order to enable activity-independent PBD within pre-segmented activity instances. Second, by incorporating temporal and bodily attention mechanisms, we improve PBD performance and support the theoretical/clinical understanding that the attention of a person with CP shifts between body parts perceived as risky during feared movements. Third, we use human activity recognition (HAR) to improve continuous PBD in data of various activity types. The approaches proposed above are validated against the ground truth established by majority voting from expert annotators. Unfortunately, using such majority-voted ground truth causes information loss, whereas direct learning from all annotators is vulnerable to noise from disagreements. As the final study, we improve the learning from multiple annotators by leveraging the agreement information for regularization.
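As an illustration of the ground-truth issue raised above, a minimal sketch in pure Python contrasting majority voting with a simple per-sample agreement score; the thesis's actual regularization scheme is not specified here, so the agreement weight is only an assumed example:

```python
from collections import Counter

def majority_vote(labels):
    """Ground truth as used in the earlier studies: the most common label
    among annotators (ties broken by first occurrence)."""
    return Counter(labels).most_common(1)[0][0]

def agreement(labels):
    """Fraction of annotators who agree with the majority label: a simple
    per-sample confidence that could serve as a loss-regularization weight."""
    counts = Counter(labels)
    return counts.most_common(1)[0][1] / len(labels)

ann = [1, 1, 0, 1]             # four expert annotators, one disagrees
assert majority_vote(ann) == 1
assert agreement(ann) == 0.75  # partial agreement -> sample down-weighted
```

Majority voting discards the 0.75 vs. 1.0 distinction entirely, which is the information loss the thesis refers to; keeping the agreement score lets a model trust unanimous samples more than contested ones.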
Chronic-Pain Protective Behavior Detection with Deep Learning
In chronic pain rehabilitation, physiotherapists adapt physical activity to
patients' performance based on their expression of protective behavior,
gradually exposing them to feared but harmless and essential everyday
activities. As rehabilitation moves outside the clinic, technology should
automatically detect such behavior to provide similar support. Previous works
have shown the feasibility of automatic protective behavior detection (PBD)
within a specific activity. In this paper, we investigate the use of deep
learning for PBD across activity types, using wearable motion capture and
surface electromyography data collected from healthy participants and people
with chronic pain. We approach the problem by continuously detecting protective
behavior within an activity rather than estimating its overall presence. The
best performance reaches a mean F1 score of 0.82 with leave-one-subject-out
cross validation. When protective behavior is modelled per activity type, the
mean F1 score is 0.77 for bend-down, 0.81 for one-leg-stand, 0.72 for
sit-to-stand, 0.83 for stand-to-sit, and 0.67 for reach-forward. This
performance reaches an excellent level of agreement with the average experts'
rating performance, suggesting potential for personalized chronic pain
management at home. We analyze various parameters characterizing our approach
to understand how the results could generalize to other PBD datasets and
different levels of ground truth granularity.
Comment: 24 pages, 12 figures, 7 tables. Accepted by ACM Transactions on
Computing for Healthcare
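The leave-one-subject-out protocol used above can be sketched as follows; this is a generic pure-Python illustration of the splitting scheme, not the paper's evaluation code:

```python
def loso_splits(samples):
    """Leave-one-subject-out splits: each subject is held out once, so
    performance reflects generalization to people unseen during training."""
    subjects = sorted({subj for subj, _ in samples})
    for held_out in subjects:
        train = [(s, x) for s, x in samples if s != held_out]
        test = [(s, x) for s, x in samples if s == held_out]
        yield held_out, train, test

# Toy dataset: (subject_id, feature) pairs for three subjects.
data = [("p1", 0.1), ("p1", 0.2), ("p2", 0.3), ("p3", 0.4)]
folds = list(loso_splits(data))
assert len(folds) == 3                                # one fold per subject
assert all(len(tr) + len(te) == 4 for _, tr, te in folds)
assert all(h not in {s for s, _ in tr} for h, tr, _ in folds)
```

The per-fold F1 scores are then averaged to obtain the mean F1 values reported in the abstract.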
Leveraging Activity Recognition to Enable Protective Behavior Detection in Continuous Data
Protective behavior exhibited by people with chronic pain (CP) during
physical activities is the key to understanding their physical and emotional
states. Existing automatic protective behavior detection (PBD) methods rely on
pre-segmentation of activities predefined by users. However, in real life,
people perform activities casually. Therefore, where those activities present
difficulties for people with chronic pain, technology-enabled support should be
delivered continuously and automatically adapted to activity type and
occurrence of protective behavior. Hence, to facilitate ubiquitous CP
management, it becomes critical to enable accurate PBD over continuous data. In
this paper, we propose to integrate human activity recognition (HAR) with PBD
via a novel hierarchical HAR-PBD architecture comprising graph-convolution and
long short-term memory (GC-LSTM) networks, and alleviate class imbalances using
a class-balanced focal categorical-cross-entropy (CFCC) loss. Through in-depth
evaluation of the approach using a CP patients' dataset, we show that the
use of HAR, GC-LSTM networks, and the CFCC loss leads to a clear increase in
PBD performance over the baseline (macro F1 score of 0.81 vs. 0.66 and
precision-recall area-under-the-curve (PR-AUC) of 0.60 vs. 0.44). We conclude
by discussing possible use cases of the hierarchical architecture in CP
management and beyond. We also discuss current limitations and ways forward.
Comment: Submitted to PACM IMWUT
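A minimal sketch of a class-balanced focal loss of the kind named above, in pure Python; the class weighting here follows the common "effective number of samples" formulation, which is an assumption and may differ in detail from the paper's CFCC loss:

```python
import math

def cb_focal_loss(probs, target, class_counts, beta=0.999, gamma=2.0):
    """Class-balanced focal cross-entropy for a single sample.

    probs:        predicted class probabilities (sum to 1)
    target:       index of the true class
    class_counts: number of training samples per class
    The class weight (1 - beta) / (1 - beta**n_c) grows as a class gets
    rarer; the focal term (1 - p_t)**gamma down-weights easy, already
    well-classified samples.
    """
    p_t = probs[target]
    weight = (1 - beta) / (1 - beta ** class_counts[target])
    return -weight * (1 - p_t) ** gamma * math.log(p_t)

# With identical confidence, a sample of the rare (protective) class incurs
# a larger loss than a sample of the common class, countering the imbalance.
rare = cb_focal_loss([0.3, 0.7], target=1, class_counts=[9000, 100])
common = cb_focal_loss([0.7, 0.3], target=0, class_counts=[9000, 100])
assert rare > common > 0
```

In the hierarchical architecture, this loss would be applied at the PBD stage, where protective frames are heavily outnumbered by non-protective ones.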
Wearable performance
This is the post-print version of the article. The official published version can be accessed from the link below - Copyright @ 2009 Taylor & Francis.
Wearable computing devices worn on the body provide the potential for digital interaction in the world. A new stage of computing technology at the beginning of the 21st Century links the personal and the pervasive through mobile wearables. The convergence between the miniaturisation of microchips (nanotechnology), intelligent textile or interfacial materials production, advances in biotechnology and the growth of wireless, ubiquitous computing emphasises not only mobility but integration into clothing or the human body. In artistic contexts one expects such integrated wearable devices to have the two-way function of interface instruments (e.g. sensor data acquisition and exchange) worn for particular purposes, either for communication with the environment or various aesthetic and compositional expressions. 'Wearable performance' briefly surveys the context for wearables in the performance arts and distinguishes display and performative/interfacial garments. It then focuses on the authors' experiments with 'design in motion' and digital performance, examining prototyping at the DAP-Lab which involves transdisciplinary convergences between fashion and dance, interactive system architecture, electronic textiles, wearable technologies and digital animation. The concept of an 'evolving' garment design that is materialised (mobilised) in live performance between partners originates from DAP Lab's work with telepresence and distributed media addressing the 'connective tissues' and 'wearabilities' of projected bodies through a study of shared embodiment and perception/proprioception in the wearer (tactile sensory processing). Such notions of wearability are applied both to the immediate sensory processing on the performer's body and to the processing of the responsive, animate environment.
From early markers to neuro-developmental mechanisms of autism
A fast growing field, the study of infants at risk because of having an older sibling with autism (i.e. infant sibs) aims to identify the earliest signs of this disorder, which would allow for earlier diagnosis and intervention. More importantly, we argue, these studies offer the opportunity to validate existing neuro-developmental models of autism against experimental evidence. Although autism is mainly seen as a disorder of social interaction and communication, emerging early markers do not exclusively reflect impairments of the “social brain”. Evidence for atypical development of sensory and attentional systems highlight the need to move away from localized deficits to models suggesting brain-wide involvement in autism pathology. We discuss the implications infant sibs findings have for future work into the biology of autism and the development of interventions
Neuroarquitectura: percepción de cambios de la atmósfera [Neuroarchitecture: perception of changes in the atmosphere]
Pain level and pain-related behaviour classification using GRU-based sparsely-connected RNNs
There is a growing body of studies on applying deep learning to biometrics
analysis. Certain circumstances, however, could impair the objective measures
and accuracy of the proposed biometric data analysis methods. For instance,
people with chronic pain (CP) unconsciously adapt specific body movements to
protect themselves from injury or additional pain. Because there is no
dedicated benchmark database to analyse this correlation, we considered one of
the specific circumstances that potentially influence a person's biometrics
during daily activities in this study and classified pain level and
pain-related behaviour in the EmoPain database. To achieve this, we proposed an
ensemble of sparsely-connected recurrent neural networks (s-RNNs) with gated
recurrent units (GRUs) that incorporates multiple autoencoders using a shared
training framework. This architecture is fed by multidimensional data collected
from inertial measurement unit (IMU) and surface electromyography (sEMG)
sensors. Furthermore, to compensate for variations in the temporal dimension
that may not be perfectly represented in the latent space of s-RNNs, we fused
hand-crafted features derived from information-theoretic approaches with
represented features in the shared hidden state. We conducted several
experiments which indicate that the proposed method outperforms the
state-of-the-art approaches in classifying both pain level and pain-related
behaviour.
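As an illustration of the information-theoretic hand-crafted features mentioned above, a minimal histogram-entropy sketch for a single sensor channel, in pure Python; the paper's exact feature set is not specified here, so this is only one plausible example:

```python
import math
from collections import Counter

def shannon_entropy(signal, bins=8):
    """Histogram-based Shannon entropy (in bits) of a 1-D sensor channel.
    Low entropy -> near-constant signal; high entropy -> varied movement."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / bins or 1.0           # guard against a flat signal
    counts = Counter(min(int((v - lo) / width), bins - 1) for v in signal)
    n = len(signal)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

flat = [0.5] * 64                    # constant channel carries no information
varied = [i % 8 for i in range(64)]  # uniform over 8 bins -> 3 bits
assert shannon_entropy(flat) == 0.0
assert abs(shannon_entropy(varied, bins=8) - 3.0) < 1e-9
```

Features like this, computed per IMU or sEMG channel, could then be concatenated with the learned latent representation, as the fusion step in the abstract describes.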