
    Multimodal Data Fusion based on the Global Workspace Theory

    We propose a novel neural network architecture, the Global Workspace Network (GWN), which addresses the challenge of dynamic and unspecified uncertainties in multimodal data fusion. The GWN models attention across modalities as it evolves through time, and is inspired by the well-established Global Workspace Theory from cognitive science. The GWN achieved an average F1 score of 0.92 for discriminating between pain patients and healthy participants, and an average F1 score of 0.75 for the further classification of three pain levels per patient, both on the multimodal EmoPain dataset captured from people with chronic pain and healthy people performing different types of exercise movements in unconstrained settings. In these tasks, the GWN significantly outperforms the typical fusion approach of merging by concatenation. We further provide extensive analysis of the behaviour of the GWN and its ability to address uncertainties (hidden noise) in multimodal data.
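    To make the contrast concrete: the baseline the abstract compares against, merging by concatenation, and a minimal attention-weighted alternative can be sketched as follows. This is an illustration of the two fusion styles only, not the GWN itself; the feature dimensions and the modality-scoring function are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical per-modality feature vectors (e.g. body movement and
    # muscle activity), one time step each; dimensions are illustrative.
    motion = rng.normal(size=8)
    emg = rng.normal(size=8)

    # Baseline fusion: merge by concatenation.
    concat_fused = np.concatenate([motion, emg])        # shape (16,)

    # A minimal attention-style alternative: score each modality,
    # softmax the scores, and take a weighted sum of the modalities.
    def softmax(x):
        e = np.exp(x - np.max(x))
        return e / e.sum()

    scores = np.array([motion.mean(), emg.mean()])      # toy scoring function
    weights = softmax(scores)                           # sums to 1
    attn_fused = weights[0] * motion + weights[1] * emg # shape (8,)

    print(concat_fused.shape, attn_fused.shape)
    ```

    The attention-weighted form keeps the fused dimensionality fixed and lets the weights shift between modalities over time, which is the property that lets such a model down-weight a noisy modality, unlike plain concatenation.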

    P(l)aying Attention: Multi-Modal, Multi-Temporal Music Control

    Functional Musical Sonification for Chronic Pain Support

    Chronic pain causes substantial disability, and people living with chronic pain often use protective behaviours and movements to minimize pain, and worry about exacerbating pain during everyday activities such as loading the washing machine. We present work in progress on ubiquitous interactive sonification of body movement to help people with chronic pain understand and positively modify their movements in clinical and functional situations. The sonification blends informational and aesthetic aspects and is intended for daily use.

    Touch Technology in Affective Human, Robot, Virtual-Human Interactions: A Survey

    Given the importance of affective touch in human interactions, technology designers are increasingly attempting to bring this modality to the core of interactive technology. Advances in haptics and touch-sensing technology have been critical to fostering interest in this area. In this survey, we review how affective touch is investigated to enhance and support the human experience with or through technology. We explore this question across three different research areas to highlight their epistemology, main findings, and the challenges that persist. First, we review affective touch technology through the human–computer interaction literature to understand how it has been applied to the mediation of human–human interaction, and its roles in other human interactions, particularly with oneself, augmented objects/media, and affect-aware devices. We further highlight the datasets and methods that have been investigated for automatic detection and interpretation of affective touch in this area. In addition, we discuss the modalities of affective touch expressions in both humans and technology in these interactions. Second, we separately review how affective touch has been explored in human–robot and real-human–virtual-human interactions, where the technical challenges encountered and the types of experience aimed at are different. We conclude with a discussion of the gaps and challenges that emerge from the review, to steer research in directions that are critical for advancing affective touch technology and recognition systems. In our discussion, we also raise ethical issues that should be considered for responsible innovation in this growing area.

    Pain level and pain-related behaviour classification using GRU-based sparsely-connected RNNs

    Social touch gesture recognition using random forest and boosting on distinct feature sets

    Touch is a primary nonverbal communication channel used to convey emotions and other social messages. Despite its importance, this channel remains little explored in the affective computing field, where far more attention has been paid to the visual and aural channels. In this paper, we investigate the possibility of automatically discriminating between different social touch types. We propose five distinct feature sets for describing touch behaviours captured by a grid of pressure sensors. These features are then combined using the Random Forest and Boosting methods to categorize the touch gesture type. The proposed methods were evaluated on both the HAART (7 gesture types over different surfaces) and the CoST (14 gesture types over the same surface) datasets made available by the Social Touch Gesture Challenge 2015. Performances well above chance level were achieved, with accuracies of 67% on the HAART and 59% on the CoST testing datasets.
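    The pipeline the abstract describes, hand-crafted features from pressure-sensor sequences fed to a Random Forest, can be sketched minimally as below. The data generator and the four summary-statistic features are illustrative assumptions, not the paper's actual feature sets or the HAART/CoST data.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(42)

    # Synthetic stand-in for touch gestures captured on an 8x8 pressure grid:
    # each sample is a sequence of frames whose mean pressure depends on class.
    def make_gesture(label, n_frames=20):
        return rng.normal(0.2 + 0.4 * label, 0.1, size=(n_frames, 8, 8))

    X_raw = [make_gesture(lbl) for lbl in (0, 1, 2) for _ in range(30)]
    y = [lbl for lbl in (0, 1, 2) for _ in range(30)]

    # One illustrative feature set: summary statistics of the pressure
    # distribution over space and time.
    def features(seq):
        frame_sums = seq.sum(axis=(1, 2))       # total pressure per frame
        return [seq.mean(), seq.max(), seq.std(), np.ptp(frame_sums)]

    X = np.array([features(s) for s in X_raw])
    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
    print(f"training accuracy: {clf.score(X, y):.2f}")
    ```

    In practice each of the five feature sets would be computed this way and the ensemble's votes combined, with evaluation on a held-out test split rather than the training data shown here.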

    Bridging the gap between emotion and joint action

    Our daily human life is filled with a myriad of joint action moments, be it children playing, adults working together (i.e., team sports), or strangers navigating through a crowd. Joint action brings individuals (and the embodiment of their emotions) together, in space and in time. Yet little is known about how individual emotions propagate through embodied presence in a group, and how joint action changes individual emotion. In fact, the multi-agent component is largely missing from neuroscience-based approaches to emotion and, conversely, joint action research has not yet found a way to include emotion as one of the key parameters for modelling socio-motor interaction. In this review, we first identify the gap and then marshal evidence from various branches of science showing the strong entanglement between emotion and acting together. We propose an integrative approach to bridge the gap, highlight five research avenues for doing so in behavioural neuroscience and digital sciences, and address some of the key challenges in the area faced by modern societies.

    12-Ethoxy-2,3,8,9-tetramethoxybenzo[c]phenanthridine dichloromethane solvate

    The title compound, C23H23NO5·CH2Cl2, was obtained via the alkylation of the 12-hydroxy-2,3,8,9-tetramethoxybenzo[c]phenanthridine salt. The benzo[c]phenanthridine ring system is essentially planar, with a mean out-of-plane deviation of 0.026 Å. A dichloromethane molecule of solvation is present and located between the sheets of phenanthridine molecules, preventing any significant intermolecular hydrogen-bonding or π–π interactions.

    How Can Affect Be Detected and Represented in Technological Support for Physical Rehabilitation?

    Although clinical best practice suggests that affect awareness could enable more effective technological support for physical rehabilitation through personalisation to psychological needs, designers need to consider which affective states matter and how they should be tracked and addressed. In this paper, we set the stage by analysing how the major affective factors in chronic pain (pain, fear/anxiety, and low/depressed mood) interfere with everyday physical functioning. Further, based on a discussion of the modalities that could be used to track these states so that technology can address them, we investigated the possibility of using movement behaviour to automatically detect them. Using two body movement datasets on people with chronic pain, we show that movement behaviour enables very good discrimination between two emotional distress levels (F1 = 0.86) and three pain levels (F1 = 0.9). Performance remained high (F1 = 0.78 for two pain levels) with a reduced set of movement sensors. Finally, in an overall discussion, we suggest how technology-provided encouragement and awareness can be personalised given the capability to automatically monitor the relevant states, towards addressing the barriers that they pose. In addition, we highlight the movement behaviour features to be tracked to provide technology with the information necessary for such personalisation.
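    The F1 scores reported for multi-level classification in abstracts like this one are typically macro-averaged: F1 is computed per class and the per-class scores are averaged with equal weight. A minimal sketch, with made-up labels rather than any of the paper's data:

    ```python
    def macro_f1(y_true, y_pred):
        """Macro-averaged F1: per-class F1 scores, averaged with equal weight."""
        classes = sorted(set(y_true) | set(y_pred))
        f1s = []
        for c in classes:
            tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
            fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
            fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
            precision = tp / (tp + fp) if tp + fp else 0.0
            recall = tp / (tp + fn) if tp + fn else 0.0
            f1s.append(2 * precision * recall / (precision + recall)
                       if precision + recall else 0.0)
        return sum(f1s) / len(f1s)

    # Illustrative three-level pain classification (levels 0/1/2), not real data.
    y_true = [0, 0, 1, 1, 2, 2]
    y_pred = [0, 1, 1, 1, 2, 0]
    print(round(macro_f1(y_true, y_pred), 3))  # → 0.656
    ```

    Macro averaging treats rare and common classes equally, which matters for imbalanced clinical data where one pain level may dominate the recordings.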