32,380 research outputs found

    Mapping Abstract Visual Feedback to a Dimensional Model of Emotion

    Recent HCI research has looked at conveying emotions through non-visual modalities, such as vibrotactile and thermal feedback. However, emotion is primarily conveyed through visual signals, and so this research aims to support the design of emotional visual feedback. We adapt and extend the design of the "pulsing amoeba" [29] and measure the emotion conveyed by the resulting abstract visual designs. This is a first step towards more holistic multimodal affective feedback combining visual, auditory and tactile stimuli. An online survey garnered valence and arousal ratings of 32 stimuli that varied in colour, contour, pulse size and pulse speed. The results support previous research but also provide new findings, highlighting the effect of each individual visual parameter on perceived emotion. We present a mapping of all stimulus combinations onto the common two-dimensional valence-arousal model of emotion.
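
    A minimal sketch of the kind of stimulus-to-emotion mapping the abstract describes, assuming a simple keyed lookup; the parameter names and all (valence, arousal) coordinates below are hypothetical placeholders, not the study's data.

```python
# Hypothetical stimulus-to-emotion lookup (placeholder values, not study data).
from dataclasses import dataclass

@dataclass(frozen=True)
class VisualStimulus:
    colour: str       # e.g. "red" or "blue"
    contour: str      # e.g. "round" or "spiky"
    pulse_size: str   # e.g. "small" or "large"
    pulse_speed: str  # e.g. "slow" or "fast"

# Mean rated (valence, arousal), each axis assumed normalised to [-1, 1].
ratings = {
    VisualStimulus("blue", "round", "small", "slow"): (0.4, -0.5),
    VisualStimulus("red", "spiky", "large", "fast"): (-0.3, 0.7),
}

def emotion_of(stimulus: VisualStimulus) -> tuple[float, float]:
    """Look up the rated (valence, arousal) point for a stimulus combination."""
    return ratings[stimulus]

print(emotion_of(VisualStimulus("blue", "round", "small", "slow")))
```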

    Multimodal Affective Feedback: Combining Thermal, Vibrotactile, Audio and Visual Signals

    In this paper we describe a demonstration of our multimodal affective feedback designs, used in research to expand the emotional expressivity of interfaces. The feedback leverages inherent associations and reactions to thermal, vibrotactile, auditory and abstract visual designs to convey a range of affective states without any need to learn a feedback encoding. All combinations of the different feedback channels can be utilised, depending on which combination best conveys a given state. All the signals are generated from a mobile phone augmented with thermal and vibrotactile stimulators, which will be available to conference visitors to see, touch, hear and, importantly, feel.

    Multi-Moji: Combining Thermal, Vibrotactile and Visual Stimuli to Expand the Affective Range of Feedback

    This paper explores the combination of multiple concurrent modalities for conveying emotional information in HCI: temperature, vibration and abstract visual displays. Each modality has been studied individually, but each can only convey a limited range of emotions within two-dimensional valence-arousal space. This paper is the first to systematically combine multiple modalities to expand the available affective range. Three studies were conducted: Study 1 measured the emotionality of vibrotactile feedback by itself; Study 2 measured the perceived emotional content of three bimodal combinations (vibrotactile + thermal, vibrotactile + visual, and visual + thermal); Study 3 then combined all three modalities. Results show that combining modalities increases the available range of emotional states, particularly in the problematic top-right and bottom-left quadrants of the dimensional model. We also provide a novel lookup resource for designers to identify stimuli that convey a range of emotions.
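
    A minimal sketch of how such a designer-facing lookup could be queried, assuming a nearest-neighbour search over rated stimulus combinations; the entries and coordinates below are hypothetical, not the published resource.

```python
# Hypothetical designer lookup: target emotion -> nearest multimodal stimulus.
import math

# Each entry: (multimodal stimulus description, rated (valence, arousal)).
lookup = [
    ("warm + slow vibration + blue pulse", (0.5, -0.4)),
    ("cool + fast vibration + red pulse",  (-0.4, 0.6)),
    ("warm + fast vibration + red pulse",  (0.6, 0.7)),    # top-right quadrant
    ("cool + slow vibration + blue pulse", (-0.5, -0.6)),  # bottom-left quadrant
]

def nearest_stimulus(target: tuple[float, float]) -> str:
    """Return the stimulus whose rated emotion lies closest to the target point."""
    return min(lookup, key=lambda entry: math.dist(entry[1], target))[0]

print(nearest_stimulus((0.7, 0.8)))  # -> "warm + fast vibration + red pulse"
```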

    Generative theatre of totality

    Generative art can be used to create complex multisensory and multimedia experiences within predetermined aesthetic parameters, characteristic of the performing arts and remarkably well suited to addressing Moholy-Nagy's Theatre of Totality vision. In generative artworks the artist usually takes on the role of an experience-framework designer, and the system evolves freely within that framework and its defined aesthetic boundaries. Most generative art concerns the visual arts, music and literature, yet there does not seem to be any relevant work exploring its cross-medium potential; one could confidently state that most generative art outcomes are either abstract visuals or audio. The goal of this article is to propose a model for the creation of generative performances within the Theatre of Totality's scope, derived from stochastic Lindenmayer systems, with mapping techniques proposed to address the seven variables identified by Moholy-Nagy: light, space, plane, form, motion, sound and man ("man" is replaced in this article with "human", except where quoting from the author), with all their inherent complexities.
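
    A minimal sketch of the stochastic Lindenmayer rewriting that the proposed model builds on; the grammar, probabilities and symbol names below are illustrative assumptions, not the article's system.

```python
# Illustrative stochastic L-system: each symbol rewrites to one of several
# successors, chosen according to the rule probabilities.
import random

rules = {
    "A": [(0.6, "A B"), (0.4, "B")],
    "B": [(0.5, "A"),   (0.5, "B B")],
}

def rewrite(symbol: str) -> str:
    """Pick a successor for one symbol according to its rule probabilities."""
    options = rules.get(symbol)
    if not options:
        return symbol  # terminals are copied unchanged
    r, acc = random.random(), 0.0
    for p, successor in options:
        acc += p
        if r <= acc:
            return successor
    return options[-1][1]

def generate(axiom: str, generations: int) -> str:
    """Apply the stochastic rewriting rules for a number of generations."""
    string = axiom
    for _ in range(generations):
        string = " ".join(rewrite(s) for s in string.split())
    return string

# The resulting symbol stream could then be mapped onto performance variables
# such as light, space, plane, form, motion, sound and human action.
print(generate("A", generations=4))
```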

    Affective Facial Expression Processing via Simulation: A Probabilistic Model

    Understanding the mental state of other people is an important skill for intelligent agents and robots operating in social environments. However, the mental processes involved in 'mind-reading' are complex. One explanation of such processes is Simulation Theory, which is supported by a large body of neuropsychological research. Yet which computational model or theory is best suited to simulation-style emotion detection is far from settled. In this work, we use Simulation Theory and neuroscience findings on Mirror-Neuron Systems as the basis for a novel computational model for handling affective facial expressions. The model is based on a probabilistic mapping of observations from multiple identities onto a single fixed identity ('internal transcoding of external stimuli'), and then onto a latent space ('phenomenological response'). Together with the proposed architecture we present some promising preliminary results.
    Comment: Annual International Conference on Biologically Inspired Cognitive Architectures - BICA 201
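
    A minimal sketch of the two-stage mapping described above, assuming a linear least-squares "transcoding" onto the fixed identity followed by an SVD projection onto a latent space; the paper's model is probabilistic, so this is only an illustrative reading, and all data below are synthetic.

```python
# Two-stage illustration: transcode another identity's features onto a fixed
# identity's feature space, then project onto a low-dimensional latent space.
import numpy as np

rng = np.random.default_rng(0)
d, k = 16, 3                      # feature and latent dimensionalities
ref = rng.normal(size=(200, d))   # features of the single fixed identity

def fit_transcoder(source: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Least-squares map from one identity's features onto the fixed identity."""
    W, *_ = np.linalg.lstsq(source, target, rcond=None)
    return W

def fit_latent_basis(data: np.ndarray, k: int) -> np.ndarray:
    """Principal directions of the fixed-identity space ('phenomenological response')."""
    centred = data - data.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return vt[:k].T

# Synthetic paired data for one external identity (in practice: facial features).
other = ref @ rng.normal(size=(d, d)) + 0.1 * rng.normal(size=(200, d))
W = fit_transcoder(other, ref)
basis = fit_latent_basis(ref, k)

latent = (other @ W - ref.mean(axis=0)) @ basis   # latent affective response
print(latent.shape)  # (200, 3)
```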

    Interactions between visceral afferent signaling and stimulus processing

    Visceral afferent signals to the brain influence thoughts, feelings and behaviour. Here we highlight the findings of a set of empirical investigations in humans concerning body-mind interaction that focus on how feedback from states of autonomic arousal shapes cognition and emotion. There is a longstanding debate regarding the contribution of the body to mental processes. Recent theoretical models broadly acknowledge the role of (autonomically mediated) physiological arousal in emotional, social and motivational behaviours, yet the underlying mechanisms are only partially characterized. Neuroimaging is overcoming this shortfall: first, by demonstrating correlations between autonomic change and discrete patterns of evoked, and task-independent, neural activity; second, by mapping the central consequences of clinical perturbations in autonomic response; and third, by probing how dynamic fluctuations in peripheral autonomic state are integrated with perceptual, cognitive and emotional processes. Building on the notion that an important source of the brain’s representation of physiological arousal is derived from afferent information from arterial baroreceptors, we have exploited the phasic nature of these signals to show their differential contribution to the processing of emotionally salient stimuli. This recent work highlights the facilitation, at neural and behavioural levels, of fear and threat processing, which contrasts with the more established observations of the inhibition of central pain processing during baroreceptor activation. The implications of this body-brain-mind axis are discussed.