
    Emotions in context: examining pervasive affective sensing systems, applications, and analyses

    Pervasive sensing has opened up new opportunities for measuring our feelings and understanding our behavior by monitoring our affective states while mobile. This review paper surveys pervasive affect sensing by examining three major elements of affective pervasive systems: “sensing”, “analysis”, and “application”. Sensing investigates the different sensing modalities used in existing real-time affective applications; Analysis explores different approaches to emotion recognition and visualization based on the types of collected data; and Application investigates the leading areas of affective applications. For each of these three aspects, the paper includes an extensive survey of the literature and outlines some of the challenges and future research opportunities of affective sensing in the context of pervasive computing.

    A Wireless Future: performance art, interaction and the brain-computer interfaces

    Although the use of Brain-Computer Interfaces (BCIs) in the arts originates in the 1960s, there are only a limited number of known applications in the context of real-time audio-visual and mixed-media performances, and accordingly the knowledge base of this area has not been sufficiently developed. Among the reasons are the difficulties and unknown parameters involved in the design and implementation of BCIs. Today, however, with the dissemination of new wireless devices, the field is growing and changing rapidly. In this context, we examine a selection of representative works and artists in comparison with the current scientific evidence. We identify important performative and neuroscientific aspects, issues, and challenges. A model of possible interactions between the performers and the audience is discussed, and future trends regarding liveness and interconnectivity are suggested.

    Assessing the quality of steady-state visual-evoked potentials for moving humans using a mobile electroencephalogram headset.

    Recent advances in mobile electroencephalogram (EEG) systems, featuring non-prep dry electrodes and wireless telemetry, have enabled and promoted the application of mobile brain-computer interfaces (BCIs) in daily life. Since the brain may behave differently when people are actively situated in ecologically valid environments rather than highly controlled laboratory environments, it remains unclear how well current laboratory-oriented BCI demonstrations can be translated into operational BCIs for users making naturalistic movements. Understanding the inherent links between natural human behaviors and brain activities is key to ensuring the applicability and stability of mobile BCIs. This study aims to assess the quality of steady-state visual-evoked potentials (SSVEPs), one of the most promising channels for functioning BCI systems, recorded using a mobile EEG system under challenging recording conditions such as walking. To systematically explore the effects of walking locomotion on SSVEPs, this study instructed subjects to stand or walk on a treadmill running at speeds of 1, 2, and 3 miles per hour (MPH) while concurrently perceiving visual flickers (11 and 12 Hz). Empirical results showed that SSVEP amplitude tended to deteriorate when subjects switched from standing to walking. This SSVEP suppression could be attributed to the walking locomotion, which led to distinctly deteriorated SSVEP detectability from standing (84.87 ± 13.55%) to walking (1 MPH: 83.03 ± 13.24%, 2 MPH: 79.47 ± 13.53%, and 3 MPH: 75.26 ± 17.89%). These findings not only demonstrate the applicability and limitations of SSVEPs recorded from freely behaving humans in realistic environments, but also provide useful methods and techniques for advancing the translation of BCI technology from laboratory demonstrations to practical applications.
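    The frequency-tagged detection task described in this abstract can be illustrated with a short sketch. The following is a minimal frequency-domain classifier (not the study's actual pipeline, which is not specified here): it assumes a hypothetical 250 Hz sampling rate and a synthetic single-channel signal, and decides which of the two flicker frequencies (11 or 12 Hz) dominates the spectrum.

```python
# Minimal illustrative SSVEP detector: pick the candidate flicker frequency
# (11 vs 12 Hz) with the most spectral power in a narrow band around it.
# FS and the noise level are assumptions for this synthetic demo.
import numpy as np

FS = 250                    # assumed EEG sampling rate in Hz
CANDIDATES = (11.0, 12.0)   # flicker frequencies used in the study

def detect_ssvep(signal, fs=FS, candidates=CANDIDATES):
    """Return the candidate frequency with the most spectral power."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)

    def band_power(f):
        # Sum power in a +/-0.5 Hz band around the candidate frequency
        return spectrum[np.abs(freqs - f) < 0.5].sum()

    return max(candidates, key=band_power)

# Synthetic four-second trial: an 11 Hz SSVEP component buried in noise
rng = np.random.default_rng(0)
t = np.arange(4 * FS) / FS
trial = np.sin(2 * np.pi * 11.0 * t) + 0.8 * rng.standard_normal(t.size)
print(detect_ssvep(trial))  # → 11.0
```

Real mobile recordings during walking would add motion artifacts on top of the noise term, which is exactly why detectability degrades at higher treadmill speeds in the reported results.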

    Creating Bio-adaptive Visual Cues for a Social Virtual Reality Meditation Environment

    This thesis examines the design and implementation of adaptive visual cues for a social virtual reality meditation environment. The system described here adapts to users' bio- and neurofeedback and uses that data in visual cues to convey information about physiological and affective states during meditation exercises, supporting two simultaneous users. The thesis presents the development process for different kinds of visual cues and attempts to pinpoint best practices, design principles, and pitfalls of visual cue development in this context. Also examined are the criteria for selecting appropriate visual cues and how to convey information about biophysical synchronization between the users. The visual cues examined here are created especially for a virtual reality environment, which as a platform differs from traditional two-dimensional content such as user interfaces on a computer display. Points of interest are how to embed the visual cues into the virtual reality environment so that the user experience remains immersive and the visual cues convey information correctly and intuitively.