5 research outputs found

    BNCI Horizon 2020 - Towards a Roadmap for Brain/Neural Computer Interaction

    In this paper, we present BNCI Horizon 2020, an EU Coordination and Support Action (CSA) that will provide a roadmap for brain-computer interaction research for the coming years, starting in 2013 and addressing research efforts until 2020 and beyond. The project is a successor of the earlier EU-funded Future BNCI CSA, which started in 2010 and produced a roadmap for a shorter time period. We present how we, a consortium of the main European BCI research groups as well as companies and end-user representatives, expect to tackle the problem of designing a roadmap for BCI research. In this paper, we define the field and its recent developments, in particular by considering publications and EU-funded research projects, and we discuss how we plan to involve research groups, companies, and user groups in our effort to pave the way for useful and fruitful EU-funded BCI research for the next ten years.

    Affective Brain-Computer Interfaces Neuroscientific Approaches to Affect Detection

    The brain is involved in the registration, evaluation, and representation of emotional events, and in the subsequent planning and execution of adequate actions. Novel interface technologies, so-called affective brain-computer interfaces (aBCI), can use this rich neural information, occurring in response to affective stimulation, for the detection of the affective state of the user. This chapter gives an overview of the promises and challenges that arise from the possibility of neurophysiology-based affect detection, with a special focus on electrophysiological signals. After outlining the potential of aBCI relative to other sensing modalities, the reader is introduced to the neurophysiological and neurotechnological background of this interface technology. Potential application scenarios are situated in a general framework of brain-computer interfaces. Finally, the main scientific and technological challenges that have to be solved on the way toward reliable affective brain-computer interfaces are discussed.

    Multi-modal Affect Induction for Affective Brain-Computer Interfaces

    Reliable applications of affective brain-computer interfaces (aBCI) in realistic, multi-modal environments require a detailed understanding of the processes involved in emotions. To explore the modality-specific nature of affective responses, we studied neurophysiological responses (i.e., EEG) of 24 participants during visual, auditory, and audiovisual affect stimulation. The affect induction protocols were validated by participants' subjective ratings and physiological responses (i.e., ECG). Consistent with the literature, we found modality-specific responses in the EEG: posterior alpha power decreases during visual stimulation and increases during auditory stimulation; anterior alpha power tends to decrease during auditory stimulation and to increase during visual stimulation. We discuss the implications of these results for multi-modal aBCI.
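The alpha-power measure underlying the finding above can be sketched in a few lines. This is a generic illustration, not the authors' analysis pipeline: the sampling rate, band limits, and Welch parameters are assumptions, and the synthetic signal merely stands in for a posterior EEG channel.

```python
import numpy as np
from scipy.signal import welch

def alpha_band_power(eeg, fs=256.0, band=(8.0, 13.0)):
    """Estimate alpha-band (8-13 Hz) power of a 1-D EEG segment
    from a Welch power spectral density estimate."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 2))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    # Integrate the PSD over the alpha band (bin width * PSD sum).
    return psd[mask].sum() * (freqs[1] - freqs[0])

# Synthetic check: a 10 Hz oscillation (inside the alpha band)
# should carry far more alpha power than broadband noise alone.
fs = 256.0
t = np.arange(0, 4.0, 1.0 / fs)
rng = np.random.default_rng(0)
oscillation = np.sin(2 * np.pi * 10.0 * t) + 0.1 * rng.standard_normal(t.size)
noise = 0.1 * rng.standard_normal(t.size)
print(alpha_band_power(oscillation, fs) > alpha_band_power(noise, fs))  # True
```

Comparing such band-power estimates between stimulation conditions (visual vs. auditory) per electrode region is one straightforward way to quantify the posterior/anterior alpha modulations the abstract reports.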
    Affective state recognition in Virtual Reality from electromyography and photoplethysmography using head-mounted wearable sensors.

    The three core components of Affective Computing (AC) are emotion expression recognition, emotion processing, and emotional feedback. Affective states are typically characterized in a two-dimensional space consisting of arousal, i.e., the intensity of the emotion felt, and valence, i.e., the degree to which the current emotion is pleasant or unpleasant. These fundamental properties of emotion can be measured not only using subjective ratings from users, but also with the help of physiological and behavioural measures, which potentially provide an objective evaluation across users. Multiple combinations of measures are utilised in AC for a range of applications, including education, healthcare, marketing, and entertainment. As the uses of immersive Virtual Reality (VR) technologies are growing, there is a rapidly increasing need for robust affect recognition in VR settings. However, the integration of affect detection methodologies with VR remains an unmet challenge due to constraints posed by current VR technologies, such as Head Mounted Displays. This EngD project is designed to overcome some of these challenges by effectively integrating valence and arousal recognition methods in VR technologies and by testing their reliability in seated and room-scale fully immersive VR conditions. The aim of this EngD research project is to identify how affective states are elicited in VR and how they can be efficiently measured, without constraining movement or decreasing the sense of presence in the virtual world. Through a three-year collaboration with Emteq labs Ltd, a wearable technology company, we assisted in the development of a novel multimodal affect detection system, specifically tailored to the requirements of VR. This thesis describes the architecture of the system, the research studies that enabled this development, and the future challenges. The studies conducted validated the reliability of our proposed system, including the VR stimuli design, data measures, and processing pipeline. This work could inform future studies in the field of AC in VR and assist in the development of novel applications and healthcare interventions.
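The two-dimensional valence/arousal space described above can be made concrete with a small sketch. The quadrant labels and the -1..1 rating range are illustrative assumptions, not terminology from the thesis:

```python
def affect_quadrant(valence, arousal):
    """Map continuous valence/arousal ratings (assumed range -1..1)
    to one of the four canonical quadrants of the circumplex affect
    space. Labels are illustrative, not from the thesis."""
    if valence >= 0 and arousal >= 0:
        return "excited/happy"    # high arousal, pleasant
    if valence < 0 and arousal >= 0:
        return "stressed/angry"   # high arousal, unpleasant
    if valence < 0:
        return "sad/bored"        # low arousal, unpleasant
    return "calm/relaxed"         # low arousal, pleasant

print(affect_quadrant(0.7, 0.6))    # excited/happy
print(affect_quadrant(-0.4, -0.8))  # sad/bored
```

An affect recognition pipeline of the kind the abstract describes would estimate the two coordinates from physiological signals (e.g., EMG and PPG) rather than from self-report, but the downstream mapping into discrete affective states follows the same geometry.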