    A Wireless Future: performance art, interaction and the brain-computer interfaces

    Although the use of Brain-Computer Interfaces (BCIs) in the arts originates in the 1960s, there are few known applications in the context of real-time audio-visual and mixed-media performances, and accordingly the knowledge base of this area has not been sufficiently developed. Among the reasons are the difficulties and unknown parameters involved in the design and implementation of BCIs. Today, however, with the dissemination of new wireless devices, the field is rapidly growing and changing. In this context, we examine a selection of representative works and artists in comparison with the current scientific evidence. We identify important performative and neuroscientific aspects, issues and challenges. A model of possible interactions between the performers and the audience is discussed, and future trends regarding liveness and interconnectivity are suggested.

    Affective Brain-Computer Interfaces: Neuroscientific Approaches to Affect Detection

    The brain is involved in the registration, evaluation, and representation of emotional events, and in the subsequent planning and execution of adequate actions. Novel interface technologies, so-called affective brain-computer interfaces (aBCI), can use this rich neural information, occurring in response to affective stimulation, to detect the affective state of the user. This chapter gives an overview of the promises and challenges that arise from the possibility of neurophysiology-based affect detection, with a special focus on electrophysiological signals. After outlining the potential of aBCI relative to other sensing modalities, the reader is introduced to the neurophysiological and neurotechnological background of this interface technology. Potential application scenarios are situated in a general framework of brain-computer interfaces. Finally, the main scientific and technological challenges that have to be solved on the way toward reliable affective brain-computer interfaces are discussed.

    Personalised, multi-modal, affective state detection for hybrid brain-computer music interfacing

    Brain-computer music interfaces (BCMIs) may be used to modulate affective states, with applications in music therapy, composition, and entertainment. However, for such systems to work, they need to be able to reliably detect their user's current affective state. We present a method for personalised affective state detection for use in BCMI. We compare it to a population-based detection method trained on 17 users and demonstrate that personalised affective state detection is significantly (p < 0.01) more accurate, with average improvements in accuracy of 10.2 percent for valence and 9.3 percent for arousal. We also compare a hybrid BCMI (a BCMI that combines physiological signals with neurological signals) to a conventional BCMI design (one based upon the use of only EEG features) and demonstrate that the hybrid design results in a significant (p < 0.01) 6.2 percent improvement in performance for arousal classification and a significant (p < 0.01) 5.9 percent improvement for valence classification.
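
    The hybrid-versus-EEG-only comparison described above can be illustrated with a generic classification sketch. The snippet below is not the authors' implementation: it assumes scikit-learn, an RBF-kernel SVM, and synthetic placeholder features standing in for per-trial EEG band powers and peripheral physiological measures (heart rate, skin conductance, and similar).

```python
# Hedged sketch, not the paper's code: compare EEG-only features against a
# hybrid feature set (EEG plus peripheral physiology) for binary valence or
# arousal classification. Feature matrices are synthetic placeholders.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials = 120
eeg_feats = rng.normal(size=(n_trials, 32))    # stand-in for per-channel band powers
physio_feats = rng.normal(size=(n_trials, 6))  # stand-in for heart rate, GSR, respiration
labels = rng.integers(0, 2, size=n_trials)     # low/high valence (or arousal) per trial

def mean_cv_accuracy(features: np.ndarray, labels: np.ndarray) -> float:
    """Cross-validated accuracy of a standard SVM on the given feature matrix."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    return cross_val_score(clf, features, labels, cv=5).mean()

print("EEG only:", mean_cv_accuracy(eeg_feats, labels))
print("Hybrid:  ", mean_cv_accuracy(np.hstack([eeg_feats, physio_feats]), labels))
```

    In this framing, the personalised detector of the abstract corresponds to fitting such a model on trials from a single user, while the population-based baseline pools training trials across the 17-user group.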

    Emotion-Inducing Imagery versus Motor Imagery for a Brain-Computer Interface

    A multiplex connectivity map of valence-arousal emotional model

    A high number of studies have already demonstrated electroencephalography (EEG)-based emotion recognition systems with moderate results. Emotions are classified into discrete and dimensional models. We focused on the latter, which incorporates valence and arousal dimensions. The mainstream methodology is the extraction of univariate measures derived from EEG activity across various frequencies, classifying trials into low/high valence and arousal levels. Here, we evaluated brain connectivity within and between brain frequencies under the multiplexity framework. We analyzed an EEG database called DEAP that contains EEG responses to video stimuli and users’ emotional self-assessments. We adopted a dynamic functional connectivity analysis under the notion of our dominant coupling model (DoCM). DoCM detects the dominant coupling mode per pair of EEG sensors, which can be either within-frequency coupling (intra-frequency) or between-frequency coupling (cross-frequency). DoCM yields an integrated dynamic functional connectivity graph (IDFCG) that keeps both the strength and the preferred dominant coupling mode. We aimed to create a connectomic mapping of the valence-arousal space by employing features derived from the IDFCG. Our results outperformed previous findings, predicting participants’ ratings on the valence and arousal dimensions with high accuracy based on a flexibility index of dominant coupling modes.
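
    As a rough illustration of the final step of this pipeline, the sketch below computes a flexibility index from an already-estimated array of dominant coupling-mode labels (one integer label per sensor pair and sliding window). This is an assumption-laden toy, not the paper's DoCM implementation: the array shapes, the mode labels, and the definition of flexibility as the fraction of window-to-window changes in the dominant mode are illustrative.

```python
# Hedged sketch (assumptions, not the paper's implementation): given a
# precomputed array of dominant coupling-mode labels per sensor pair and
# sliding window (the kind of output a DoCM-style analysis produces),
# compute a flexibility index as the fraction of window-to-window
# transitions in which the dominant mode changes.
import numpy as np

def flexibility_index(dominant_modes: np.ndarray) -> np.ndarray:
    """dominant_modes: (n_pairs, n_windows) integer labels of the dominant
    coupling mode (e.g. 0 = delta-delta, 1 = theta-alpha, ...).
    Returns one flexibility value in [0, 1] per sensor pair."""
    transitions = dominant_modes[:, 1:] != dominant_modes[:, :-1]
    return transitions.mean(axis=1)

# Toy example: 3 sensor pairs observed over 6 sliding windows.
modes = np.array([
    [0, 0, 1, 1, 2, 2],   # mode changes in 2 of 5 transitions -> 0.4
    [3, 3, 3, 3, 3, 3],   # stable dominant mode -> 0.0
    [0, 1, 0, 1, 0, 1],   # changes every window -> 1.0
])
print(flexibility_index(modes))
```

    One such flexibility value per sensor pair would then form the feature vector used to predict participants' valence and arousal ratings.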