23 research outputs found

    Investigating the Impact of a Dual Musical Brain-Computer Interface on Interpersonal Synchrony: A Pilot Study

    This study investigated how effectively a Musical Brain-Computer Interface (MBCI) can provide feedback about synchrony between two people. Using a dual-EEG setup, we compared two types of musical feedback: one that adapted in real time to the inter-brain synchrony between participants (Neuroadaptive condition) and one that was randomly generated (Random condition). We evaluated how 8 dyads (n = 16) perceived these two conditions and whether the generated music could influence the perceived connection and EEG synchrony between them. The findings indicated that Neuroadaptive musical feedback could potentially boost synchrony levels between people compared to Random feedback, as seen in a significant increase in EEG phase-locking values. Additionally, the real-time measurement of synchrony was successfully validated, and the musical neurofeedback was generally well received by participants. However, more research is needed for conclusive results due to the small sample size. This study is a stepping stone towards creating music that can audibly reflect the level of synchrony between individuals. Comment: 6 pages, 4 figures
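    As a rough illustration of the synchrony metric mentioned above, the sketch below computes a phase-locking value (PLV) between one EEG channel from each participant. This is not the authors' implementation; the alpha band, sampling rate, and synthetic signals are assumptions made purely for illustration.

```python
# Minimal PLV sketch (illustrative, not the study's code): phase-locking value
# between two equal-length EEG signals, one per participant.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def plv(x, y, fs=256.0, band=(8.0, 12.0)):
    """Phase-locking value between two 1-D signals (0 = no locking, 1 = perfect)."""
    # Band-pass both signals (here an assumed alpha band) before extracting phase.
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    x_f, y_f = filtfilt(b, a, x), filtfilt(b, a, y)
    # Instantaneous phase via the analytic (Hilbert) signal.
    phase_diff = np.angle(hilbert(x_f)) - np.angle(hilbert(y_f))
    # PLV: magnitude of the mean unit phasor of the phase difference.
    return np.abs(np.mean(np.exp(1j * phase_diff)))

# Synthetic data standing in for two participants' EEG (10 s at 256 Hz).
rng = np.random.default_rng(0)
eeg_a, eeg_b = rng.standard_normal(2560), rng.standard_normal(2560)
print(plv(eeg_a, eeg_b))
```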

    Supporting the Experience of Stakeholders of Multimedia Art – Towards an Ontology

    Part 1: Beyond Computers: Wearables, Humans, and Things - WHAT! We introduce the rapid change of the visual art ecosystem triggered by current developments in science and technology. ICT enables new multimedia-based and interactive art forms, with an increasing variety of stakeholders. We provide examples of audience involvement, of immersion, and of brain-computer interaction as a new paradigm for participation. We point to the use of new material dimensions, as well as to expanding shared creation and cognition. We also point to opportunities to apply this development to accommodate special needs. In order to support the dissemination of these possibilities, we advocate the development of a task-modelling-based ontology to describe, analyse, and support the evolving art ecosystem.

    GROUPTHINK: Telepresence and Agency During Live Performance

    Live performers often describe "playing to the audience": shifting emphasis, timing, and even content according to perceived audience reactions. Traditional staging allows the transmission of physiological signals through the audience's eyes, skin, odor, breathing, vocalizations, and motions such as dancing, stamping, and clapping, some of which are audible. The Internet and other mass media broaden access to live performance, but they efface traditional channels for "liveness," which we specify as physiological feedback loops that bind performers and audience through shared agency. During online events, contemporary performers enjoy text- and icon-based feedback, but current technology limits the expression of physiological reactions by remote audiences. Looking to a future Internet of Neurons, where humans and AI co-create via neurophysiological interfaces, this paper examines the possibility of re-establishing audience agency during live performance by using hemodynamic sensors, while exploring the potential of AI as a creative collaborator.

    Brain Machine Interfaces and Ethics: A Transition from Wearable to Implantable


    Anticipation in architectural experience: a computational neurophenomenology for architecture?

    The perceptual experience of architecture is enacted by the sensory and motor systems. When we act, we change the perceived environment according to a set of expectations that depend on our body and the built environment. The continuous process of collecting sensory information is thus based on bodily affordances. Affordances characterize the fit between the physical structure of the body and the capacities for movement in the built environment. Since little has been done regarding the role of architectural design in the emergence of perceptual experience at the neuronal level, this paper offers a first step towards understanding that role. An approach is considered that synthesizes concepts from computational neuroscience with architectural phenomenology into a computational neurophenomenology. The outcome is a framework under which studies of architecture and cognitive neuroscience can be cast. Comment: 1 title page, 23 pages, 5 figures

    Corseto: A Kinesthetic Garment for Designing, Composing for, and Experiencing an Intersubjective Haptic Voice

    We present a novel intercorporeal experience: an intersubjective haptic voice. Through an autobiographical design inquiry based on singing techniques from the classical opera tradition, we created Corsetto, a kinesthetic garment for transferring somatic reminiscences of vocal experience from an expert singer to a listener. We then composed haptic gestures, enacted in the Corsetto, emulating the upper-body movements of the live singer performing a piece by Morton Feldman named Three Voices. The gestures in the Corsetto added a haptics-based 'fourth voice' to the immersive opera performance. Finally, we invited audience members to wear Corsetto during live performances; afterwards they engaged in micro-phenomenological interviews. The analysis revealed how the Corsetto managed to bridge inner and outer bodily sensations, creating a feeling of a shared intercorporeal experience and dissolving boundaries between listener, singer, and performance. We propose that 'intersubjective haptics' can be a generative medium not only for singing performances, but also for other intersubjective experiences.