200 research outputs found

    From a biosignal to xenotext: the affective dimension of textuality in postdigital art projects

    With reference to the categories of affectivity and intentionality, the author considers some of the various research perspectives that can be brought to bear upon the category of literariness in biotextual projects. She therefore introduces the concepts of "technotext" (Hayles), "physio-cybertext" and "biopoetry" (Kac), and "partly non-discursive affectivity" (Knudsen and Stage). The author primarily considers the role of non-human actors in constructing biotextual projects; these include bacteria and other living cells that display the kinds of goal-oriented behavior (or intentionality) that bring about causal changes in biotextual works. Moreover, non-human actors are considered to be a physiological, affective force capable of altering the physical shape of such works. Introducing her own concept of "inside-body actors" (meaning the functioning of the body's organs, hormones and other biochemical changes in the organism), the author demonstrates how these "actors" are crucial to the medium. Her article presents three examples of (trans)literary works that were created in a corporal, affective and biological context: The Breathing Wall by Kate Pullinger (with Stefan Schemat and Chris Joseph); Diane Gromala's BioMorphic Typography (part of a larger scientific and artistic initiative entitled "Design for the Senses"); and Christian Bök's Xenotext. This last example is one of the most recent works to combine digital text with the biological functioning of microorganisms in a constantly evolving process.

    Designing and evaluating avatar biosignal visualization techniques in social virtual reality

    Social VR is an application of virtual reality that supports remote social interaction in virtual spaces. Users communicate and interact with others in the social VR environment through avatars, virtual anthropomorphic characters that represent humans in virtual worlds. In addition, the development of HMDs and commercially available motion-capture systems enables avatars in the virtual environment to detect and reflect people's real-time motions and even facial expressions. However, avatars still lack any indication of biofeedback (e.g., body temperature, breathing, heart rate, muscle contraction), which serves as a social cue for communication in reality. While some features, such as emojis, support users in expressing their feelings or emotions for richer communication, the missing information often results in miscommunication in the virtual space and remains a barrier to a fully immersive experience in social VR. This project proposes a concept for visualizing the biosignals of avatars in social virtual reality to enable richer interaction. With technologies available to capture and reflect accurate biofeedback in real time, we explore ways of mapping the biological states of users in reality onto their avatars in the virtual world. The project starts with user research to understand current user behaviors in social VR spaces and users' perspectives on sharing biosignals. Based on the requirements gathered from this study, the scope of the project was narrowed down to a ‘watching entertainment’ scenario, and ways to visualize biosignals on avatars were explored in a co-design session with designers. Four visualization techniques for two biosignals (heart rate and breathing rate) were then prototyped in a VR jazz-bar setting. Finally, a user study with 16 pairs (32 participants in total) tested and compared the effects of each visualization technique while watching entertainment with a companion. The results showed that the embodied visualizations were the most understandable and least distracting of the four methods. The thesis concludes with the limitations of the research, recommendations on biosignal visualizations, and recommendations on conducting design research.
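    The abstract does not describe the prototypes at code level, but the core idea of mapping live biosignals onto avatar visuals can be sketched roughly as below. All names, smoothing ranges, and parameters here are illustrative assumptions, not the project's implementation.

```python
# Hypothetical sketch: mapping live biosignals onto avatar visual parameters.
# Parameter names and value ranges are assumptions for illustration only.
from dataclasses import dataclass


@dataclass
class AvatarBioVisuals:
    glow_intensity: float  # 0.0-1.0, driven by heart rate
    chest_scale: float     # relative torso scale, driven by breathing phase


def map_biosignals(heart_rate_bpm: float, breath_phase: float,
                   rest_hr: float = 60.0, max_hr: float = 120.0) -> AvatarBioVisuals:
    """Map a heart-rate sample and a breathing phase (0..1) to avatar parameters."""
    # Normalise heart rate into [0, 1] relative to an assumed resting/maximum range.
    hr_norm = min(max((heart_rate_bpm - rest_hr) / (max_hr - rest_hr), 0.0), 1.0)
    # Breathing phase 0..1 (inhale -> exhale) gently scales the chest mesh.
    chest = 1.0 + 0.05 * breath_phase
    return AvatarBioVisuals(glow_intensity=hr_norm, chest_scale=chest)


if __name__ == "__main__":
    print(map_biosignals(heart_rate_bpm=85.0, breath_phase=0.4))
```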

    Biomedical Signal Analysis of the Brain and Systemic Physiology

    Near-infrared spectroscopy (NIRS) is a non-invasive and easy-to-use diagnostic technique that enables real-time tissue oxygenation measurements in various contexts and for different purposes. Continuous NIRS monitoring of brain oxygenation, for example in neonatal intensive care units (NICUs), is essential to prevent lifelong disabilities in newborns. Moreover, NIRS can be applied to observe brain activity associated with hemodynamic changes in blood flow due to neurovascular coupling. In the latter case, NIRS contributes to the study of cognitive processes, allowing experiments to be conducted in natural and socially interactive contexts of everyday life. However, it is essential to measure systemic physiology and NIRS signals concurrently. Combining brain and body signals makes it possible to build sophisticated systems that, for example, reduce the false alarms that occur in NICUs. Furthermore, since fNIRS signals are influenced by systemic physiology, it is essential to understand how the latter affects brain signals in functional studies; this brain-body coupling has rarely been investigated so far. To take full advantage of these brain and body data, the aim of this thesis was to develop novel approaches for analyzing these biosignals, extracting information and identifying new patterns to address different research and clinical questions. This requires new methodological approaches and sophisticated data analysis, because identifying such patterns is often challenging or impossible with traditional methods; in such cases, automatic machine learning (ML) techniques are beneficial. The first contribution of this work was to assess the systemic physiology augmented (f)NIRS (SPA-fNIRS) approach for clinical use and in everyday life. Based on physiological and NIRS signals of preterm infants, an ML-based classification system was realized that is able to reduce false alarms in NICUs while providing a high sensitivity rate. In addition, the SPA-fNIRS approach was applied in adults during a breathing task. The second contribution of this work was the advancement of the classical fNIRS hyperscanning method by adding systemic physiology measures. For this, new biosignal analyses in the time-frequency domain were developed and tested in a simple nonverbal synchrony task between pairs of subjects. Furthermore, based on SPA-fNIRS hyperscanning data, another ML-based system was created that is able to distinguish familiar from unfamiliar pairs with high accuracy. This approach makes it possible to determine the strength of social bonds in a wide range of social interaction contexts. In conclusion, we were the first group to perform an SPA-fNIRS hyperscanning study capturing changes in cerebral oxygenation and hemodynamics as well as systemic physiology in two subjects simultaneously. We applied new biosignal analysis methods, enabling new insights into the study of social interactions. This work opens the door to many future inter-subject fNIRS studies with the benefit of assessing brain-to-brain, brain-to-body, and body-to-body coupling between pairs of subjects.
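    The thesis develops dedicated time-frequency analyses that are not detailed in the abstract; as a minimal frequency-domain starting point, coupling between two concurrently recorded physiological signals (e.g., one per subject in a hyperscanning setting) can be quantified with magnitude-squared coherence, as sketched below on synthetic data. Sampling rate, band of interest, and the signals themselves are assumptions.

```python
# Minimal sketch: magnitude-squared coherence between two simultaneously
# recorded physiological signals. Synthetic data and parameters are
# illustrative assumptions, not the thesis' actual analysis pipeline.
import numpy as np
from scipy.signal import coherence

fs = 10.0                      # assumed sampling rate in Hz
t = np.arange(0, 300, 1 / fs)  # five minutes of data

# Two synthetic signals sharing a slow 0.1 Hz oscillation plus independent noise.
shared = np.sin(2 * np.pi * 0.1 * t)
sig_a = shared + 0.5 * np.random.randn(t.size)
sig_b = 0.8 * shared + 0.5 * np.random.randn(t.size)

# Welch-based coherence: values near 1 indicate strong coupling at that frequency.
freqs, cxy = coherence(sig_a, sig_b, fs=fs, nperseg=512)
peak = freqs[np.argmax(cxy)]
print(f"Peak coherence {cxy.max():.2f} at {peak:.2f} Hz")
```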

    NON-VERBAL COMMUNICATION WITH PHYSIOLOGICAL SENSORS. THE AESTHETIC DOMAIN OF WEARABLES AND NEURAL NETWORKS

    Historically, communication implies the transfer of information between bodies, yet this phenomenon is constantly adapting to new technological and cultural standards. In a digital context, it is commonplace to envision systems that revolve around verbal modalities. However, behavioural analysis grounded in psychology research calls attention to the emotional information disclosed by non-verbal social cues, in particular actions that are involuntary. This notion has circulated heavily through various interdisciplinary computing research fields, from which multiple studies have arisen correlating non-verbal activity with socio-affective inferences. These are often derived from some form of motion capture and other wearable sensors measuring the ‘invisible’ bioelectrical changes that occur inside the body. This thesis proposes a motivation and methodology for using physiological sensory data as an expressive resource for technology-mediated interactions. It begins with a thorough discussion of state-of-the-art technologies and established design principles on this topic, which is then applied to a novel approach alongside a selection of practice works that complement it. We advocate for aesthetic experience, experimenting with abstract representations. Unlike prevailing Affective Computing systems, the intention is not to infer or classify emotion but rather to create new opportunities for rich gestural exchange, unconfined to the verbal domain. Given the preliminary proposition of non-representation, we justify a correspondence with modern Machine Learning and multimedia interaction strategies, applying an iterative, human-centred approach to improve personalisation without compromising the emotional potential of bodily gesture. Where related studies have successfully provoked strong design concepts through innovative fabrications, these are typically limited to simple linear, one-to-one mappings and often neglect multi-user environments; we foresee vast potential here. In our use cases, we adopt neural network architectures to generate highly granular biofeedback from low-dimensional input data. We present the following proofs of concept: Breathing Correspondence, a wearable biofeedback system inspired by Somaesthetic design principles; Latent Steps, a real-time autoencoder that represents bodily experiences from sensor data, designed for dance performance; and Anti-Social Distancing Ensemble, an installation for public space interventions that analyses physical distance to generate a collective soundscape. Key findings are extracted from the individual reports to formulate an extensive technical and theoretical framework around this topic. The projects first aim to embrace some alternative perspectives already established within Affective Computing research; from there, these concepts evolve further, bridging theories from contemporary creative and technical practices with the advancement of biomedical technologies.
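    Latent Steps is described above only as a real-time autoencoder over sensor data; the sketch below shows what such an architecture might look like in broad strokes. Layer sizes, the latent dimensionality, and the synthetic training data are assumptions, not the author's implementation.

```python
# Rough sketch of an autoencoder compressing low-dimensional wearable sensor
# frames into a small latent space, in the spirit of "Latent Steps".
# Layer sizes, latent dimension, and synthetic data are assumptions.
import torch
from torch import nn


class SensorAutoencoder(nn.Module):
    def __init__(self, n_features: int = 8, latent_dim: int = 2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_features, 16), nn.ReLU(),
            nn.Linear(16, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 16), nn.ReLU(),
            nn.Linear(16, n_features),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))


model = SensorAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-in for streamed, normalised sensor frames (batch of 64 frames, 8 channels).
frames = torch.randn(64, 8)
for _ in range(200):                      # brief offline training loop
    optimizer.zero_grad()
    loss = loss_fn(model(frames), frames)
    loss.backward()
    optimizer.step()

# At performance time, the 2-D latent code could drive visuals or sound.
latent = model.encoder(frames[:1])
print(latent)
```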

    Live Biofeedback as a User Interface Design Element: A Review of the Literature

    With the advances in sensor technology and real-time processing of neurophysiological data, a growing body of academic literature has begun to explore how live biofeedback can be integrated into information systems for everyday use. While researchers have traditionally studied live biofeedback in the clinical domain, the proliferation of affordable mobile sensor technology enables researchers and practitioners to consider live biofeedback as a user interface element in contexts such as decision support, education, and gaming. In order to establish the current state of research on live biofeedback, we conducted a literature review on studies that examine self and foreign live biofeedback based on neurophysiological data for healthy subjects in an information systems context. By integrating a body of highly fragmented work from computer science, engineering and technology, information systems, medical science, and psychology, this paper synthesizes results from existing research, identifies knowledge gaps, and suggests directions for future research. In this vein, this review can serve as a reference guide for researchers and practitioners on how to integrate self and foreign live biofeedback into information systems for everyday use

    Shared User Interfaces of Physiological Data: Systematic Review of Social Biofeedback Systems and Contexts in HCI

    As an emerging interaction paradigm, physiological computing is increasingly being used to both measure and feed back information about our internal psychophysiological states. While most applications of physiological computing are designed for individual use, recent research has explored how biofeedback can be socially shared between multiple users to augment human-human communication. Reflecting on the empirical progress in this area of study, this paper presents a systematic review of 64 studies to characterize the interaction contexts and effects of social biofeedback systems. Our findings highlight the importance of physio-temporal and social contextual factors surrounding physiological data sharing, as well as how it can promote social-emotional competences on three different levels: intrapersonal, interpersonal, and task-focused. We also present the Social Biofeedback Interactions framework to articulate the current physiological-social interaction space. We use this to frame our discussion of the implications and ethical considerations for future research and design of social biofeedback interfaces.
    Comment: [Accepted version, 32 pages] Clara Moge, Katherine Wang, and Youngjun Cho. 2022. Shared User Interfaces of Physiological Data: Systematic Review of Social Biofeedback Systems and Contexts in HCI. In CHI Conference on Human Factors in Computing Systems (CHI '22), ACM, https://doi.org/10.1145/3491102.351749

    In Forgotten Daydreams: Performing in Biosignal-Generated Visualizations

    This research project is a performance within an interactive installation that projects live human biological signals into narrative visualizations. It aims to cultivate the consciousness of participants, inviting them to daydream about the unlimited prospects for their bodies and the world. The objective of this master's thesis is to explore the realm of art and technology in mixed realities by combining craftsmanship with narrative visualizations, unpacking an immersive and interactive hybrid space for daydreaming and influencing everyday life experiences.

    Clinical Effects of Immersive Multimodal BCI-VR Training after Bilateral Neuromodulation with rTMS on Upper Limb Motor Recovery after Stroke. A Study Protocol for a Randomized Controlled Trial.

    Background and Objectives: The motor sequelae of stroke are frequently persistent and cause a high degree of disability. Cortical ischemic or hemorrhagic strokes affecting the corticospinal pathways are known to reduce cortical excitability in the lesioned area, not only because of the local connectivity impairment but also due to inhibitory action from the contralateral hemisphere. Non-invasive brain stimulation using high-frequency repetitive transcranial magnetic stimulation (rTMS) over the lesioned hemisphere, and inhibition of the contralateral cortex using low-frequency rTMS, have been shown to increase the excitability of the lesioned hemisphere. Mental representation techniques, neurofeedback, and virtual reality have also been shown to increase cortical excitability and to complement conventional rehabilitation. Materials and Methods: We aim to carry out a single-blind, randomized, controlled trial studying the efficacy of immersive multimodal Brain–Computer Interfacing-Virtual Reality (BCI-VR) training after bilateral neuromodulation with rTMS on upper limb motor recovery after subacute stroke (>3 months), compared to neuromodulation combined with conventional motor imagery tasks. The study will include 42 subjects in a randomized controlled trial design. The main expected outcomes are changes in the Motricity Index of the Arm (MI), dynamometry of the upper limb, the score on the Fugl-Meyer Assessment for the Upper Extremity (FMA-UE), and the Stroke Impact Scale (SIS). Evaluations will be carried out before the intervention, after each intervention, and 15 days after the last session. Conclusions: This trial will show the additive value of immersive VR motor imagery as an adjuvant therapy combined with a known effective neuromodulation approach, opening new perspectives for clinical rehabilitation protocols.

    Intersubjectivity and cooperation in synchronous computer-mediated interaction

    Interpersonal communication depends on a variety of signals that humans have evolved to observe and understand in face-to-face interaction. Direct social perception and primary intersubjectivity refer to the ability to process non-verbal signals about the mental states of other people quickly and automatically, as a way of, in a sense, co-experiencing them. Mediated communication is often necessary, and even preferred, for reasons such as distance, time, or the use of technological tools, but there is currently no match for physical presence when it comes to engaging the mechanisms of primary intersubjectivity for understanding emotion and intent. This thesis concerns the development of digital tools for improving collaboration in synchronous computer-mediated communication. While exploring multimodal technologies for communication, a lack of quantifiable measures for assessing the effects of prototypes on cooperation was identified. To this end, collaborative performance tasks were developed for a series of studies and used to investigate cooperation in computer-mediated interaction. In Study I, a joint coordination task in the form of a collaborative car racing game was used to study inter-brain synchronization without physical presence. The study revealed that EEG synchronization occurs in the online gaming context. EEG synchronization was connected to task performance both momentarily, with higher gamma synchrony occurring during better performance, and pairwise, with higher overall alpha synchrony among high-performing pairs. Study II details the development and testing of a collaborative block design task, created to assess pair performance in social virtual reality (VR). The task can also be replicated in face-to-face interaction, making it possible to compare effects across these two domains. Individual visuospatial intelligence was identified as a factor that needs to be controlled for when using the collaborative block design task. In Study III, a mouse-controlled visual guidance task was developed to study the application of heart rate sharing in online chat-based customer service. Combining quantitative and qualitative data, the study revealed that biosignal sharing might not be an optimal strategy for supporting cooperation and presence in customer service text chat. Taken together, the results suggest that both cooperative task performance and inter-brain synchronization can be meaningful measures in the development of synchronous computer-mediated communication. A final task, SynchroMouse, is presented as a possible activity for increasing inter-brain and interpersonal synchronization online.
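    The abstract does not specify which synchrony measure the EEG studies used; one widely used option for quantifying inter-brain synchronization between two channels is the phase-locking value, sketched below on synthetic data. The band limits and the signals are assumptions for illustration, not the thesis' actual analysis.

```python
# Illustrative sketch: phase-locking value (PLV) between two EEG channels as a
# simple inter-brain synchrony measure. Band limits and synthetic signals are
# assumptions; the studies summarised above may use different metrics.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert


def plv(x: np.ndarray, y: np.ndarray, fs: float, band=(8.0, 12.0)) -> float:
    """Phase-locking value of two signals within a frequency band (alpha by default)."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    phase_x = np.angle(hilbert(filtfilt(b, a, x)))
    phase_y = np.angle(hilbert(filtfilt(b, a, y)))
    return float(np.abs(np.mean(np.exp(1j * (phase_x - phase_y)))))


fs = 256.0
t = np.arange(0, 10, 1 / fs)
alpha = np.sin(2 * np.pi * 10 * t)                       # shared 10 Hz component
chan_a = alpha + 0.5 * np.random.randn(t.size)
chan_b = alpha + 0.5 * np.random.randn(t.size)
print(f"Alpha-band PLV: {plv(chan_a, chan_b, fs):.2f}")  # values near 1 = high synchrony
```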