
    Combining brain-computer interfaces and assistive technologies: state-of-the-art and challenges

    In recent years, new research has brought the field of EEG-based Brain-Computer Interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside the lab, we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely "Communication and Control", "Motor Substitution", "Entertainment", and "Motor Recovery". We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user-machine adaptation algorithms, the exploitation of users' mental states for BCI reliability and confidence measures, the incorporation of principles from human-computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology, including better EEG devices.
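    Two of the directions named above, hybrid BCI architectures and confidence measures for reliability, can be illustrated with a toy sketch: fusing the class posteriors of two component BCIs (e.g. one ERP-based, one SSVEP-based) and withholding a command when confidence is low. The function names, the product-of-experts fusion rule, and the threshold are illustrative assumptions, not the survey's method:

```python
import numpy as np

def fuse_posteriors(p_a, p_b, w=0.5):
    """Weighted product-of-experts fusion of the class posteriors of two
    component BCIs (e.g. one ERP-based, one SSVEP-based). Illustrative;
    many other fusion rules exist in the hybrid-BCI literature."""
    fused = np.asarray(p_a, float) ** w * np.asarray(p_b, float) ** (1 - w)
    return fused / fused.sum()

def decide(posterior, threshold=0.7):
    """Simple confidence gate: emit the winning class only when its fused
    posterior clears the threshold, otherwise withhold the command (None)."""
    c = int(np.argmax(posterior))
    return c if posterior[c] >= threshold else None
```

    Withholding uncertain commands trades throughput for reliability, which is exactly the kind of design decision confidence measures are meant to inform.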

    Navigation and interaction in a real-scale digital mock-up using natural language and user gesture

    This paper demonstrates a new real-scale 3D system and summarizes first-hand, cutting-edge results concerning multi-modal navigation and interaction interfaces. The work is part of the CALLISTO-SARI collaborative project, which aims at constructing an immersive room and developing a set of software tools and navigation/interaction interfaces. Two sets of interfaces are introduced here: 1) interaction devices, and 2) natural language (speech processing) combined with user gesture. A study of this system using subjective observation (Simulator Sickness Questionnaire, SSQ) and objective measurements (Center of Gravity, COG) shows that natural-language and gesture-based interfaces induced less cybersickness compared to device-based interfaces. Therefore, gesture-based interfaces are more efficient than device-based interfaces.

    Games and Brain-Computer Interfaces: The State of the Art

    BCI gaming is a very young field; most games are proofs of concept. Work comparing BCIs in game environments with traditional BCIs indicates no negative effect, or even a positive effect, of rich visual environments on performance. The low transfer rate of current BCIs poses a problem for game control. This is often solved by changing the goal of the game. Multi-modal input combined with BCI is a promising solution, as is assigning more meaningful functionality to BCI control.
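    The "low transfer rate" mentioned above is usually quantified with Wolpaw's information transfer rate (ITR), which depends only on the number of selectable classes, the selection accuracy, and the selection speed. A minimal sketch (the function name and the example figures are illustrative, not taken from the paper):

```python
import math

def wolpaw_itr(n_classes, accuracy, selections_per_min):
    """Wolpaw information transfer rate in bits/min for an n-class BCI
    with the given selection accuracy and selection speed."""
    n, p = n_classes, accuracy
    if p >= 1.0:
        bits = math.log2(n)              # perfect selection
    elif p <= 1.0 / n:
        bits = 0.0                       # at or below chance level
    else:
        bits = (math.log2(n) + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits * selections_per_min
```

    For example, a 2-class BCI at 80% accuracy making 10 selections per minute yields roughly 2.8 bits/min, which makes concrete why low transfer rates force designers to rethink game goals.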

    Prefrontal cortex activation upon a demanding virtual hand-controlled task: A new frontier for neuroergonomics

    Functional near-infrared spectroscopy (fNIRS) is a non-invasive, vascular-based functional neuroimaging technology that can assess, simultaneously from multiple cortical areas, concentration changes in oxygenated and deoxygenated hemoglobin at the level of the cortical microcirculation. With its high degree of ecological validity and very limited physical constraints on subjects, fNIRS could represent a valid tool for monitoring cortical responses in the field of neuroergonomics. In virtual reality (VR), real situations can be replicated with greater control than is obtainable in the real world; VR is therefore an ideal setting for neuroergonomics studies. The aim of the present study was to investigate, with a 20-channel fNIRS system, the dorsolateral/ventrolateral prefrontal cortex (DLPFC/VLPFC) in subjects performing a demanding VR hand-controlled task (HCT). Given its complexity, execution of the HCT should require the allocation of attentional resources and the integration of different executive functions. The HCT simulates interaction with a real, remotely driven system operating in a critical environment. Hand movements were captured by a high-spatial- and temporal-resolution 3-dimensional (3D) hand-sensing device, the LEAP motion controller, a gesture-based control interface that could be used in VR for tele-operated applications. Fifteen university students were asked to guide, with their right hand/forearm, a virtual ball (VB) over a virtual route (VROU) reproducing a 42 m narrow road including some critical points, trying to travel as far as possible without making the VB fall. The distance traveled by the guided VB was 70.2 ± 37.2 m. The less skilled subjects failed several times in guiding the VB over the VROU; nevertheless, a bilateral VLPFC activation in response to HCT execution was observed in all subjects. No correlation was found between the distance traveled by the guided VB and the corresponding cortical activation. These results confirm the suitability of fNIRS technology for objectively evaluating cortical hemodynamic changes occurring in VR environments. Future studies could contribute to a better understanding of the cognitive mechanisms underlying human performance in expert and non-expert operators during the simulation of different demanding/fatiguing activities.
    Carrieri, Marika; Petracca, Andrea; Lancia, Stefania; Basso Moro, Sara; Brigadoi, Sabrina; Spezialetti, Matteo; Ferrari, Marco; Placidi, Giuseppe; Quaresima, Valentina
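    The hemoglobin concentration changes that fNIRS reports are conventionally recovered from raw optical-density changes via the modified Beer-Lambert law. A minimal sketch of that inversion for one two-wavelength channel follows; the function name is hypothetical, and real extinction coefficients, source-detector separation, and differential pathlength factor must come from the instrument and the literature:

```python
import numpy as np

def mbll_concentration_changes(delta_od, ext_coeffs, distance_cm, dpf):
    """Modified Beer-Lambert law: recover [dHbO2, dHbR] from optical-density
    changes measured at two wavelengths (illustrative sketch).

    delta_od    : (2,) optical-density changes at the two wavelengths
    ext_coeffs  : (2, 2) extinction coefficients; rows = wavelengths,
                  columns = (HbO2, HbR)
    distance_cm : source-detector separation
    dpf         : differential pathlength factor
    """
    E = np.asarray(ext_coeffs, dtype=float)
    path = distance_cm * dpf  # effective optical path length
    # Solve the 2x2 linear system  delta_od = (E * path) @ [dHbO2, dHbR]
    return np.linalg.solve(E * path, np.asarray(delta_od, dtype=float))
```

    Two wavelengths on opposite sides of the hemoglobin isosbestic point are used precisely so that this 2x2 system is well conditioned.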

    Detecting Spatial Orientation Demands during Virtual Navigation using EEG Brain Sensing

    This study shows how brain sensing can offer insight into the evaluation of human spatial orientation in virtual reality (VR) and establishes a role for electroencephalography (EEG) in virtual navigation. Research suggests that the evaluation of spatial orientation in VR benefits from going beyond performance measures or questionnaires to measurements of the user's cognitive state. While EEG has emerged as a practical brain-sensing technology in cognitive research, spatial orientation tasks often rely on multiple factors (e.g., the reference frame used, the ability to update simulated rotation, and/or left-right confusion) which may be inaccessible to this measurement. EEG has nevertheless been shown to correlate with human spatial orientation in previous research. In this paper, we use a convolutional neural network (CNN) to train a detection model that can identify, in real time, moments in which VR users experience an increase in spatial orientation demands. Our results demonstrate that this machine learning technique can detect such states of increased spatial orientation demand in virtual reality research with 96% accuracy on average.
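    The core operation of a CNN-based detector of this kind is a learned temporal convolution over a multichannel EEG window, followed by a classification head. The sketch below shows that forward pass in plain NumPy with one convolutional layer, global average pooling, and a logistic output; the layer sizes, names, and weights are illustrative assumptions, not the paper's architecture:

```python
import numpy as np

def conv1d_eeg(x, kernels, bias):
    """One convolutional layer's forward pass: temporal convolution over a
    multichannel EEG window (valid padding) followed by a ReLU.

    x       : (n_channels, n_samples) EEG window
    kernels : (n_filters, n_channels, k) learned temporal filters
    bias    : (n_filters,)
    returns : (n_filters, n_samples - k + 1) feature maps
    """
    n_filters, _, k = kernels.shape
    n_out = x.shape[1] - k + 1
    out = np.empty((n_filters, n_out))
    for f in range(n_filters):
        for t in range(n_out):
            out[f, t] = np.sum(kernels[f] * x[:, t:t + k]) + bias[f]
    return np.maximum(out, 0.0)  # ReLU non-linearity

def classify_window(x, kernels, bias, w, b):
    """Global average pooling over time, then a logistic read-out giving
    P(high spatial-orientation demand) for the window."""
    feats = conv1d_eeg(x, kernels, bias).mean(axis=1)  # (n_filters,)
    return 1.0 / (1.0 + np.exp(-(feats @ w + b)))
```

    Sliding this classifier over consecutive EEG windows is what allows moment-by-moment, real-time detection rather than a single per-trial label.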

    Modelling human emotions using immersive virtual reality, physiological signals and behavioural responses

    In recent years the scientific community has significantly increased its use of virtual reality (VR) technologies in human behaviour research. In particular, the use of immersive VR has grown due to the introduction of affordable, high-performance head-mounted displays (HMDs). Among the fields that have strongly emerged in the last decade is affective computing, which combines psychophysiology, computer science, biomedical engineering and artificial intelligence in the development of systems that can automatically recognize emotions. The progress of affective computing is especially important in human behaviour research due to the central role that emotions play in many background processes, such as perception, decision-making, creativity, memory and social interaction. Several studies have tried to develop a reliable methodology to evoke and automatically identify emotional states using objective physiological measures and machine learning methods. However, the majority of previous studies used images, audio or video to elicit emotional states; to the best of our knowledge, no previous research has developed an emotion recognition system using immersive VR. Although some previous studies analysed physiological responses in immersive VR, they did not use machine learning techniques for biosignal processing and classification. Moreover, a crucial concept when using VR for human behaviour research is validity: the capacity to evoke a response from the user in a simulated environment similar to the response that might be evoked in a physical environment.
    Although some previous studies have used psychological and cognitive dimensions to compare responses in real and virtual environments, few have extended this research to analyse physiological or behavioural responses. Moreover, to our knowledge, this is the first study to compare VR scenarios with their real-world equivalents using physiological measures coupled with machine learning algorithms, and to analyse the ability of VR to transfer and extrapolate insights obtained from VR environments to real environments. The main objective of this thesis is, using psycho-physiological and behavioural responses in combination with machine learning methods, and by performing a direct comparison between a real and a virtual environment, to validate immersive VR as an emotion elicitation tool. To do so we developed an experimental protocol involving emotional 360º environments, an art exhibition in a real museum, and a highly realistic 3D virtualization of the same art exhibition.
    This thesis provides novel contributions to the use of immersive VR in human behaviour research, particularly in relation to emotions. VR can help in the application of methodologies designed to present more realistic stimuli in the assessment of daily-life environments and situations, thus overcoming the current limitations of affective elicitation, which classically uses images, audio and video. Moreover, the thesis analyses the validity of VR by performing a direct comparison using a highly realistic simulation. We believe that immersive VR will revolutionize laboratory-based emotion elicitation methods. Moreover, its synergy with physiological measurement and machine learning techniques will have a transverse impact on many other research areas, such as architecture, health, assessment, training, education, driving and marketing, and thus open new opportunities for the scientific community. The present dissertation aims to contribute to this progress.
    Marín Morales, J. (2020). Modelling human emotions using immersive virtual reality, physiological signals and behavioural responses [Unpublished doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/148717
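    The thesis's pipeline, physiological features fed to a machine learning classifier of emotional state, can be sketched in miniature. The features and the nearest-centroid classifier below are illustrative stand-ins; the actual work uses richer biosignal descriptors and stronger models:

```python
import numpy as np

def physio_features(hr_bpm, eda_us):
    """Toy trial-level features from heart-rate (bpm) and electrodermal
    activity (µS) traces; real pipelines use richer HRV/SCR descriptors."""
    return np.array([
        hr_bpm.mean(), hr_bpm.std(),                        # cardiac level, variability
        eda_us.mean(), np.diff(eda_us).clip(min=0).sum(),   # tonic level, phasic rises
    ])

class NearestCentroidEmotion:
    """Minimal classifier: assign each trial the label of the closest
    class centroid in feature space."""
    def fit(self, X, y):
        self.labels_ = np.unique(y)
        self.centroids_ = np.stack([X[y == c].mean(axis=0) for c in self.labels_])
        return self

    def predict(self, X):
        # Euclidean distance from every trial to every class centroid
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.labels_[d.argmin(axis=1)]
```

    Training such a model on trials recorded in the real museum and testing it on trials from the virtual replica (and vice versa) is one concrete way to operationalize the validity comparison the thesis describes.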

    Research in Neuroscience and Virtual Reality


    Assessing brain activations associated with emotional regulation during virtual reality mood induction procedures

    Emotional regulation strategies are used by people to influence their emotional responses to external or internal emotional stimuli. The aim of this study is to evaluate the brain activations associated with the application of two different emotional regulation strategies (cognitive reappraisal and expressive suppression) during virtual reality mood induction procedures. We used the Emotiv EPOC to measure the brain electrical activity of participants while sadness was induced using a virtual reality environment. We monitored 24 participants, distributed among three experimental groups: a control group, a cognitive reappraisal group and an expressive suppression group. In the control group, we found significant activations in several right frontal regions that are related to the induction of negative emotions. In the emotional regulation groups, we found significant activations in limbic, occipital, and parietal regions, which are related to the application of emotional regulation strategies. The results are consistent with those reported in the literature, which were obtained with clinical neuroimaging systems.
    The work of A. Rodriguez was funded by the Spanish MEC under FPI Grant BES-2011-043316. The work of Miriam Clemente was funded by the Generalitat Valenciana under a VALi+d Grant.
    Rodríguez Ortega, A.; Rey, B.; Clemente Bellido, M.; Wrzesien, M.; Alcañiz Raya, ML. (2015). Assessing brain activations associated with emotional regulation during virtual reality mood induction procedures. Expert Systems with Applications. 42(3):1699-1709. https://doi.org/10.1016/j.eswa.2014.10.006
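    Frontal EEG effects of the kind reported above are often quantified with frontal alpha asymmetry: the difference in log alpha-band power between homologous right and left frontal electrodes (e.g., F4 and F3), with alpha power read as inversely related to cortical activation. A minimal SciPy-free sketch follows; the function names are illustrative, and the paper does not state that it used this exact measure:

```python
import numpy as np

def bandpower(signal, fs, band=(8.0, 13.0)):
    """Band power from a plain FFT periodogram (alpha band by default)."""
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].sum()

def frontal_alpha_asymmetry(left_ch, right_ch, fs):
    """ln(right alpha power) - ln(left alpha power). Because alpha is read
    as inversely related to activation, more negative values are commonly
    interpreted as relatively greater right-frontal activation."""
    return np.log(bandpower(right_ch, fs)) - np.log(bandpower(left_ch, fs))
```

    In practice such scores are computed on artifact-cleaned epochs; a single raw window from a consumer headset like the EPOC would need heavy preprocessing first.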

    Navigation in Real-World Environments: New Opportunities Afforded by Advances in Mobile Brain Imaging

    A central question in neuroscience and psychology is how the mammalian brain represents the outside world and enables interaction with it. Significant progress on this question has been made in the domain of spatial cognition, where a consistent network of brain regions that represent external space has been identified in both humans and rodents. In rodents, much of the work to date has been done in situations where the animal is free to move about naturally. By contrast, the majority of work carried out to date in humans is static, due to limitations imposed by traditional laboratory-based imaging techniques. In recent years, significant progress has been made in bridging the gap between animal and human work by employing virtual reality (VR) technology to simulate aspects of real-world navigation. Despite this progress, VR studies often fail to fully simulate important aspects of real-world navigation, where information derived from self-motion is integrated with representations of environmental features and task goals. In the current review article, we provide a brief overview of animal and human imaging work to date, focusing on commonalities and differences in findings across species. Following on from this, we discuss VR studies of spatial cognition, outlining limitations and developments, before introducing mobile brain imaging techniques and describing technical challenges and solutions for real-world recording. Finally, we discuss how these advances in mobile brain imaging technology provide an unprecedented opportunity to illuminate how the brain represents complex, multifaceted information during naturalistic navigation.