
    Assessment of the influence of navigation control and screen size on the sense of presence in virtual reality using EEG

    In the virtual reality field, presence refers to the sense of being there in the virtual world. Our aim in this work is to evaluate the usefulness of the Emotiv EPOC EEG device for measuring brain activations related to the sense of presence in a virtual environment (VE), using the sLORETA tool for the analysis. We compare three experimental conditions: photographs, video, and free navigation through a VE. We also compare differences in the sense of presence due to visualization of the VE on different screens: a common desktop screen and a high-resolution power-wall screen. We monitored 20 healthy subjects and obtained significant differences between the navigation and video conditions in the activity of the right insula for the theta band. We also found higher activation of the insula for the alpha and theta bands while navigating when comparing the two screen types. Insula activation is related to stimulus attention and self-awareness processes, which are directly related to the sense of presence. This study was funded by the Vicerrectorado de Investigacion de la Universitat Politecnica de Valencia, Spain (PAID-06-2011, R.N. 1984); by the Ministerio de Educacion y Ciencia, Spain (Project Game Teen, TIN2010-20187); and partially by the Consolider-C project (SEJ2006-14301/PSIC), the "CIBER of Physiopathology of Obesity and Nutrition, an initiative of ISCIII", the Excellence Research Program PROMETEO (Generalitat Valenciana, Conselleria de Educacion, 2008-157), and the Consolider INGENIO program (CSD2007-00012). The work of Miriam Clemente was supported by the Generalitat Valenciana under a VALi+d Grant. The work of Alejandro Rodriguez was supported by the Spanish MEC under an FPI Grant (BES-2011-043316). Clemente Bellido, M.; Rodríguez Ortega, A.; Rey, B.; Alcañiz Raya, M. L. (2014). Assessment of the influence of navigation control and screen size on the sense of presence in virtual reality using EEG. Expert Systems with Applications, 41(4), 1584-1592. https://doi.org/10.1016/j.eswa.2013.08.055
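    At the statistics stage, the condition comparison described above reduces to a paired test of band power across subjects. The sketch below illustrates that step with a Wilcoxon signed-rank test on synthetic theta-band values; the numbers, variable names, and the choice of test are illustrative assumptions, and the study's own analysis was carried out at the source level with sLORETA.

```python
# A minimal sketch of a paired, non-parametric comparison of theta-band power
# between a navigation and a video condition across 20 subjects. The values
# are synthetic; the study's actual analysis was performed at the source
# level with sLORETA.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(42)
n_subjects = 20

# Hypothetical theta-band power per subject (arbitrary units), with the
# navigation condition shifted slightly upward to mimic a condition effect.
theta_video = rng.normal(loc=1.0, scale=0.2, size=n_subjects)
theta_navigation = theta_video + rng.normal(loc=0.15, scale=0.1, size=n_subjects)

stat, p_value = wilcoxon(theta_navigation, theta_video)
print(f"Wilcoxon statistic = {stat:.2f}, p = {p_value:.4f}")
```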

    Goal-recognition-based adaptive brain-computer interface for navigating immersive robotic systems

    © 2017 IOP Publishing Ltd. Objective. This work proposes principled strategies for self-adaptation in EEG-based brain-computer interfaces (BCIs) as a way out of the bandwidth bottleneck resulting from the considerable mismatch between the low-bandwidth interface and the bandwidth-hungry application, and as a way to enable fluent and intuitive interaction in embodiment systems. The main focus is on inferring the hidden target goals of users while navigating a remote environment, as a basis for possible adaptations. Approach. To reason about possible user goals, a general user-agnostic Bayesian update rule is devised and applied recursively upon the arrival of evidence, i.e., user input and user gaze. Experiments were conducted with healthy subjects in robotic embodiment settings to evaluate the proposed method. These experiments varied along three factors: the type of robot/environment (simulated or physical), the type of interface (keyboard or BCI), and the way goal recognition (GR) is used to guide a simple shared-control (SC) driving scheme. Main results. Our results show that the proposed GR algorithm is able to track and infer the hidden user goals with relatively high precision and recall. Further, the realized SC driving scheme benefits from the output of the GR system and is able to reduce the user effort needed to accomplish the assigned tasks. Although the BCI requires higher effort than the keyboard conditions, most subjects were able to complete the assigned tasks, and the proposed GR system is additionally shown to handle the uncertainty in user input during SSVEP-based interaction. The SC application of the belief vector indicates that the benefits of the GR module are more pronounced for BCIs than for the keyboard interface. Significance. Being based on intuitive heuristics that model the behavior of the general population during navigation tasks, the proposed GR method can be used without prior tuning for individual users. The proposed methods can easily be integrated when devising more advanced SC schemes and/or strategies for automatic BCI self-adaptation.
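    The heart of the approach is a recursive Bayesian update of a belief vector over candidate goals, applied each time new evidence (user input, gaze) arrives. The sketch below shows that generic update rule in Python; the goal names, likelihood values, and evidence encoding are hypothetical placeholders rather than the paper's user model.

```python
# A minimal sketch of a recursive Bayesian belief update over a discrete set
# of candidate goals. The goals, likelihoods, and evidence stream are
# illustrative assumptions, not the paper's user-agnostic model.
import numpy as np

def update_belief(belief, likelihoods):
    """One recursive update: posterior is proportional to likelihood * prior."""
    posterior = belief * likelihoods
    total = posterior.sum()
    if total == 0:                      # degenerate evidence: keep the prior
        return belief
    return posterior / total

# Three hypothetical target goals in the remote environment.
goals = ["door_left", "door_right", "corridor_end"]
belief = np.full(len(goals), 1.0 / len(goals))       # uniform prior

# Hypothetical evidence stream: P(observation | goal) for each goal at each
# step, e.g. derived from the issued steering command and the gaze direction.
evidence_stream = [
    np.array([0.6, 0.2, 0.2]),   # user steers/looks mostly toward door_left
    np.array([0.7, 0.1, 0.2]),
    np.array([0.5, 0.1, 0.4]),
]

for likelihoods in evidence_stream:
    belief = update_belief(belief, likelihoods)
    print(dict(zip(goals, np.round(belief, 3))))
```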

    Brain-Computer Interfaces, Virtual Reality, and Videogames

    Major challenges must be tackled for brain-computer interfaces to mature into an established communication medium for VR applications; these challenges range from basic neuroscience studies to the development of optimal peripherals, mental gamepads, and more efficient brain-signal processing techniques.

    Two photon interrogation of hippocampal subregions CA1 and CA3 during spatial behaviour

    The hippocampus is crucial for spatial navigation and episodic memory formation. Hippocampal place cells exhibit spatially selective activity within an environment and form the neural basis of a cognitive map of space that supports these mnemonic functions. Hebb's (1949) postulate regarding the creation of cell assemblies is seen as the pre-eminent model of learning in neural systems. Investigating changes to the hippocampal representation of space during an animal's exploration of its environment provides an opportunity to observe Hebbian learning at the population and single-cell level. When exploring new environments, animals form spatial memories that are updated with experience and retrieved upon re-exposure to the same environment, but how this is achieved by different subnetworks in hippocampal CA1 and CA3, and how these circuits encode distinct memories of similar objects and events, remains unclear. To test these ideas, we developed an experimental strategy and detailed protocols for simultaneously recording from CA1 and CA3 populations with two-photon (2P) imaging. We also developed a novel all-optical protocol to simultaneously activate and record from ensembles of CA3 neurons. We used these approaches to show that targeted activation of CA3 neurons results in an increasing excitatory amplification that is seen only in CA3 cells when other CA3 cells are stimulated, and not in CA1, perhaps reflecting the greater number of recurrent connections in CA3. To probe hippocampal spatial representations, we titrated input to the network by morphing VR environments during spatial navigation and assessed local CA3 as well as downstream CA1 responses. We found that CA1 and CA3 neural population responses behave nonlinearly, consistent with attractor dynamics associated with the two stored representations. We interpret our findings as supporting classic theories of Hebbian learning and as a first step toward uncovering the relationship between hippocampal neural circuit activity and the computations implemented by its dynamics. Establishing this relationship is paramount to demystifying the neural underpinnings of cognition.

    Brain Dynamics of Spatial Reference Frame Proclivity in Active Navigation.

    Recent research into navigation strategies based on different spatial reference frames (the self-centered egocentric reference frame and the environment-centered allocentric reference frame) has revealed that the parietal cortex plays an important role in processing allocentric information, providing a translation function between egocentric and allocentric spatial reference frames. However, most studies have focused on passive experimental environments, which are not truly representative of our daily spatial learning/navigation tasks. To bridge this gap, this study investigated the brain dynamics associated with people switching their preferred spatial strategy in both active and passive navigation. A virtual reality (VR) setup and an Omni treadmill were used so that participants physically walked during active navigation; for passive navigation, participants performed the same task while seated. Electroencephalography (EEG) signals were recorded to monitor spectral perturbations during transitions between egocentric and allocentric frames in a path integration task. Forty-one right-handed male participants from the authors' university participated in this study. Our brain dynamics results showed that navigation involved the parietal cortex, with modulation in the alpha band; the occipital cortex, with beta and low gamma band perturbations; and the frontal cortex, with theta perturbation. Differences were found between the two turning-angle paths in the alpha band of the parietal cluster event-related spectral perturbations (ERSPs). In small turning-angle paths, allocentric participants showed stronger alpha desynchronization than egocentric participants; in large turning-angle paths, the difference between the two reference frames in the alpha band was smaller. Behavioral results for homing errors corresponded to the brain dynamics results, indicating that larger-angle paths caused allocentric participants to have a higher tendency to become egocentric navigators in the active navigation environment.
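    The parietal alpha desynchronization reported above comes from event-related spectral perturbations (ERSPs): per-trial time-frequency power, expressed as a dB change from a pre-event baseline and averaged over trials. Below is a minimal Python sketch of that computation on synthetic single-channel data; the sampling rate, baseline window, and simulated alpha drop are assumptions for illustration, not the study's pipeline.

```python
# A minimal sketch of an ERSP computation for one channel/cluster: a per-trial
# spectrogram, converted to dB change relative to a pre-event baseline, then
# averaged over trials. The synthetic trials, sampling rate, and baseline
# window are illustrative assumptions.
import numpy as np
from scipy.signal import spectrogram

fs = 250                                # Hz, assumed sampling rate
n_trials, trial_len = 30, 3 * fs        # 3 s per trial; first 1 s is baseline
rng = np.random.default_rng(1)
t = np.arange(trial_len) / fs

# Synthetic data: alpha (10 Hz) power drops after the event at t = 1 s,
# mimicking the alpha desynchronization reported for the parietal cluster.
alpha_amp = np.where(t < 1.0, 1.5, 0.5)
trials = [alpha_amp * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, trial_len)
          for _ in range(n_trials)]

ersps = []
for x in trials:
    f, times, sxx = spectrogram(x, fs=fs, nperseg=fs // 2, noverlap=fs // 4)
    baseline = sxx[:, times < 1.0].mean(axis=1, keepdims=True)
    ersps.append(10 * np.log10(sxx / baseline))     # dB change from baseline
ersp = np.mean(ersps, axis=0)

alpha_band = (f >= 8) & (f <= 12)
print("mean alpha ERSP after the event (dB):",
      round(ersp[alpha_band][:, times >= 1.0].mean(), 2))
```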

    Navigation in Real-World Environments: New Opportunities Afforded by Advances in Mobile Brain Imaging

    A central question in neuroscience and psychology is how the mammalian brain represents the outside world and enables interaction with it. Significant progress on this question has been made in the domain of spatial cognition, where a consistent network of brain regions that represent external space has been identified in both humans and rodents. In rodents, much of the work to date has been done in situations where the animal is free to move about naturally. By contrast, the majority of work carried out to date in humans is static, due to limitations imposed by traditional laboratory-based imaging techniques. In recent years, significant progress has been made in bridging the gap between animal and human work by employing virtual reality (VR) technology to simulate aspects of real-world navigation. Despite this progress, VR studies often fail to fully simulate important aspects of real-world navigation, where information derived from self-motion is integrated with representations of environmental features and task goals. In the current review article, we provide a brief overview of animal and human imaging work to date, focusing on commonalities and differences in findings across species. Following on from this, we discuss VR studies of spatial cognition, outlining limitations and developments, before introducing mobile brain imaging techniques and describing technical challenges and solutions for real-world recording. Finally, we discuss how these advances in mobile brain imaging technology provide an unprecedented opportunity to illuminate how the brain represents complex multifaceted information during naturalistic navigation.

    Electrophysiological Signatures of Spatial Boundaries in the Human Subiculum.

    Environmental boundaries play a crucial role in spatial navigation and memory across a wide range of distantly related species. In rodents, boundary representations have been identified at the single-cell level in the subiculum and entorhinal cortex of the hippocampal formation. Although studies of hippocampal function and spatial behavior suggest that similar representations might exist in humans, boundary-related neural activity had not previously been identified electrophysiologically in humans. To address this gap in the literature, we analyzed intracranial recordings from the hippocampal formation of surgical epilepsy patients (of both sexes) while they performed a virtual spatial navigation task, and we compared the power in three frequency bands (1-4, 4-10, and 30-90 Hz) for target locations near and far from the environmental boundaries. Our results suggest that encoding locations near boundaries elicited stronger theta oscillations than encoding locations near the center of the environment, and that this difference cannot be explained by variables such as trial length, speed, movement, or performance. These findings provide direct evidence of boundary-dependent neural activity in humans, localized to the subiculum, the homolog of the hippocampal subregion in which most boundary cells are found in rodents, and indicate that this system can represent attended locations rather than the position of one's own body.
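    The core contrast described above is band power for epochs at near-boundary versus far-from-boundary target locations. The Python sketch below illustrates one common way to obtain such band power (band-pass filtering plus a Hilbert envelope) on synthetic single-channel data; the sampling rate, filter settings, and epoch labels are assumptions for illustration, not the study's actual pipeline.

```python
# A minimal sketch of a theta-band (here 4-10 Hz) power comparison between
# epochs labelled "near boundary" and "far from boundary". The filter choices
# and synthetic signals are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500                                   # Hz, assumed iEEG sampling rate
b, a = butter(4, [4, 10], btype="band", fs=fs)
rng = np.random.default_rng(7)

def theta_power(epoch):
    """Mean squared amplitude of the theta-band analytic signal."""
    filtered = filtfilt(b, a, epoch)
    return np.mean(np.abs(hilbert(filtered)) ** 2)

t = np.arange(2 * fs) / fs                 # 2 s epochs
def make_epoch(theta_amp):
    return theta_amp * np.sin(2 * np.pi * 6 * t) + rng.normal(0, 1, t.size)

near = [theta_power(make_epoch(1.2)) for _ in range(25)]  # near-boundary targets
far = [theta_power(make_epoch(0.8)) for _ in range(25)]   # center targets

print(f"mean theta power near = {np.mean(near):.2f}, far = {np.mean(far):.2f}")
```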

    A Neuronal Circuit Architecture for Angular Integration in Drosophila

    While navigating their environment, many animals keep track of their angular heading over time. However, a neuronal-circuit architecture for computing heading remains unknown in any species. In this thesis, I describe a set of neurons in the Drosophila central complex whose wiring and physiology provide a means to shift an angular heading estimate when the fly turns. I show that these clockwise- and counterclockwise-shifting neurons each exist in two subtypes, with spatiotemporal activity profiles that suggest opposing roles for each subtype at the start and end of a turn. Shifting neurons are required for the heading system to properly track the fly's heading in the dark, and their stimulation induces a shift in the heading signal in the expected direction. I also provide evidence that the angular position of visual landmarks is flexibly associated with the fly's internal heading estimate as it explores its environment. A specific circuit-level model based on known cell types is proposed to account for this flexible association. The central features of the biological circuits described here are analogous to computational models proposed for head-direction cells in rodents and may inform how neural systems, in general, perform angular calculations.
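    The clockwise- and counterclockwise-shifting populations described above can be caricatured as a shift mechanism acting on an activity bump that marks the current heading. Below is a toy Python sketch of that idea; the ring size, gain, and update rule are illustrative simplifications, not the circuit-level model proposed in the thesis.

```python
# A toy sketch of heading integration with clockwise/counterclockwise
# "shifting" populations acting on an activity bump over angular bins.
# Parameters and the update rule are illustrative simplifications.
import numpy as np

n = 32                                   # angular bins around the ring
heading = np.zeros(n)
heading[0] = 1.0                         # activity bump marks current heading

def shift_bump(bump, angular_velocity):
    """Mix the bump with copies shifted one bin left/right, gated by turn
    direction and magnitude, then renormalize."""
    cw = np.roll(bump, 1)                # clockwise-shifting population
    ccw = np.roll(bump, -1)              # counterclockwise-shifting population
    gain = np.clip(abs(angular_velocity), 0, 1)
    if angular_velocity > 0:
        bump = (1 - gain) * bump + gain * cw
    elif angular_velocity < 0:
        bump = (1 - gain) * bump + gain * ccw
    return bump / bump.sum()

# Simulate a rightward turn followed by a shorter leftward turn.
for _ in range(10):
    heading = shift_bump(heading, +0.5)
for _ in range(5):
    heading = shift_bump(heading, -0.5)
print("bump peak at bin", int(np.argmax(heading)))
```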