31 research outputs found

    Augmenting Immersive Telepresence Experience with a Virtual Body

    We propose augmenting immersive telepresence by adding a virtual body, representing the user's own arm motions, as realized through a head-mounted display and a 360-degree camera. Previous research has shown the effectiveness of having a virtual body in simulated environments; however, research on whether seeing one's own virtual arms increases presence or preference for the user in an immersive telepresence setup is limited. We conducted a study where a host introduced a research lab while participants wore a head-mounted display which allowed them to be telepresent at the host's physical location via a 360-degree camera, either with or without a virtual body. We first conducted a pilot study of 20 participants, followed by a pre-registered 62-participant confirmatory study. Whereas the pilot study showed greater presence and preference when the virtual body was present, the confirmatory study failed to replicate these results, with only behavioral measures suggesting an increase in presence. After analyzing the qualitative data and modeling interactions, we suspect that the quality and style of the virtual arms, and the contrast between animation and video, led to individual differences in reactions to the virtual body which subsequently moderated feelings of presence. Comment: Accepted for publication in IEEE Transactions on Visualization and Computer Graphics (TVCG), to be presented at IEEE VR 202

    Virtual Reality Sickness Reduces Attention During Immersive Experiences

    In this paper, we show that Virtual Reality (VR) sickness is associated with a reduction in attention, which was detected with the P3b Event-Related Potential (ERP) component from electroencephalography (EEG) measurements collected in a dual-task paradigm. We hypothesized that sickness symptoms such as nausea, eyestrain, and fatigue would reduce the users' capacity to pay attention to tasks completed in a virtual environment, and that this reduction in attention would be dynamically reflected in a decrease of the P3b amplitude while VR sickness was experienced. In a user study, participants were taken on a tour through a museum in VR along paths with varying amounts of rotation, shown previously to cause different levels of VR sickness. While paying attention to the virtual museum (the primary task), participants were asked to silently count tones of a different frequency (the secondary task). Control measurements for comparison against the VR sickness conditions were taken when the users were not wearing the Head-Mounted Display (HMD) and while they were immersed in VR but not moving through the environment. This exploratory study shows, across multiple analyses, that the mean amplitude of the P3b collected during the task is associated with both sickness severity measured after the task with a questionnaire (SSQ) and with the number of counting errors on the secondary task. Thus, VR sickness may impair attention and task performance, and these changes in attention can be tracked with ERP measures as they happen, without asking participants to assess their sickness symptoms in the moment.
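The dual-task measure described above can be sketched in code. This is illustrative only: the electrode choice (a single parietal channel such as Pz), the analysis window, and the array layout are assumptions, not the authors' exact pipeline.

```python
import numpy as np

def p3b_mean_amplitude(epochs, times, window=(0.3, 0.5)):
    """Mean amplitude of the P3b ERP component.

    Illustrative sketch: window bounds and electrode selection are
    assumptions, not the paper's published analysis parameters.

    epochs : array (n_trials, n_samples), EEG at one electrode (e.g. Pz)
    times  : array (n_samples,), time in seconds relative to tone onset
    """
    mask = (times >= window[0]) & (times <= window[1])
    erp = epochs.mean(axis=0)       # average over trials -> ERP waveform
    return float(erp[mask].mean())  # mean amplitude in the P3b window
```

In a design like the one above, this amplitude would be computed per condition and correlated with post-task SSQ scores and counting errors.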

    A multimodal cell census and atlas of the mammalian primary motor cortex

    Abstract We report the generation of a multimodal cell census and atlas of the mammalian primary motor cortex (MOp or M1) as the initial product of the BRAIN Initiative Cell Census Network (BICCN). This was achieved by coordinated large-scale analyses of single-cell transcriptomes, chromatin accessibility, DNA methylomes, spatially resolved single-cell transcriptomes, morphological and electrophysiological properties, and cellular resolution input-output mapping, integrated through cross-modal computational analysis. Together, our results advance the collective knowledge and understanding of brain cell type organization: First, our study reveals a unified molecular genetic landscape of cortical cell types that congruently integrates their transcriptome, open chromatin and DNA methylation maps. Second, cross-species analysis achieves a unified taxonomy of transcriptomic types and their hierarchical organization that are conserved from mouse to marmoset and human. Third, cross-modal analysis provides compelling evidence for the epigenomic, transcriptomic, and gene regulatory basis of neuronal phenotypes such as their physiological and anatomical properties, demonstrating the biological validity and genomic underpinning of neuron types and subtypes. Fourth, in situ single-cell transcriptomics provides a spatially resolved cell type atlas of the motor cortex. Fifth, integrated transcriptomic, epigenomic and anatomical analyses reveal the correspondence between neural circuits and transcriptomic cell types. We further present an extensive genetic toolset for targeting and fate mapping glutamatergic projection neuron types toward linking their developmental trajectory to their circuit function. Together, our results establish a unified and mechanistic framework of neuronal cell type organization that integrates multi-layered molecular genetic and spatial information with multi-faceted phenotypic properties.

    Does the Brain’s Sensitivity to Statistical Regularity Require Attention?

    Typical viewpoints of objects are better detected than atypical ones

    Abstract Previous work has claimed that canonical viewpoints of objects are more readily perceived than noncanonical viewpoints. However, all of these studies required participants to identify the object, a late perceptual process at best and arguably a cognitive process (Pylyshyn, 1999). Here, we extend this work to early vision by removing the explicit need to identify the objects. In particular, we asked participants to make an intact/scrambled discrimination of briefly presented objects that were viewed from either typical or atypical viewpoints. Notably, participants did not have to identify the object, only to discriminate it from noise (scrambled). Participants were more sensitive in discriminating objects presented in typically encountered orientations than when objects were presented in atypical depth rotations (Experiment 1). However, the same effect for objects presented in atypical picture plane rotations (as opposed to typical ones) did not reach statistical significance (Experiments 2 and 3), suggesting that particular informative views may play a critical role in this effect. We interpret this enhanced perceptibility, both for these items and for good exemplars and probable scenes, as deriving from their high real-world statistical regularity.
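The sensitivity comparison described above is conventionally quantified with signal-detection d′. A minimal sketch of the standard z(H) − z(FA) formula follows; this is a generic illustration, not the authors' own analysis code.

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Signal-detection sensitivity (d') for an intact/scrambled task.

    Standard formula: d' = z(hit rate) - z(false-alarm rate), where z is
    the inverse of the standard normal CDF. Rates of exactly 0 or 1 would
    need a correction (e.g. log-linear) before use; omitted here.
    """
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)
```

For example, a hit rate of 0.8 against a false-alarm rate of 0.2 yields a d′ of about 1.68; the study's claim is that this quantity was higher for typical than for atypical depth rotations.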

    From Virtual Reality to the Emerging Discipline of Perception Engineering

    This article makes the case that a powerful new discipline, which we term perception engineering, is steadily emerging. It follows from a progression of ideas that involve creating illusions, from historical paintings and film to modern video games and virtual reality. Rather than creating physical artifacts such as bridges, airplanes, or computers, perception engineers create illusory perceptual experiences. The scope is defined over any agent that interacts with the physical world, including both biological organisms (humans and animals) and engineered systems (robots and autonomous systems). The key idea is that an agent, called a producer, alters the environment with the intent to alter the perceptual experience of another agent, called a receiver. Most importantly, the article introduces a precise mathematical formulation of this process, based on the von Neumann–Morgenstern notion of information, to help scope and define the discipline. This formulation is then applied to the cases of engineered and biological agents, with discussion of its implications for existing fields such as virtual reality, robotics, and even social media. Finally, open challenges and opportunities for involvement are identified.

    The body scaling effect and its impact on physics plausibility

    Abstract In this study we investigated the effect of body ownership illusion-based body scaling on physics plausibility in Virtual Reality (VR). Our interest was in examining whether body ownership illusion-based body scaling could affect the plausibility of rigid body dynamics similarly to altering VR users' scale by manipulating their virtual interpupillary distance and viewpoint height. The procedure involved the conceptual replication of two previous studies. We investigated physics plausibility with 40 participants under two conditions. In our synchronous condition, we used visuo-tactile stimuli to elicit a body ownership illusion of inhabiting an invisible doll-sized body on participants reclining on an exam table. Our asynchronous condition was otherwise similar, but the visuo-tactile stimuli were provided asynchronously to prevent the onset of the body ownership illusion. We were interested in whether the correct approximation of physics (true physics) or incorrect physics that make the environment appear five times larger (movie physics) would seem more realistic to participants as a function of body scale. We found that movie physics did appear more realistic to participants under the body ownership illusion condition. However, our hypothesis that true physics would appear more realistic in the asynchronous condition was unsupported. Our exploratory analyses revealed that movie physics were perceived as plausible under both conditions. Moreover, we were not able to replicate previous findings from the literature concerning object size estimations while inhabiting a small invisible body. However, we found a significant opposite effect regarding size estimations; the object sizes were on average underestimated during the synchronous visuo-tactile condition when compared to the asynchronous condition. We discuss these unexpected findings and the potential reasons for the results, and suggest avenues for future research.
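The interpupillary-distance manipulation mentioned above rests on a simple geometric relation: shrinking the virtual inter-camera separation makes the world look proportionally larger. A hedged sketch follows; the function name is hypothetical and the reduction to a single ratio is a simplification that ignores viewpoint height and other depth cues.

```python
def perceived_scale(real_ipd_m, virtual_ipd_m):
    """Apparent environment scale from an IPD manipulation.

    Geometric intuition only: halving the virtual inter-camera distance
    roughly doubles the apparent size of the environment. Real percepts
    also depend on viewpoint height and other cues, ignored here.
    """
    return real_ipd_m / virtual_ipd_m
```

For a doll-sized body rendered with one fifth of a typical 64 mm IPD, this ratio predicts an environment that appears about five times larger, matching the scale factor that distinguishes "movie physics" from "true physics" in the study above.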
