Effects of Handling Real Objects and Self-Avatar Fidelity on Cognitive Task Performance and Sense of Presence in Virtual Environments
Immersive virtual environments (VEs) provide participants with computer-generated environments filled with virtual objects to assist in learning, training, and practicing dangerous and/or expensive tasks. But does having every object be virtual inhibit interactivity and effectiveness for certain tasks? Further, does the visual fidelity of the virtual objects affect performance? If participants spend most of their time and cognitive load on learning and adapting to a purely virtual interaction system, the overall effectiveness of the VE could be reduced. We conducted a study that investigated how handling real objects and self-avatar visual fidelity affect performance on a spatial cognitive manual task. We compared participants' performance of a block-arrangement task in a real-space environment and in several virtual and hybrid environments. The results showed that manipulating real objects in a VE brings task performance closer to that of real space than manipulating virtual objects does.
The Importance of Hand Motions for Communication and Interaction in Virtual Reality
Virtual reality (VR) is a growing medium for communication and play. Recent advances have brought hand-tracking to consumer VR headsets, allowing virtual hands to mimic a user's real hand movements in real time. A growing number of users now rely on hand-tracking in VR to manipulate objects or to gesture when interacting with others. As VR grows as a tool and communication platform, it is important to understand how the rising prevalence of hand-tracking technology might affect users' experiences.
The goal of this dissertation is to investigate, through a series of experiments, how using hand motions in VR influences our experience when we communicate with others or interact with the environment. In our daily lives, hand motions play a major role in interpersonal communication. Our hands can help emphasize or clarify our speech, or even supplement words entirely. When interacting with the world, hands are our primary tool for manipulating objects and performing dexterous tasks. Bringing these capabilities into VR, a space that has so far lacked such detailed expression and interaction, may have unexpected effects.
Overall, we show that using hand-tracking and hand motions in VR benefits many of the metrics used to measure the quality of experiences in virtual environments. When using accurate hand motions, people feel more comfortable and more embodied in their virtual avatars, and more socially present when communicating. We recommend tracking and displaying hand motions in virtual environments when embodiment or communication is an important criterion.
A Prototype that Fuses Virtual Reality, Robots, and Social Networks to Create a New Cyber–Physical–Social Eco-Society System for Cultural Heritage
With the rapid development of technology and the increasing use of social networks, many opportunities arise for the design and deployment of interconnected systems that could enable a paradigm shift in the ways we interact with cultural heritage. The project described in this paper aims to create a new type of conceptually led environment, a kind of Cyber–Physical–Social Eco-Society (CPSeS) system, that would seamlessly and interactively blend the real and virtual worlds using Virtual Reality, Robots, and Social Networking technologies, driven by humans' interactions and intentions. The project seeks to develop new methods of engaging the current generation of museum visitors, who are influenced by their exposure to modern technology such as social media, smartphones, the Internet of Things, smart devices, and video games, by providing a unique experience of exploring and interacting with real and virtual worlds simultaneously. The research envisions a system that connects visitors to events and/or objects separated in time, in space, or both, providing social meeting points between them. To demonstrate the attributes of the proposed system, a Virtual Museum scenario has been chosen. The following pages describe the RoboSHU: Virtual Museum prototype, its capabilities and features, and present a generic development framework that will also be applicable to other contexts and sociospatial domains.
MetaSpace II: Object and full-body tracking for interaction and navigation in social VR
MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple
users can not only see and hear but also interact with each other, grasp and
manipulate objects, walk around in space, and get tactile feedback. MS2 allows
walking in physical space by tracking each user's skeleton in real-time and
allows users to feel virtual objects by employing passive haptics, i.e., when users touch or
manipulate an object in the virtual world, they simultaneously also touch or
manipulate a corresponding object in the physical world. To enable these
elements in VR, MS2 creates a correspondence in spatial layout and object
placement by building the virtual world on top of a 3D scan of the real world.
Through the association between the real and virtual world, users are able to
walk freely while wearing a head-mounted device, avoid obstacles like walls and
furniture, and interact with people and objects. Most current virtual reality
(VR) environments are designed for a single user experience where interactions
with virtual objects are mediated by hand-held input devices or hand gestures.
Additionally, users are only shown a representation of their hands in VR,
floating in front of the camera as seen from a first-person perspective. We
believe that representing each user as a full-body avatar controlled by the
natural movements of the person in the real world (see Figure 1d) can greatly
enhance believability and a user's sense of immersion in VR.
Comment: 10 pages, 9 figures. Video:
http://living.media.mit.edu/projects/metaspace-ii
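The passive-haptics correspondence MS2 describes, anchoring each virtual object at the pose of its physical counterpart taken from the room scan, can be sketched roughly as follows. All object names and poses below are hypothetical illustrations, not data from the paper:

```python
import math

# Poses (x, y, z in metres) of physical objects, as would be extracted
# from a 3D scan of the room. Names and values are made up for illustration.
room_scan = {
    "table": (1.2, 0.0, 0.75),
    "cup": (1.3, 0.1, 0.80),
}

virtual_world = {}

def anchor(virtual_name: str, physical_name: str) -> None:
    """Place a virtual object exactly where its physical proxy stands,
    so touching the virtual object means touching the real one."""
    virtual_world[virtual_name] = room_scan[physical_name]

def touching(hand_pos, virtual_name, radius=0.05) -> bool:
    """True when the tracked hand is within `radius` of the object."""
    return math.dist(hand_pos, virtual_world[virtual_name]) <= radius

anchor("altar", "table")
anchor("chalice", "cup")
print(touching((1.31, 0.1, 0.80), "chalice"))  # True: the hand is 1 cm away
```

Because virtual and physical poses coincide by construction, any collision test against the virtual object doubles as a prediction of real tactile contact.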
Reverberation and its Binaural Reproduction: The Trade-off between Computational Efficiency and Perceived Quality
Accurately rendering reverberation is critical to producing realistic binaural audio, particularly in augmented-reality applications where virtual objects must blend seamlessly with real ones. However, rigorously simulating sound waves interacting with the auralised space can be computationally costly, sometimes to the point of being infeasible in real-time applications on resource-limited mobile platforms. Fortunately, knowledge of auditory perception can be leveraged to make computational savings without compromising quality. This chapter reviews approaches and methods for rendering binaural reverberation efficiently, focusing specifically on Ambisonics-based techniques aimed at reducing the spatial resolution of late reverberation components. Potential future research directions in this area are also discussed.
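The trade-off in the title can be illustrated with a back-of-envelope cost model (a sketch, not this chapter's method): a full 3D Ambisonics signal of order N carries (N+1)^2 channels, so rendering the long late-reverberation tail at a reduced order cuts the binaural convolution work proportionally.

```python
def ambisonic_channels(order: int) -> int:
    """Channel count of a full 3D Ambisonics signal of the given order."""
    return (order + 1) ** 2

def render_cost(order: int, filter_taps: int) -> int:
    """Toy cost model: every channel is convolved with a left and a
    right binaural filter of `filter_taps` taps each."""
    return 2 * ambisonic_channels(order) * filter_taps

# Late reverb is long (here a 1 s tail at 48 kHz) but perceptually
# tolerant of low spatial resolution, so render it at order 1 instead
# of the full order 3 used for the direct sound and early reflections:
full_order = render_cost(order=3, filter_taps=48000)
reduced = render_cost(order=1, filter_taps=48000)
print(reduced / full_order)  # 0.25: a quarter of the late-reverb cost
```

Real renderers refine this with frequency-dependent orders and cheaper late-field models, but the channel-count arithmetic is where the savings come from.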
A Dose of Reality: Overcoming Usability Challenges in VR Head-Mounted Displays
We identify usability challenges facing consumers adopting Virtual Reality (VR) head-mounted displays (HMDs) in a survey of 108 VR HMD users. Users reported significant issues in interacting with, and being aware of, their real-world context when using an HMD. Building upon existing work on blending real and virtual environments, we performed three design studies to address these usability concerns. In a typing study, we show that augmenting VR with a view of reality significantly corrected the performance impairment of typing in VR. We then investigated how much reality should be incorporated, and when, so as to preserve users' sense of presence in VR. For interaction with objects and peripherals, we found that selectively presenting reality as users engaged with it was optimal in terms of both performance and users' sense of presence. Finally, we investigated how this selective, engagement-dependent approach could be applied in social environments to support the user's awareness of the proximity and presence of others.
Substitutional Reality: Using the Physical Environment to Design Virtual Reality Experiences
Experiencing Virtual Reality in domestic and other uncontrolled settings is challenging due to the presence of physical objects and furniture that are not usually defined in the Virtual Environment. To address this challenge, we explore the concept of Substitutional Reality in the context of Virtual Reality: a class of Virtual Environments where every physical object surrounding a user is paired, with some degree of discrepancy, to a virtual counterpart. We present a model of potential substitutions and validate it in two user studies. In the first study, we investigated factors that affect participants' suspension of disbelief and ease of use. We systematically altered the virtual representation of a physical object and recorded responses from 20 participants. The second study investigated users' levels of engagement as the physical proxy for a virtual object varied. From the results, we derive a set of guidelines for the design of future Substitutional Reality experiences.
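The pairing described above can be sketched as a simple assignment problem. Everything below, the object names and the size-only discrepancy score, is a hypothetical illustration, not the paper's model, which also considers factors beyond size:

```python
from dataclasses import dataclass

@dataclass
class Obj:
    name: str
    width: float   # metres
    height: float
    depth: float

def discrepancy(phys: Obj, virt: Obj) -> float:
    """Sum of absolute size mismatches. A real system would also weigh
    shape, affordances, and material, not just bounding-box size."""
    return (abs(phys.width - virt.width)
            + abs(phys.height - virt.height)
            + abs(phys.depth - virt.depth))

def substitute(physical, catalogue):
    """Pair each physical object with the least-discrepant virtual proxy."""
    return {p.name: min(catalogue, key=lambda v: discrepancy(p, v)).name
            for p in physical}

room = [Obj("sofa", 2.0, 0.8, 0.9), Obj("mug", 0.1, 0.1, 0.1)]
virtual = [Obj("fallen_log", 2.1, 0.7, 0.8),
           Obj("potion_bottle", 0.1, 0.2, 0.1),
           Obj("boulder", 1.0, 1.0, 1.0)]
print(substitute(room, virtual))
# {'sofa': 'fallen_log', 'mug': 'potion_bottle'}
```

The study's finding that engagement varies with the degree of discrepancy suggests such a score could also be thresholded, rejecting substitutions too mismatched to sustain suspension of disbelief.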
Collaboration in Augmented Reality: How to establish coordination and joint attention?
Schnier C, Pitsch K, Dierker A, Hermann T. Collaboration in Augmented Reality: How to establish coordination and joint attention? In: Boedker S, Bouvin NO, Lutters W, Wulf V, Ciolfi L, eds. Proceedings of the 12th European Conference on Computer Supported Cooperative Work (ECSCW 2011). Springer-Verlag London; 2011: 405-416.
We present an initial investigation from a semi-experimental setting in which an HMD-based AR system has been used for real-time collaboration in a task-oriented scenario (the design of a museum exhibition). The analysis points out the specific conditions of interacting in an AR environment and focuses on one particular practical problem for the participants in coordinating their interaction: how to establish joint attention towards the same object or referent. The analysis gives insights into how the pair of users begins to familiarize themselves with the environment and with the limitations and opportunities of the setting, and how they establish new routines for, e.g., solving the 'joint attention' problem.
Formation of color-singlet gluon-clusters and inelastic diffractive scattering
This is the extensive follow-up report of a recent Letter in which the
existence of self-organized criticality (SOC) in systems of interacting soft
gluons is proposed, and its consequences for inelastic diffractive scattering
processes are discussed. It is pointed out that color-singlet gluon-clusters
can be formed in hadrons as a consequence of SOC in systems of interacting soft
gluons, and that the properties of such spatiotemporal complexities can be
probed experimentally by examining inelastic diffractive scattering. Theoretical
arguments and experimental evidence supporting the proposed picture are
presented, together with the results of a systematic analysis of the existing
data for inelastic diffractive scattering processes performed at different
incident energies, and/or by using different beam-particles. It is shown in
particular that the size- and the lifetime-distributions of such gluon-clusters
can be directly extracted from the data, and the obtained results exhibit
universal power-law behaviors, in accordance with the expected
SOC-fingerprints. As further consequences of SOC in systems of interacting soft
gluons, the $t$-dependence and the $M_x^2/s$-dependence of the double
differential cross-sections for inelastic diffractive scattering off a
proton target are discussed. Here $t$ stands for the four-momentum-transfer
squared, $M_x$ for the missing mass, and $\sqrt{s}$ for the total c.m.s.
energy. It is shown that the space-time properties of the color-singlet
gluon-clusters due to SOC, discussed above, lead to simple analytical formulae
for $d\sigma/dt$ and for $d^2\sigma/(dt\,d(M_x^2/s))$, and that the obtained
results are in good agreement with the existing data. Further experiments are
suggested.
Comment: 67 pages, including 11 figures
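The "power-law fingerprint" of SOC mentioned above is conventionally checked on a log-log plot, where a distribution P(S) proportional to S^(-alpha) appears as a straight line of slope -alpha. The sketch below (a generic illustration, not the paper's actual analysis) recovers the exponent by least-squares regression in log-log space:

```python
import math

def fit_power_law(sizes, counts):
    """Least-squares slope of log(counts) vs log(sizes); for a
    distribution P(S) ~ S^(-alpha), returns an estimate of alpha."""
    xs = [math.log(s) for s in sizes]
    ys = [math.log(c) for c in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# Synthetic data drawn exactly from P(S) = S^(-2):
sizes = [1, 2, 4, 8, 16]
counts = [s ** -2.0 for s in sizes]
print(fit_power_law(sizes, counts))  # ≈ 2.0
```

On real cluster-size or lifetime data, a straight log-log fit over a wide range of scales, rather than an exponential fall-off, is what qualifies as the SOC signature.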