    Refining personal and social presence in virtual meetings

    Virtual worlds show promise for conducting meetings and conferences without the need for physical travel. Current experience suggests that the major limitation to wider adoption and acceptance of virtual conferences is the failure of existing environments to provide a sense of immersion and engagement, or of ‘being there’. These limitations are largely related to the appearance and control of avatars, and to the absence of means to convey the non-verbal cues of facial expression and body language. This paper reports on a study involving the use of a mass-market motion sensor (Kinect™) and the mapping of participant action in the real world to avatar behaviour in the virtual world. This is coupled with full-motion video representation of participants’ faces on their avatars to resolve both identity and facial-expression issues. The outcomes of a small-group trial meeting based on this technology show a very positive reaction from participants, and the potential for further exploration of these concepts.
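At the core of such a system is a coordinate transform from sensor space to virtual-world space applied to each tracked skeleton frame. A minimal sketch, assuming a simple scale-and-offset calibration (all names and values here are hypothetical illustrations, not the paper's implementation):

```python
# Hypothetical calibration: metres per sensor unit and where the
# avatar stands in the virtual world (not values from the study).
SENSOR_TO_WORLD_SCALE = 1.0
WORLD_ORIGIN = (0.0, 0.0, 5.0)

def map_joint(sensor_pos, scale=SENSOR_TO_WORLD_SCALE, origin=WORLD_ORIGIN):
    """Transform one tracked joint from sensor space to virtual-world space."""
    return tuple(o + scale * p for o, p in zip(origin, sensor_pos))

def map_skeleton(frame):
    """frame: dict of joint name -> (x, y, z) in sensor coordinates."""
    return {name: map_joint(pos) for name, pos in frame.items()}

# One example skeleton frame, as a Kinect-style pipeline might stream it.
frame = {"head": (0.0, 1.7, 0.2), "hand_right": (0.4, 1.1, 0.5)}
avatar = map_skeleton(frame)
```

A real pipeline would run this per frame on the streamed skeleton and drive the avatar's bone positions from the result; the video texture for the face is a separate channel composited onto the avatar's head.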

    Developing knowledge for real world problem scenarios : using 3D gaming technology within a problem-based learning framework

    Problem-based learning is an instructional strategy that emphasises active and experiential learning through problem-solving activity. Using gaming technologies to embed this approach in a three-dimensional (3D) simulation environment provides users with a dynamic, responsive, visually engaging, and cost-effective learning experience. Representing real world problems in 3D simulation environments develops knowledge and skills that are applicable to their resolution. The Simulation, User, and Problem-based Learning (SUPL) Design Framework was developed to inform the design of learning environments that develop problem-solving knowledge for real world application. This framework identifies design factors relative to the user, the problem-solving task, and the 3D simulation environment which facilitate the transfer, development, and application of problem-solving knowledge. To assess the validity of the SUPL Design Framework, the Fires in Underground Mines Evacuation Simulator (FUMES) was developed to train mining personnel in emergency evacuation procedures at the Challenger gold mine in South Australia. Two groups of participants representing experienced and novice personnel were utilised to ascertain the effectiveness of FUMES as a training platform in this regard. Findings demonstrated that FUMES accurately represented emergency evacuation scenarios in the Challenger mine. Participants were able to utilise existing real world knowledge in FUMES to resolve emergency evacuation problem-solving tasks and develop new knowledge. The effectiveness of the SUPL Design Framework was also demonstrated, as was the need to design learning environments to meet the learning needs of users rather than merely as static simulations of real world problems. A series of generalisable design guidelines was also established from these findings that could be applied to the design of problem-based learning simulations in other training contexts.

    Haptic Interaction in 3D Stereoscopic User Interfaces

    Haptic feedback is an area of technology that utilizes the sense of touch by providing tactile interaction. It has been integrated into gaming consoles and mobile devices, and has been researched for its potential in programs that range from medical training simulations to collaborative workspaces. 3D stereo display is another growing facet of technology that is reexamining the possibilities of the user experience. The zSpace system is a computing hardware platform that simulates realistic, holographic, 3D stereoscopic vision. Using this system, this research project aimed to study how haptic feedback can enhance the user interface and understanding of 3D virtual space, by applying and exploring the effects of different types of haptic interaction in two zSpace applications. User experience in haptic and non-haptic versions of these programs was evaluated through a comparative analysis of various measures including observation, performance, presence, and workload.

    Designing to Support Workspace Awareness in Remote Collaboration using 2D Interactive Surfaces

    An increasingly distributed global workforce is leading to collaborative work among remote coworkers. The emergence of such remote collaboration is essentially supported by technology advancements in screen-based devices ranging from tablets and laptops to large displays. However, these devices, especially personal and mobile computers, still suffer from certain limitations caused by their form factors that hinder supporting workspace awareness through non-verbal communication such as bodily gestures or gaze. This thesis thus aims to design novel interfaces and interaction techniques to improve remote coworkers’ workspace awareness through such non-verbal cues using 2D interactive surfaces. The thesis starts off by exploring how visual cues support workspace awareness in facilitated brainstorming of hybrid teams of co-located and remote coworkers. Based on insights from this exploration, the thesis introduces three interfaces for mobile devices that help users maintain and convey their workspace awareness with their coworkers. The first interface is a virtual environment that allows a remote person to effectively maintain awareness of co-located collaborators’ activities while interacting with the shared workspace. To help a person better express hand gestures in remote collaboration using a mobile device, the second interface presents a lightweight add-on for capturing hand images on and above the device’s screen and overlaying them on collaborators’ devices to improve their workspace awareness. The third interface strategically leverages the entire screen space of a conventional laptop to better convey a remote person’s gaze to co-located collaborators. Building on top of these three interfaces, the thesis envisions an interface that supports a person using a mobile device to effectively collaborate with remote coworkers working with a large display. Together, these interfaces demonstrate the possibilities of innovating on commodity devices to offer richer non-verbal communication and better support workspace awareness in remote collaboration.

    Modulating the performance of VR navigation tasks using different methods of presenting visual information

    Spatial navigation is an essential ability in our daily lives that we use to move through different locations. In Virtual Reality (VR), the environments that users navigate may be large and similar to real world places. It is usually desirable to guide users in order to prevent them from getting lost and to make it easier for them to reach the goal or discover important spots in the environment. However, doing so in a way that the guidance is neither intrusive, breaking the immersion and sense of presence, nor too hard to notice, and therefore useless, can be a challenge. In this work we conducted an experiment in which we adapted a probabilistic learning paradigm, the Weather Prediction task, to spatial navigation in VR. Subjects navigated one of two versions of procedurally generated T-junction mazes in Virtual Reality. In one version, the environment contained visual cues in the form of street signs whose presence predicted the correct turning direction. In the other version the cues were present, but were not predictive. Results showed that when subjects navigated the mazes with the predictive cues they made fewer mistakes, and therefore the cues helped them navigate the environments. A comparison with previous Neuroscience literature revealed that the strategies used by subjects to solve the task differed from those in the original 2D experiment. This work is intended to be used as a basis to further improve spatial navigation in VR with more immersive and implicit methods, and as another example of how the Cognitive Neuroscience and Virtual Reality research fields can greatly benefit each other.
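The cue–outcome contingency behind such a design can be sketched as trial generation code. `P_PREDICTIVE` and all helper names below are assumptions for illustration, not the study's actual parameters: in the predictive condition a sign predicts the correct turn with some fixed validity, while in the control condition it carries no information.

```python
import random

# Assumed cue validity for illustration; the study's value may differ.
P_PREDICTIVE = 0.8

def other(direction):
    return "right" if direction == "left" else "left"

def make_trial(rng, predictive):
    """Return (cue_direction, correct_direction) for one T-junction."""
    cue = rng.choice(["left", "right"])
    # A non-predictive cue matches the correct turn only at chance level.
    p = P_PREDICTIVE if predictive else 0.5
    correct = cue if rng.random() < p else other(cue)
    return cue, correct

rng = random.Random(42)
trials = [make_trial(rng, predictive=True) for _ in range(1000)]
validity = sum(cue == correct for cue, correct in trials) / len(trials)
```

Over many trials the observed cue validity converges on `P_PREDICTIVE` in the predictive condition and on 0.5 in the control condition, which is what lets the two maze versions isolate the effect of the cues.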

    Immersive Visualization in Biomedical Computational Fluid Dynamics and Didactic Teaching and Learning

    Virtual reality (VR) can stimulate active learning, critical thinking, decision making and improved performance. It requires a medium to show virtual content, which is called a virtual environment (VE). The MARquette Visualization Lab (MARVL) is an example of a VE. Robust processes and workflows that allow for the creation of content for use within MARVL further increase the userbase for this valuable resource. A workflow was created to display biomedical computational fluid dynamics (CFD) and complementary data in a wide range of VEs. This allows a researcher to study the simulation in its natural three-dimensional (3D) morphology. In addition, it is an exciting way to extract more information from CFD results by taking advantage of improved depth cues, a larger display canvas, custom interactivity, and an immersive approach that surrounds the researcher. The CFD to VR workflow was designed to be basic enough for a novice user. It is also used as a tool to foster collaboration between engineers and clinicians. The workflow aimed to support results from common CFD software packages and across clinical research areas. ParaView, Blender and Unity were used in the workflow to take standard CFD files and process them for viewing in VR. Designated scripts were written to automate the steps implemented in each software package. The workflow was successfully completed across multiple biomedical vessels, scales and applications including the aorta with application to congenital cardiovascular disease, the Circle of Willis with respect to cerebral aneurysms, and the airway for surgical treatment planning. The workflow was completed by novice users in approximately an hour. Bringing VR further into didactic teaching within academia allows students to be fully immersed in their respective subject matter, thereby increasing the students’ sense of presence, understanding and enthusiasm. 
    MARVL is a space for collaborative learning that also offers an immersive, virtual experience. A workflow was created to view PowerPoint presentations in 3D using MARVL. The resulting Immersive PowerPoint workflow used PowerPoint, Unity and other open-source software packages to display the PowerPoint presentations in 3D. The Immersive PowerPoint workflow can be completed in under thirty minutes.
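One step a CFD-to-VR pipeline of this kind has to automate is getting a scalar field (wall shear stress, say) from the simulation output onto mesh geometry that Blender or Unity can import. The following is a simplified stand-in, not the authors' actual scripts: it bakes a per-vertex colour ramp into an OBJ file, using the common (non-standard) extension of appending RGB values to `v` lines, which Blender's OBJ importer understands.

```python
def scalar_to_rgb(value, vmin, vmax):
    """Map a scalar into a simple blue-to-red colour ramp."""
    t = (value - vmin) / (vmax - vmin) if vmax > vmin else 0.0
    return (t, 0.0, 1.0 - t)

def write_colored_obj(path, vertices, faces, scalars):
    """vertices: [(x, y, z)]; faces: [(i, j, k)] 0-based; scalars: one per vertex."""
    vmin, vmax = min(scalars), max(scalars)
    with open(path, "w") as f:
        for (x, y, z), s in zip(vertices, scalars):
            r, g, b = scalar_to_rgb(s, vmin, vmax)
            # 'v x y z r g b' -- vertex-colour extension to the OBJ format
            f.write(f"v {x} {y} {z} {r:.3f} {g:.3f} {b:.3f}\n")
        for i, j, k in faces:
            f.write(f"f {i + 1} {j + 1} {k + 1}\n")  # OBJ indices are 1-based

# A single triangular patch with a scalar gradient across its vertices.
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
write_colored_obj("patch.obj", verts, [(0, 1, 2)], scalars=[0.0, 0.5, 1.0])
```

In the actual workflow this kind of conversion would be driven from ParaView's Python scripting on real CFD datasets; the sketch only illustrates the file-format end of the hand-off to Blender and Unity.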

    UNBODY: A Poetry Escape Room in Augmented Reality

    The integration of augmented reality (AR) technology into personal computing is happening fast, and augmented workplaces for professionals in areas such as Industry 4.0 or digital health can reasonably be expected to form liminal zones that push the boundary of what is currently possible. The application potential in the creative industries, however, is vast and can target broad audiences, so with UNBODY, we set out to push boundaries of a different kind and depart from the graphic-centric worlds of AR to explore textual and aural dimensions of an extended reality, in which words haunt and re-create our physical selves. UNBODY is an AR installation for smart glasses that embeds poetry in the user’s surroundings. The augmented experience turns reality into a medium where holographic texts and film clips spill from dayglow billboards and totems. In this paper, we develop a blueprint for an AR escape room dedicated to the spoken and written word, with its open source code facilitating uptake by others into existing or new AR escape rooms. We outline the user-centered process of designing, building, and evaluating UNBODY. More specifically, we deployed a system usability scale (SUS) and a spatial interaction evaluation (SPINE) in order to validate its wider applicability. In this paper, we also describe the composition and concept of the experience, identifying several components (trigger posters, posters with video overlay, word dropper totem, floating object gallery, and a user trail visualization) as part of our first version before evaluation. UNBODY provides a sense of situational awareness and immersivity from inside an escape room. The recorded average mean for the SUS was 59.7, slightly under the recommended 68 average but still above ‘OK’, in the ‘low marginal’ acceptability range. The findings for the SPINE were moderately positive, with the highest scores for output modalities and navigation support. 
    This indicated that the proposed components and escape room concept work. Based on these results, we improved the experience, adding, among other things, an interactive word composer component. We conclude that a poetry escape room is possible, outline our co-creation process, and deliver an open source technical framework as a blueprint for adding enhanced support for the spoken and written word to existing or coming AR escape room experiences. In an outlook, we discuss additional insight on timing, alignment, and the right level of personalization.
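For reference, a SUS figure such as the 59.7 reported above comes from the standard scoring rule: ten 1–5 Likert items, where odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is scaled by 2.5 onto a 0–100 range. A minimal implementation:

```python
def sus_score(responses):
    """Score one SUS questionnaire: ten Likert ratings (1-5), item 1 first."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly ten items")
    # 0-based index: even index = odd-numbered item, and vice versa.
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

example = sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2])  # -> 75.0
```

A study-level SUS value like 59.7 is then the mean of these per-participant scores, compared against the commonly cited benchmark average of 68.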