98,097 research outputs found
[DC] Self-Adaptive Technologies for Immersive Trainings
Online learning is the preferred option for professional training, e.g. in Industry 4.0 or e-health, because it is more cost-efficient than organising realistic training sessions on site. However, current online learning technologies are limited in terms of the personalisation, interactivity and immersiveness required by applications such as surgery and pilot training. Virtual Reality (VR) technologies have the potential to overcome these limitations. However, because VR research is still at an early stage, significant improvements are required to fully unlock its potential. The focus of this PhD is to tackle research challenges that enable VR for online training along three dimensions: (1) dynamic adaptation of the training content for personalised training, by incorporating prior knowledge and context data into self-learning algorithms; (2) mapping of sensor data onto what happens in the VR environment, by focusing on motion prediction techniques that use the past movements of users; and (3) investigation of immersive environments with intuitive interactions, by gaining a better understanding of human motion in order to improve interaction. The designed improvements will be characterised through a prototype VR training platform for multiple use cases. This work will advance the state of the art not only in VR training, but also in online e-learning applications in general.
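As an illustration of the motion-prediction direction in point (2), the sketch below extrapolates the next tracked position from the two most recent samples under a constant-velocity assumption. It is a minimal, hypothetical example rather than the method developed in the thesis; the class name, sampling rate and pose representation are all assumptions.

    from collections import deque

    class ConstantVelocityPredictor:
        """Illustrative pose predictor: extrapolates the next 3D position
        from the two most recent tracked samples (constant-velocity model)."""

        def __init__(self, history=2):
            self.samples = deque(maxlen=history)  # (timestamp, (x, y, z))

        def observe(self, t, position):
            self.samples.append((t, position))

        def predict(self, t_future):
            if len(self.samples) < 2:
                return self.samples[-1][1] if self.samples else None
            (t0, p0), (t1, p1) = self.samples[-2], self.samples[-1]
            dt = t1 - t0
            if dt <= 0:
                return p1
            velocity = tuple((b - a) / dt for a, b in zip(p0, p1))
            lead = t_future - t1
            return tuple(p + v * lead for p, v in zip(p1, velocity))

    # Example: 90 Hz tracking samples, predicting one frame (~11 ms) ahead.
    pred = ConstantVelocityPredictor()
    pred.observe(0.000, (0.00, 1.60, 0.00))
    pred.observe(0.011, (0.01, 1.60, 0.00))
    print(pred.predict(0.022))  # -> roughly (0.02, 1.60, 0.00)

In practice such a predictor would feed rotation as well as position and be combined with filtering, but the same observe/predict structure applies.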
Immersion on the Edge: A Cooperative Framework for Mobile Immersive Computing
Immersive computing (IC) technologies such as virtual reality and augmented reality are gaining tremendous popularity. In this poster, we present CoIC, a Cooperative framework for mobile Immersive Computing. The design of CoIC is based on a key insight that IC tasks among different applications or users might be similar or redundant. CoIC enhances the performance of mobile IC applications by caching and sharing computation-intensive IC results on the edge. Our preliminary evaluation results on an AR application show that CoIC can reduce the recognition and rendering latency by up to 52.28% and 75.86% respectively on current mobile devices.
Comment: This poster has been accepted by the SIGCOMM in June 201
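The caching-and-sharing insight can be pictured with a minimal sketch: an edge node keys computation-intensive results (here, recognition outputs) by a fingerprint of the input, so that similar or redundant requests from different users are served from the cache. This is illustrative only and not the CoIC implementation; the content-hash fingerprint and all names below are assumptions (a real system would likely match near-duplicate frames by feature similarity rather than exact hashes).

    import hashlib

    class EdgeResultCache:
        """Illustrative edge-side cache: stores expensive immersive-computing
        results (e.g. recognition labels) keyed by a fingerprint of the input."""

        def __init__(self):
            self.store = {}

        @staticmethod
        def fingerprint(frame_bytes):
            # Hypothetical: a content hash stands in for a perceptual/feature
            # similarity key that would match near-duplicate frames.
            return hashlib.sha256(frame_bytes).hexdigest()

        def get_or_compute(self, frame_bytes, compute_fn):
            key = self.fingerprint(frame_bytes)
            if key in self.store:             # redundant request: reuse shared result
                return self.store[key], True
            result = compute_fn(frame_bytes)  # cache miss: run the expensive task
            self.store[key] = result
            return result, False

    # Usage: two users submitting the same frame share one recognition pass.
    cache = EdgeResultCache()
    recognise = lambda frame: {"label": "chair"}        # stand-in for a heavy model
    print(cache.get_or_compute(b"frame-0", recognise))  # ({'label': 'chair'}, False)
    print(cache.get_or_compute(b"frame-0", recognise))  # ({'label': 'chair'}, True)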
Flexible virtual environments: Gamifying immersive learning
The availability of Virtual Reality (VR) and Virtual Environment (VE) equipment, with the launch of domestic technologies such as the Oculus Rift, Microsoft HoloLens and Sony PlayStation VR, offers new ways to enable interactive immersive experiences [16]. The opportunities these create in learning and training applications are immense, but they also create new challenges. Meanwhile, current virtual learning environments are typically web- or app-based technologies, sometimes perceived as adding little value from a user perspective beyond improved user interfaces for accessing content [6]. The challenge is how the human-computer interaction features of such VE platforms may be used in education in a way that adds value, especially for computer-mediated instruction. This paper outlines some of the issues and opportunities, as well as some of the open questions about how such technologies can be used effectively in a higher-education context, along with a proposed framework for embedding a learning engine within a virtual reality or virtual environment system. Three-dimensional technologies, from work-walls through CAVEs to the latest headsets, offer new ways to immerse users in computer-generated environments. Immersive learning [1] is increasingly common in training applications, and is beginning to make inroads into formal education. The recent rise in such off-the-shelf technologies means that Augmented Learning becomes a realistic mainstream tool [13]. Much of this use is built in game environments using game engines, where these serious games provide learning effects as an intended consequence of playing.
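As a rough structural sketch of what "embedding a learning engine within a virtual reality or environment system" could look like, the code below separates the pedagogical logic from the environment that renders scenes and reports interaction events. The interface, class names and the toy sequential engine are hypothetical illustrations, not the framework proposed in the paper.

    from abc import ABC, abstractmethod

    class LearningEngine(ABC):
        """Hypothetical interface: the pedagogical logic the VE delegates to."""

        @abstractmethod
        def next_activity(self, learner_state):
            """Choose the next in-world task given what the learner has done."""

        @abstractmethod
        def record_outcome(self, learner_state, activity, result):
            """Update the learner model from an observed in-world result."""

    class VirtualEnvironment:
        """The VE renders scenes and forwards interaction outcomes to the engine."""

        def __init__(self, engine: LearningEngine):
            self.engine = engine

        def on_task_completed(self, learner_state, activity, result):
            self.engine.record_outcome(learner_state, activity, result)
            return self.engine.next_activity(learner_state)

    class SequentialEngine(LearningEngine):
        """Toy engine: walks a fixed activity list, repeating steps not yet passed."""

        def __init__(self, activities):
            self.activities = list(activities)

        def next_activity(self, learner_state):
            done = learner_state.get("completed", set())
            for activity in self.activities:
                if activity not in done:
                    return activity
            return None

        def record_outcome(self, learner_state, activity, result):
            if result == "passed":
                learner_state.setdefault("completed", set()).add(activity)

    ve = VirtualEnvironment(SequentialEngine(["locate tool", "assemble part"]))
    state = {}
    print(ve.on_task_completed(state, "locate tool", "passed"))  # -> "assemble part"

The point of the separation is that the VR front end can be swapped (headset, CAVE, desktop) while the same learning logic drives activity selection.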
Exploring the Use of Virtual Worlds as a Scientific Research Platform: The Meta-Institute for Computational Astrophysics (MICA)
We describe the Meta-Institute for Computational Astrophysics (MICA), the first professional scientific organization based exclusively in virtual worlds (VWs). The goals of MICA are to explore the utility of the emerging VR and VW technologies for scientific and scholarly work in general, and to facilitate and accelerate their adoption by the scientific research community. MICA itself is an experiment in academic and scientific practices enabled by the immersive VR technologies. We describe the current and planned activities and research directions of MICA, and offer some thoughts as to what the future developments in this arena may be.
Comment: 15 pages, to appear in the refereed proceedings of "Facets of Virtual Environments" (FaVE 2009), eds. F. Lehmann-Grube, J. Sablatnig, et al., ICST Lecture Notes Ser., Berlin: Springer Verlag (2009); a version with full-resolution color figures is available at http://www.mica-vw.org/wiki/index.php/Publication
Exploring the Design Space of Immersive Urban Analytics
Recent years have witnessed the rapid development and wide adoption of immersive head-mounted devices, such as the HTC VIVE, Oculus Rift, and Microsoft HoloLens. These immersive devices have the potential to significantly extend the methodology of urban visual analytics by providing critical 3D context information and creating a sense of presence. In this paper, we propose a theoretical model to characterize the visualizations in immersive urban analytics. Furthermore, based on our comprehensive and concise model, we contribute a typology of combination methods for 2D and 3D visualizations that distinguishes between linked views, embedded views, and mixed views. We also propose a supporting guideline to assist users in selecting a proper view under certain circumstances by considering the visual geometry and spatial distribution of the 2D and 3D visualizations. Finally, based on existing work, possible future research opportunities are explored and discussed.
Comment: 23 pages, 11 figures
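The typology of 2D/3D combination methods lends itself to a small data model; the sketch below encodes the three view categories and a placeholder selection rule. The two boolean inputs stand in for the visual-geometry and spatial-distribution considerations mentioned in the abstract; they are assumptions for illustration, not the authors' actual guideline.

    from enum import Enum, auto

    class ViewCombination(Enum):
        """Typology from the abstract: ways of combining 2D and 3D visualisations."""
        LINKED = auto()    # 2D and 3D shown side by side, coordinated by interaction
        EMBEDDED = auto()  # 2D visualisation placed inside the 3D scene
        MIXED = auto()     # 2D and 3D elements blended into one composite view

    def suggest_combination(same_spatial_frame: bool, occlusion_sensitive: bool):
        """Placeholder guideline: the paper bases the choice on visual geometry
        and spatial distribution; these two booleans are simplified stand-ins."""
        if not same_spatial_frame:
            return ViewCombination.LINKED    # unrelated frames: coordinate separate views
        if occlusion_sensitive:
            return ViewCombination.EMBEDDED  # keep 2D content on a legible plane in-scene
        return ViewCombination.MIXED         # blend freely within the 3D context

    print(suggest_combination(same_spatial_frame=True, occlusion_sensitive=False))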
Visualising mixed reality simulation for multiple users
Cowling, MA (ORCiD: 0000-0003-1444-1563)
Blended reality seeks to encourage co-presence in the classroom, blending the student experience across virtual and physical worlds. In a similar way, Mixed Reality, a continuum between virtual and real environments, now allows learners to work in both the physical and the digital world simultaneously, especially when combined with an immersive headset experience. This provides innovative new experiences for learning, but faces the challenge that most of these experiences are single-user, leaving others outside the new environment. The question therefore becomes: how can a mixed reality simulation be experienced by multiple users, and how can we present that simulation effectively to users to create a true blended reality environment? This paper proposes a study that uses existing screen production research into the user and spectator to produce a mixed reality simulation suitable for multiple users. A research method using Design-Based Research is also presented to assess the usability of the approach.
