18,845 research outputs found
PerfVis: Pervasive Visualization in Immersive Augmented Reality for Performance Awareness
Developers are usually unaware of the impact of code changes on the
performance of software systems. Although developers can analyze the
performance of a system by executing, for instance, a performance test to
compare the performance of two consecutive versions of the system, changing
from a programming task to a testing task would disrupt the development flow.
In this paper, we propose the use of a city visualization that dynamically
provides developers with a pervasive view of the continuous performance of a
system. We use an immersive augmented reality device (Microsoft HoloLens) to
display our visualization and extend the integrated development environment on
a computer screen to use the physical space. We report on technical details of
the design and implementation of our visualization tool, and discuss early
feedback that we collected on its usability. Our investigation explores a new
visual metaphor to support the exploration and analysis of possibly very large
and multidimensional performance data. Our initial results indicate that the
city metaphor is adequate for analyzing dynamic performance data on a large
and non-trivial software system.
Comment: ICPE'19 vision paper, 4 pages, 2 figures, conference
Reflections on the use of Project Wonderland as a mixed-reality environment for teaching and learning
This paper reflects on the lessons learnt from MiRTLE, a collaborative research project to create a "mixed reality teaching and learning environment" that enables teachers and students participating in real-time mixed and online classes to interact with avatar representations of each other. The key hypothesis of the project is that avatar representations of teachers and students can help create a sense of shared presence, engendering a greater sense of community and improving student engagement in online lessons. This paper explores the technology that underpins such environments by presenting work on the use of a massively multi-user game server, based on Sun's Project Darkstar and Project Wonderland tools, to create a shared teaching environment, illustrating the process by describing the creation of a virtual classroom. It is planned that the MiRTLE platform will be used in several trial applications, which are described in the paper. These example applications are then used to explore some of the research issues arising from the use of virtual environments within an education environment. The research discussion initially focuses on the plans to assess this within the MiRTLE project. This includes some of the issues of designing virtual environments for teaching and learning, and how supporting pedagogical and social theories can inform this process.
Remote Real-Time Collaboration Platform enabled by the Capture, Digitisation and Transfer of Human-Workpiece Interactions
In today's highly globalised manufacturing ecosystem, product design and verification activities, production and inspection processes, and technical support services are spread across global supply chains and customer networks. A platform that lets global teams collaborate with each other in real time on complex tasks is therefore highly desirable. This work investigates the design and development of a remote real-time collaboration platform using human motion capture technology powered by infrared-based depth imaging sensors borrowed from the gaming industry. The unique functionality of the proposed platform is the sharing of physical context during a collaboration session: it exchanges not only human actions but also the effects of those actions on the task environment. This enables teams to work remotely on a common task at the same time and to receive immediate feedback from each other, which is vital for collaborative design, inspection and verification tasks in the factories of the future.
Human-display interaction technology: Emerging remote interfaces for pervasive display environments
This is the author's accepted manuscript. The final published article is available from the link below. Copyright © 2010 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.
We're living in a world where information processing isn't confined to desktop computers: it's being integrated into everyday objects and activities. Pervasive computation is human-centered: it permeates our physical world, helping us achieve goals and fulfill our needs with minimum effort by exploiting natural interaction styles. Remote interaction with screen displays requires a sensor-based, multimodal, touchless approach. For example, by processing user hand gestures, this paradigm removes constraints requiring physical contact and permits natural interaction with tangible digital information. Such touchless interaction can be multimodal, exploiting the visual, auditory, and olfactory senses.
Ministerio de Educación y Ciencia and Amper Sistemas, SA