A framework for realistic 3D tele-immersion
Meeting, socializing and conversing online with a group of people using teleconferencing systems is still quite different from the experience of meeting face to face. We are abruptly aware that we are online and that the people we are engaging with are not in close proximity, analogous to how talking on the telephone does not replicate the experience of talking in person. Several causes for these differences have been identified, and we propose inspiring and innovative solutions to these hurdles in an attempt to provide a more realistic, believable and engaging online conversational experience. We present the distributed and scalable REVERIE framework, which provides a balanced mix of these solutions. Applications built on top of the REVERIE framework will be able to provide interactive, immersive, photo-realistic experiences to a multitude of users that will feel much closer to meeting face to face than the experience offered by conventional teleconferencing systems.
An Advanced, Three-Dimensional Plotting Library for Astronomy
We present a new, three-dimensional (3D) plotting library with advanced
features, and support for standard and enhanced display devices. The library -
S2PLOT - is written in C and can be used by C, C++ and FORTRAN programs on
GNU/Linux and Apple/OSX systems. S2PLOT draws objects in a 3D (x,y,z) Cartesian
space and the user interactively controls how this space is rendered at run
time. With a PGPLOT inspired interface, S2PLOT provides astronomers with
elegant techniques for displaying and exploring 3D data sets directly from
their program code, and the potential to use stereoscopic and dome display
devices. The S2PLOT architecture supports dynamic geometry and can be used to
plot time-evolving data sets, such as might be produced by simulation codes. In
this paper, we introduce S2PLOT to the astronomical community, describe its
potential applications, and present some example uses of the library.

Comment: 12 pages, 10 eps figures (higher resolution versions available from
http://astronomy.swin.edu.au/s2plot/paperfigures). The S2PLOT library is
available for download from http://astronomy.swin.edu.au/s2plo
08231 Abstracts Collection -- Virtual Realities
From 1st to 6th June 2008, the Dagstuhl Seminar 08231 ``Virtual Realities'' was held in the International Conference and Research Center (IBFI),
Schloss Dagstuhl.
Virtual Reality (VR) is a multidisciplinary area of research aimed at
interactive human-computer mediated simulations of artificial environments.
Typical applications include simulation, training, scientific visualization,
and entertainment. An important aspect of VR-based systems is the
stimulation of the human senses -- typically sight, sound, and touch -- such that a user feels a sense of presence (or immersion) in the virtual environment.
Different applications require different levels of presence, with
corresponding levels of realism, sensory immersion, and spatiotemporal
interactive fidelity.
During the seminar, several participants presented their current
research, and ongoing work and open problems were discussed. Abstracts of
the presentations given during the seminar as well as abstracts of
seminar results and ideas are put together in this paper.
Links to extended abstracts or full papers are provided, where available.
Intuitive Robot Teleoperation through Multi-Sensor Informed Mixed Reality Visual Aids
© 2021 The Author(s). This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/.

Mobile robotic systems have evolved to include sensors capable of describing robot status and the operating environment more accurately and reliably than ever before. Exploiting this sensor data effectively remains a challenge because of the cognitive load it places on the operator, given the large amount of data and its time-dependency constraints. This paper addresses this challenge in remote-vehicle teleoperation by proposing an intuitive way to present sensor data to users by means of mixed reality and visual aids within the user interface. We propose a method for organizing information presentation and a set of visual aids to facilitate visual communication of data in teleoperation control panels. The resulting sensor-information presentation is coherent and intuitive, making it easier for an operator to grasp and comprehend the meaning of the information. This increases situational awareness and speeds up decision-making. Our method is implemented on a real mobile robotic system operating outdoors, equipped with on-board internal and external sensors, GPS, and a reconstructed 3D graphical model provided by an assistant drone. Experimentation verified feasibility, and intuitive and comprehensive visual communication was confirmed through a qualitative assessment, which encourages further development.

Peer reviewed
MetaSpace II: Object and full-body tracking for interaction and navigation in social VR
MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple users can not only see and hear but also interact with each other, grasp and manipulate objects, walk around in space, and get tactile feedback. MS2 allows walking in physical space by tracking each user's skeleton in real-time, and allows users to feel by employing passive haptics, i.e., when users touch or manipulate an object in the virtual world, they simultaneously touch or manipulate a corresponding object in the physical world. To enable these elements in VR, MS2 creates a correspondence in spatial layout and object placement by building the virtual world on top of a 3D scan of the real world. Through this association between the real and virtual world, users are able to walk freely while wearing a head-mounted device, avoid obstacles like walls and furniture, and interact with people and objects. Most current virtual reality (VR) environments are designed for a single-user experience where interactions with virtual objects are mediated by hand-held input devices or hand gestures. Additionally, users are only shown a representation of their hands in VR, floating in front of the camera as seen from a first-person perspective. We believe that representing each user as a full-body avatar controlled by the natural movements of the person in the real world (see Figure 1d) can greatly enhance believability and a user's sense of immersion in VR.

Comment: 10 pages, 9 figures. Video:
http://living.media.mit.edu/projects/metaspace-ii
Collaborative virtual reality platform for visualizing space data and mission planning
This paper presents the system architecture of a collaborative virtual environment in which distributed multidisciplinary teams involved in space exploration activities come together and explore areas of scientific interest of a planet for future missions. The aim is to reduce the current challenges of distributed scientific and engineering meetings that prevent the exploitation of
their collaborative potential, as, at present, expertise, tools and datasets are fragmented. This paper investigates the functional characteristics of a software framework that addresses these challenges following the design science research methodology in the context of the space industry and research.
An implementation of the proposed architecture and a validation process with end users, based on the execution of different use cases, are described. These use cases cover relevant aspects of real science analysis and operations, including planetary data visualization, as the system aims to be used in future European missions. This validation suggests that the system has the
potential to enhance the way space scientists will conduct space science research in the future.