
    Hybrid-Dimensional Visualization and Interaction - Integrating 2D and 3D Visualization with Semi-Immersive Navigation Techniques

    The integration of 2D visualization and navigation techniques has reached a state where the potential for improvement is relatively low. With 3D-stereoscopy-compatible technology now commonplace not only in research but also in many households, the need for better 3D visualization and navigation techniques has increased. Nevertheless, for the representation of much abstract data, such as networks, 2D visualization remains the primary choice. Often, however, such abstract data is associated with spatial data, increasing the need to combine 2D and 3D visualization and navigation techniques. Here, we discuss a new hybrid-dimensional approach that integrates 2D and 3D (stereoscopic) visualization as well as navigation into a semi-immersive virtual environment. This approach is compared to classical 6DOF navigation techniques. Three scientific and educational applications are presented: an educational car model, a plant simulation data exploration, and a cellular model with network exploration, each combining spatial data with associated abstract data. The software is available at: http://Cm4.CELLmicrocosmos.org
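    The core idea of such a hybrid-dimensional approach can be sketched as a data structure that links each abstract network node to a spatial anchor, so that selection in the 2D view drives navigation in the 3D view. The following is a minimal illustrative sketch, not the CELLmicrocosmos implementation; all names and coordinates are hypothetical.

    ```python
    from dataclasses import dataclass

    @dataclass
    class HybridNode:
        """An abstract network node linked to a 3D spatial anchor (illustrative)."""
        name: str
        pos_2d: tuple    # layout position in the 2D network view (x, y)
        anchor_3d: tuple # associated location in the 3D scene (x, y, z)

    # Hypothetical cell-model example: proteins in a 2D interaction network,
    # each anchored to a 3D position inside a cell compartment.
    nodes = [
        HybridNode("ATP synthase", (0.2, 0.8), (1.5, 0.3, -2.0)),
        HybridNode("Cytochrome c", (0.6, 0.4), (1.1, 0.5, -1.8)),
    ]

    def link_views(node):
        # Selecting a node in the 2D view yields the camera target
        # for the 3D (stereoscopic) view.
        return node.anchor_3d

    assert link_views(nodes[0]) == (1.5, 0.3, -2.0)
    ```

    The point of the coupling is that the 2D network layout and the 3D scene stay synchronized through the shared anchor, rather than being two disconnected views.
    
    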

    Stereoscopic space map – semi-immersive configuration of 3D stereoscopic tours in multi-display environments

    Although large-scale stereoscopic 3D environments such as CAVEs are favorable venues for group presentations, the perspective projection and stereoscopic optimization usually follow a navigator-centric approach. These presentations are therefore often accompanied by strong side effects such as motion sickness, frequently caused by disturbed stereoscopic vision. The reason is that the stereoscopic visualization is usually optimized for the only head-tracked person in the CAVE, the navigator, ignoring the needs of the real target group: the audience. To overcome this misconception, this work proposes an alternative to head-tracking-based optimization of the stereoscopic effect. An interactive virtual overview map in 3D provides pre-tour and on-tour configuration of the stereoscopic effect, partly utilizing our previously published interactive projection plane approach. This Stereoscopic Space Map is visualized on a zSpace 200, while the virtual world is shown on a panoramic 330° CAVE2™. A pilot expert study with eight participants was conducted using pre-configured tours through 3D models. The comparison of manual and automatic stereoscopic adjustment showed that the proposed approach is an appropriate alternative to the commonly used head-tracking-based stereoscopic adjustment.
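    One common way to adjust the stereoscopic effect for a scene, which the kind of automatic adjustment compared here could build on, is to place the zero-parallax plane and scale the eye separation from the visible depth range. The sketch below is a simplified pinhole model under assumed parameters, not the paper's actual method.

    ```python
    def auto_stereo(near, far, comfort_parallax=0.03):
        """Illustrative auto-adjustment of stereo parameters (assumed model).

        near / far: nearest and farthest visible depths in the scene,
        comfort_parallax: maximum tolerated on-screen parallax (scene units).
        """
        # Place the zero-parallax plane at the nearest visible depth so the
        # whole scene appears at or behind the screen (avoids frame violations).
        convergence = near
        # In a simplified pinhole model, screen parallax for a point at depth d
        # is sep * (1 - convergence / d), maximal at d = far. Choose the eye
        # separation so that this maximum stays within the comfort budget.
        sep = comfort_parallax / (1.0 - convergence / far)
        return convergence, sep

    # e.g. a scene spanning depths 2.0 .. 10.0 yields convergence 2.0
    # and an eye separation of about 0.0375 scene units.
    c, s = auto_stereo(2.0, 10.0)
    ```

    A head-tracking-based approach instead re-derives these parameters per frame from the navigator's head pose; the map-based configuration described above fixes them per tour segment so the whole audience sees a consistent effect.
    
    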

    3D-Stereoscopic Immersive Analytics Projects at Monash University and University of Konstanz

    Immersive Analytics investigates how novel interaction and display technologies can support analytical reasoning and decision making. The Immersive Analytics initiative of Monash University started in early 2014. Over the last few years, a number of projects have been developed or extended in this context to meet the requirements of semi- or fully immersive stereoscopic environments. Different technologies are used for this purpose: CAVE2™ (a 330° large-scale visualization environment which can be used for educational and scientific group presentations, analyses, and discussions), stereoscopic Powerwalls (miniCAVEs, each representing a segment of the CAVE2 and used for development and communication), Fishtanks, and/or HMDs (such as the Oculus, the VIVE, and mobile HMD approaches). Apart from the CAVE2™, all systems are or will be employed at both Monash University and the University of Konstanz, especially to investigate collaborative Immersive Analytics. In addition, sensiLab extends most of the previous approaches by involving all senses: 3D visualization is combined with multi-sensory feedback, 3D printing, and robotics in a scientific-artistic-creative environment.

    Innovative Approaches to 3D GIS Modeling for Volumetric and Geoprocessing Applications in Subsurface Infrastructures in a Virtual Immersive Environment

    Since subsurface features remain largely ‘out of sight, out of mind’, dealing with underground space and infrastructure poses challenges, especially for those working in GIS. Because subsurface infrastructure plays a major role in supporting the needs of modern society, groups such as city planners, utility companies, and decision makers are looking for a ‘holistic’ approach in which the sustainable use of underground space is as important as that of above-ground space. For such planning and management, it is crucial to examine subsurface data in a form that is amenable to 3D mapping and that can be used for increasingly sophisticated 3D modeling. The subsurface referred to in this study covers examples of both shallow and deep underground infrastructure. For shallow underground infrastructure, mostly two-dimensional maps are used in management and planning. Depth is a critical attribute of underground infrastructure that is difficult to represent on a 2D map; for this reason, such features are best studied in three-dimensional space. In this research, the capabilities of 3D GIS technology and immersive geography are explored for the storage, management, analysis, and visualization of shallow and deep subsurface features.

    Stereoscopic bimanual interaction for 3D visualization

    Virtual Environments (VEs) have been widely used for several decades in research fields such as 3D visualization, education, training, and games. VEs have the potential to enhance visualization and to act as a general medium for human-computer interaction (HCI). However, limited research has evaluated virtual reality (VR) display technologies, and monocular and binocular depth cues, for human depth perception of volumetric (non-polygonal) datasets. In addition, a lack of standardization of three-dimensional (3D) user interfaces (UIs) makes it challenging to interact with many VE systems. To address these issues, this dissertation first evaluates the effects of stereoscopic and head-coupled displays on depth judgment of volumetric datasets. It then evaluates a two-handed view manipulation technique which supports simultaneous 7-degree-of-freedom (DOF) navigation (x, y, z + yaw, pitch, roll + scale) in a multi-scale virtual environment (MSVE). Furthermore, this dissertation evaluates techniques for the automatic adjustment of stereo view parameters to address stereoscopic fusion problems in an MSVE. Next, it presents a bimanual, hybrid user interface which combines traditional tracking devices with computer-vision-based "natural" 3D inputs for multi-dimensional visualization in a semi-immersive desktop VR system. In conclusion, this dissertation provides guidelines for research design when evaluating UIs and interaction techniques.
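    The 7-DOF view state mentioned above (three translations, three rotations, plus a uniform scale) can be sketched as follows. This is an illustrative minimal model, not the dissertation's implementation; the pivot-based zoom shown is one common way to make the seventh (scale) degree of freedom usable in a multi-scale environment.

    ```python
    class View7DOF:
        """Minimal sketch of a 7-DOF view state: x, y, z + yaw, pitch, roll + scale."""
        def __init__(self):
            self.pos = [0.0, 0.0, 0.0]  # translation (x, y, z)
            self.rot = [0.0, 0.0, 0.0]  # orientation (yaw, pitch, roll), radians
            self.scale = 1.0            # uniform world scale (the 7th DOF)

        def zoom_about(self, pivot, factor):
            # Scaling about a pivot (e.g. the midpoint between the two hands)
            # keeps the pivot point visually fixed, which is what makes
            # two-handed multi-scale navigation feel stable.
            self.scale *= factor
            self.pos = [pivot[i] + (self.pos[i] - pivot[i]) * factor
                        for i in range(3)]

    v = View7DOF()
    v.pos = [2.0, 0.0, 0.0]
    v.zoom_about([1.0, 0.0, 0.0], 2.0)
    # v.pos is now [3.0, 0.0, 0.0] and v.scale is 2.0
    ```

    In a bimanual technique, translation and rotation typically come from the motion of the two tracked hands, while the change in the distance between them supplies the scale factor.
    
    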

    Spatial Interaction for Immersive Mixed-Reality Visualizations

    Growing amounts of data, both in personal and professional settings, have caused an increased interest in data visualization and visual analytics. Especially for inherently three-dimensional data, immersive technologies such as virtual and augmented reality and advanced, natural interaction techniques have been shown to facilitate data analysis. Furthermore, in such use cases, the physical environment often plays an important role, both by directly influencing the data and by serving as context for the analysis. Therefore, there has been a trend to bring data visualization into new, immersive environments and to make use of the physical surroundings, leading to a surge in mixed-reality visualization research. One of the resulting challenges, however, is the design of user interaction for these often complex systems. In my thesis, I address this challenge by investigating interaction for immersive mixed-reality visualizations regarding three core research questions: 1) What are promising types of immersive mixed-reality visualizations, and how can advanced interaction concepts be applied to them? 2) How does spatial interaction benefit these visualizations and how should such interactions be designed? 3) How can spatial interaction in these immersive environments be analyzed and evaluated? To address the first question, I examine how various visualizations such as 3D node-link diagrams and volume visualizations can be adapted for immersive mixed-reality settings and how they stand to benefit from advanced interaction concepts. For the second question, I study how spatial interaction in particular can help to explore data in mixed reality. There, I look into spatial device interaction in comparison to touch input, the use of additional mobile devices as input controllers, and the potential of transparent interaction panels. 
Finally, to address the third question, I present my research on how user interaction in immersive mixed-reality environments can be analyzed directly in the original, real-world locations, and how this can provide new insights. Overall, with my research, I contribute interaction and visualization concepts, software prototypes, and findings from several user studies on how spatial interaction techniques can support the exploration of immersive mixed-reality visualizations.

    Applications of Virtual Reality

    Information Technology is growing rapidly. With the advent of high-resolution graphics, high-speed computing, and user interaction devices, Virtual Reality emerged as a major new technology in the mid-1990s. Virtual Reality technology is currently used in a broad range of applications; the best known are games, movies, simulations, and therapy. From a manufacturing standpoint, there are some attractive applications, including training, education, collaborative work, and learning. This book provides an up-to-date discussion of current research in Virtual Reality and its applications. It describes the current state of the art in Virtual Reality and points out many areas where work remains to be done. We have chosen to cover areas which we believe will have a potentially significant impact on Virtual Reality and its applications. This book provides a definitive resource for a wide variety of people, including academicians, designers, developers, educators, engineers, practitioners, researchers, and graduate students.

    Immersive design engineering

    Design Engineering is an innovative field that usually combines a number of disciplines, such as materials science, mechanics, electronics, and/or biochemistry. New immersive technologies, such as Virtual Reality (VR) and Augmented Reality (AR), are currently being widely adopted in various engineering fields, and immersive exploration has been shown to support the modeling of spatial structures. But the field of Design Engineering reaches beyond standard engineering tasks. With this review paper we want to achieve the following: define the term “Immersive Design Engineering”, discuss a number of recent immersive technologies in this context, and provide an inspiring overview of work that belongs to, or is related to, the field of Immersive Design Engineering. Finally, the paper concludes with definitions of research questions as well as a number of suggestions for future developments.
