
    Inviwo -- A Visualization System with Usage Abstraction Levels

    The complexity of today's visualization applications demands specific visualization systems tailored for the development of these applications. Frequently, such systems utilize levels of abstraction to improve the application development process, for instance by providing a data flow network editor. Unfortunately, these abstractions result in several issues, which need to be circumvented through an abstraction-centered system design. Often, a high level of abstraction hides low-level details and thereby makes it difficult to directly access the underlying computing platform, even though such access is important for achieving optimal performance. Therefore, we propose a layer structure developed for modern and sustainable visualization systems that allows developers to interact with all contained abstraction levels. We refer to these interaction capabilities as usage abstraction levels, since we target application developers with various levels of experience. We formulate the requirements for such a system, derive the desired architecture, and present how the concepts have been realized, by way of example, within the Inviwo visualization system. Furthermore, we address several specific challenges that arise during the realization of such a layered architecture, such as communication between different computing platforms, performance-centered encapsulation, and layer-independent development supported by cross-layer documentation and debugging capabilities.
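    To make the layering idea concrete, the following is a minimal sketch (in Python, with illustrative names that do not reflect Inviwo's actual C++ or Python API) of a processor-based data flow network in which a high-level layer composes named stages while lower-level processors remain directly accessible.

```python
# Minimal sketch of a processor-based data flow network with layered access.
# Class and method names are illustrative assumptions, not Inviwo's API.

from typing import Callable, Dict, List


class Processor:
    """Low-level node: a named computation with explicit input dependencies."""

    def __init__(self, name: str, func: Callable[..., object], inputs: List[str] = None):
        self.name = name
        self.func = func
        self.inputs = inputs or []


class Network:
    """Mid-level abstraction: wires processors into a data flow graph and evaluates it."""

    def __init__(self):
        self.processors: Dict[str, Processor] = {}

    def add(self, processor: Processor) -> None:
        self.processors[processor.name] = processor

    def evaluate(self, name: str, cache: Dict[str, object] = None) -> object:
        cache = {} if cache is None else cache
        if name not in cache:                      # reuse results already computed upstream
            proc = self.processors[name]
            args = [self.evaluate(dep, cache) for dep in proc.inputs]
            cache[name] = proc.func(*args)
        return cache[name]


# High-level usage: an application developer only composes named stages,
# while a platform developer could still register custom low-level processors.
net = Network()
net.add(Processor("source", lambda: list(range(10))))
net.add(Processor("filter", lambda xs: [x for x in xs if x % 2 == 0], ["source"]))
net.add(Processor("render", lambda xs: f"rendering {len(xs)} items", ["filter"]))

print(net.evaluate("render"))   # -> "rendering 5 items"
```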

    Virtual reality for 3D histology: multi-scale visualization of organs with interactive feature exploration

    Virtual reality (VR) enables data visualization in an immersive and engaging manner, and it can be used to create new ways of exploring scientific data. Here, we use VR for the visualization of 3D histology data, creating a novel interface for digital pathology. Our contribution includes 3D modeling of a whole organ and embedded objects of interest, fusing the models with associated quantitative features and full-resolution serial section patches, and implementing the virtual reality application. Our VR application is multi-scale in nature, covering two object levels that represent different ranges of detail, namely the organ level and the sub-organ level. In addition, the application includes several data layers, including the measured histology image layer and multiple representations of quantitative features computed from the histology. In this interactive VR application, the user can set visualization properties, select different samples and features, and interact with various objects. In this work, we used whole mouse prostates (organ level) with prostate cancer tumors (sub-organ objects of interest) as example cases, and included quantitative histological features relevant for tumor biology in the VR model. Due to automated processing of the histology data, our application can easily be adapted to visualize other organs and pathologies from various origins. Our application enables a novel way of exploring high-resolution, multidimensional data for biomedical research purposes, and it can also be used in teaching and researcher training.
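    As an illustration of the multi-scale, multi-layer organization described above, the following is a small sketch of a possible scene description; the class names, layer names, and file names are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of a multi-scale scene description with per-object data layers,
# loosely following the organ / sub-organ levels described above.

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class SceneObject:
    name: str
    level: str                                                  # "organ" or "sub-organ"
    layers: Dict[str, str] = field(default_factory=dict)        # e.g. histology patches, feature maps
    children: List["SceneObject"] = field(default_factory=list)


# Organ-level model with an embedded sub-organ object of interest (hypothetical files).
tumor = SceneObject("tumor_01", "sub-organ",
                    layers={"histology": "patch_stack.tif", "proliferation": "feature_map.npy"})
prostate = SceneObject("prostate", "organ",
                       layers={"surface_mesh": "prostate.obj"},
                       children=[tumor])

# A viewer could toggle layers per object when the user selects a feature.
for obj in [prostate, *prostate.children]:
    print(obj.name, obj.level, list(obj.layers))
```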

    Immersive analytics for oncology patient cohorts

    This thesis proposes a novel interactive immersive analytics tool and methods to interrogate cancer patient cohorts in an immersive virtual environment, namely Virtual Reality to Observe Oncology data Models (VROOM). The overall objective is to develop an immersive analytics platform that includes a data analytics pipeline from raw gene expression data to immersive visualisation on virtual and augmented reality platforms utilising a game engine; Unity3D has been used to implement the visualisation. Work in this thesis could provide oncologists and clinicians with an interactive visualisation and visual analytics platform that helps them drive their analysis of treatment efficacy and achieve the goal of evidence-based personalised medicine. The thesis integrates the latest discoveries and developments in cancer patient prognosis, immersive technologies, machine learning, decision support systems and interactive visualisation to form an immersive analytics platform for complex genomic data. The experimental paradigm followed in this thesis is the understanding of transcriptomics in cancer samples. Specifically, it investigates gene expression data to determine the biological similarity revealed by the transcriptomic profiles of patients' tumour samples, which indicate the genes active in different patients. In summary, the thesis contributes: i) a novel immersive analytics platform for patient cohort data interrogation in a similarity space based on the patients' biological and genomic similarity; ii) an effective immersive environment optimisation design based on a usability study of exocentric and egocentric visualisation, together with audio and sound design optimisation; iii) an integration of trusted and familiar 2D biomedical visual analytics methods into the immersive environment; iv) a novel use of game theory as the decision-making engine supporting the analytics process, and an application of optimal transport theory to missing data imputation to ensure the preservation of the data distribution; and v) case studies that showcase the real-world application of the visualisation and its effectiveness.
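    As a rough illustration of the similarity-space idea, the sketch below maps a synthetic gene expression matrix to 3D patient coordinates; the data and variable names are made up for illustration, and the thesis's actual pipeline (rendered in Unity3D) is more elaborate.

```python
# Minimal sketch: project patients into a 3D "similarity space" from a gene
# expression matrix, as a stand-in for the analytics pipeline described above.

import numpy as np

rng = np.random.default_rng(0)
expression = rng.normal(size=(20, 500))        # 20 patients x 500 genes (synthetic)

# Standardize genes, then project patients onto the top 3 principal components.
z = (expression - expression.mean(axis=0)) / (expression.std(axis=0) + 1e-9)
u, s, vt = np.linalg.svd(z, full_matrices=False)
coords = u[:, :3] * s[:3]                      # 3D coordinates per patient

# Pairwise distances in this space approximate transcriptomic (dis)similarity;
# an immersive front end could place one avatar per patient at coords[i].
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
print(coords.shape, d.shape)                   # (20, 3) (20, 20)
```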

    Implementation of Virtual Reality (VR) simulators in Norwegian maritime pilotage training

    With millions of tons of cargo transported to and from Norwegian ports every year, the maritime waterways in Norway are heavily used. The high consequences of accidents and mishaps require well-trained seafarers and safe operating practices. The normal crews of vessels are supported by the Norwegian Coastal Administration (NCA) pilot service when operating vessels that do not meet specific regulations. Simulator training is used as part of the toolset designed to educate, train, and advance the knowledge of maritime pilots in order to improve their operability. The NCA is working on an internal project to distribute Virtual Reality (VR) simulators to selected pilot stations along the coast and to train and familiarize maritime pilots with the tool. There has been a lack of research on virtual reality simulators and how they are implemented in maritime organizations. The goal of this research is to examine whether a VR simulator can be used as a training tool within the Norwegian Coastal Administration's pilot service. Furthermore, the findings of this study contribute to the understanding of VR simulators in the field of Maritime Education and Training (MET). The thesis addresses two research questions: 1) Is Virtual Reality training useful in the competence development process of Norwegian maritime pilots? 2) How can Virtual Reality simulators improve the training outcomes of today's maritime pilot education? The data gathered from the systematic literature review correspond to the findings of the interviews. Considering the similarities with previous findings from sectors such as healthcare, construction, and education, it is concluded that the results of the interviews can be generalized. For maritime pilots, the simulator offers recurrent scenario-based training and a high level of immersion. Thanks to the system's mobility and user-friendliness, pilots can train at home, onboard a vessel, at the pilot station, and in group settings. In terms of motivation and training effectiveness, the study finds that VR simulators are effective and beneficial, and the technology received positive reviews from the pilots. According to the findings of the research, the simulator can be used to teach both novice and experienced maritime pilots about new operations, larger tonnage, and new operational areas. After the NCA has utilized VR simulators for some time, additional research could analyze their success through a training evaluation study and investigate the impact of VR training in the organization.

    Advanced Visualization and Intuitive User Interface Systems for Biomedical Applications

    Modern scientific research produces data at rates that far outpace our ability to comprehend and analyze it. Such sources include medical imaging data and computer simulations, where technological advancements and increasing spatiotemporal resolution generate ever larger amounts of data from each scan or simulation. A bottleneck has developed whereby medical professionals and researchers are unable to fully use the advanced information available to them. Scientific visualization of medical data has emerged as a field of study that integrates computer science, computer graphics, artistic ability and medical expertise. The objective of this thesis is to develop two visualization systems that use advanced visualization, natural user interface technologies and the large amount of biomedical data available to produce results of clinical utility and to overcome the data bottleneck that has developed. Computational Fluid Dynamics (CFD) is a tool used to study the quantities associated with the movement of blood by computer simulation. We developed methods of processing spatiotemporal CFD data and displaying it in stereoscopic 3D with the ability to spatially navigate through the data. We used this method with two sets of display hardware: a full-scale visualization environment and a small-scale desktop system. The advanced display and data navigation abilities provide the user with the means to better understand the relationship between the vessel's form and function. Low-cost 3D depth-sensing cameras capture and process user body motion to recognize motions and gestures. Such devices allow users to employ hand motions as an intuitive interface to computer applications. We developed algorithms to process and prepare the biomedical and scientific data for use with a custom control application. The application interprets user gestures as commands to a visualization tool, allowing the user to control the visualization of multi-dimensional data without manual contact with an interaction device. In developing these methods and software tools, we have leveraged recent trends in advanced visualization and intuitive interfaces in order to efficiently visualize biomedical data in a way that provides meaningful information and deeper insight.
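    The gesture-driven control described above can be pictured with a small dispatch sketch; the gesture names and commands below are hypothetical, and a real system would obtain the gesture stream from a depth-camera SDK.

```python
# Minimal sketch of translating recognized gestures into visualization commands,
# in the spirit of the control application described above. Gesture names and
# commands are illustrative assumptions.

from typing import Callable, Dict


class Visualization:
    def __init__(self):
        self.zoom = 1.0
        self.slice_index = 0

    def zoom_in(self):    self.zoom *= 1.1
    def zoom_out(self):   self.zoom /= 1.1
    def next_slice(self): self.slice_index += 1


def make_dispatcher(viz: Visualization) -> Dict[str, Callable[[], None]]:
    # Map each recognized gesture to a hands-free command on the visualization.
    return {
        "push_forward": viz.zoom_in,
        "pull_back": viz.zoom_out,
        "swipe_right": viz.next_slice,
    }


viz = Visualization()
dispatch = make_dispatcher(viz)
for gesture in ["push_forward", "swipe_right", "unknown"]:
    dispatch.get(gesture, lambda: None)()      # ignore unrecognized gestures
print(round(viz.zoom, 2), viz.slice_index)     # 1.1 1
```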

    Increasing awareness of climate change with immersive virtual reality

    Previous research has shown that immersive virtual reality (VR) is a suitable tool for visualizing the consequences of climate change. The aim of the present study was to investigate whether visualization in VR has a stronger influence on climate change awareness and environmental attitudes than traditional media. Furthermore, we examined how realistic a VR experience has to be in order to have an effect. The VR experience consisted of a model of the Aletsch glacier (Switzerland) melting over the course of 220 years. Explicit measurements (new environmental paradigm NEP, climate change scepticism, and nature relatedness) and an implicit measurement (implicit association test) were collected before and after the VR intervention and compared to three non-VR control conditions (video, images with text, and plain text). In addition, the VR environment was varied in terms of degrees of realism and sophistication (three conditions: abstract visualization, less sophisticated realistic visualization, and more sophisticated realistic visualization). The six experimental conditions (three VR conditions, three control conditions) were analyzed within a mixed-effects modeling framework, with VR versus control as a fixed effect. Across all six conditions, environmental awareness (NEP) was higher after the participants (N = 142) had been confronted with the glacier melting, while no differences were found for nature relatedness and climate change scepticism before and after the interventions. There was no significant difference between VR and control conditions for any of the four measurements. Nevertheless, contrast analyses revealed that environmental awareness increased significantly only for the VR conditions but not for the control conditions, suggesting that VR is more likely to lead to attitude change. Our results show that exposure to VR environments successfully increased environmental awareness independently of the design choices, suggesting that even abstract and less sophisticated VR environment designs may be sufficient to increase pro-environmental attitudes.
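    For readers interested in the analysis approach, the sketch below shows one plausible way to fit such a mixed-effects model on synthetic data using statsmodels; the column names, formula, and random-effects structure (a random intercept per participant) are assumptions and may differ from the study's actual specification.

```python
# Minimal sketch of a pre/post mixed-effects analysis on synthetic data.
# Column names (nep, time, group, participant) and the formula are assumptions,
# not the study's actual analysis script.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 142
df = pd.DataFrame({
    "participant": np.repeat(np.arange(n), 2),
    "time": np.tile(["pre", "post"], n),                           # before / after intervention
    "group": np.repeat(rng.choice(["vr", "control"], size=n), 2),  # VR versus control (fixed effect)
})
df["nep"] = rng.normal(3.5, 0.5, size=len(df)) + 0.2 * (df["time"] == "post")

# Random intercept per participant; time, group, and their interaction as fixed effects.
model = smf.mixedlm("nep ~ time * group", data=df, groups=df["participant"])
result = model.fit()
print(result.summary())
```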