3 research outputs found

    Multisensory analytics for several interconnected scalar fields analysis

    Today it is hard to imagine modern research in sophisticated scientific areas without the use of visual analytics tools. However, humans receive and analyze information about the world around them not only through their eyes but also through other sensory stimuli, and there is no reason why scientific data analysis should be an exception. As a result, much recent research has addressed multimodal interfaces characterized by the involvement of multiple sensory stimuli, and especially the development of visual-auditory interfaces. In this work we introduce a multisensory analytics approach, discuss theoretical issues, present the main concepts of software tools implementing the approach, describe one possible auditory-visual mapping method for a particular case of scalar field analysis - the simultaneous analysis of interconnected scalar fields - and give practical examples.
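    The abstract does not specify the auditory-visual mapping itself, so the following is only a minimal sketch of one common choice: linearly mapping a sampled scalar value to the pitch of a sine tone, with one tone per interconnected field so several fields can be heard at once. All function names and parameter ranges here are illustrative assumptions, not the authors' method.

        # Illustrative only: the paper's actual auditory-visual mapping is not
        # given in the abstract. This sketch assumes a simple linear
        # value-to-pitch scheme in which a scalar sample from each field drives
        # the frequency of a sine tone, so two interconnected fields can be
        # heard simultaneously as two tones.
        import numpy as np

        def value_to_frequency(value, vmin, vmax, f_low=220.0, f_high=880.0):
            """Map a scalar field sample to a frequency in [f_low, f_high] Hz."""
            t = (value - vmin) / (vmax - vmin) if vmax > vmin else 0.0
            return f_low + t * (f_high - f_low)

        def sonify_samples(values, vmin, vmax, duration=0.5, sample_rate=44100):
            """Mix one sine tone per scalar field sample into one audio buffer."""
            t = np.linspace(0.0, duration, int(sample_rate * duration), endpoint=False)
            tones = [np.sin(2.0 * np.pi * value_to_frequency(v, vmin, vmax) * t)
                     for v in values]
            return np.sum(tones, axis=0) / max(len(tones), 1)  # normalize amplitude

        # Example: probe the same spatial point in two interconnected fields
        # (e.g. temperature and pressure) and listen to both values at once.
        audio = sonify_samples(values=[0.3, 0.7], vmin=0.0, vmax=1.0)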

    Multi-Modal Perceptualization of Volumetric Data and Its Application to Molecular Docking

    No full text

    Stereoscopic bimanual interaction for 3D visualization

    Virtual environments (VEs) have been widely used for several decades in research fields such as 3D visualization, education, training, and games. VEs have the potential to enhance visualization and to serve as a general medium for human-computer interaction (HCI). However, limited research has evaluated how virtual reality (VR) display technologies and monocular and binocular depth cues affect human depth perception of volumetric (non-polygonal) datasets. In addition, the lack of standardization of three-dimensional (3D) user interfaces (UIs) makes interacting with many VE systems challenging. To address these issues, this dissertation evaluates the effects of stereoscopic and head-coupled displays on depth judgments of volumetric datasets. It also evaluates a two-handed view manipulation technique that supports simultaneous 7 degree-of-freedom (DOF) navigation (x, y, z + yaw, pitch, roll + scale) in a multi-scale virtual environment (MSVE). Furthermore, it evaluates techniques for auto-adjusting stereo view parameters to resolve stereoscopic fusion problems in an MSVE. Next, the dissertation presents a bimanual, hybrid user interface that combines traditional tracking devices with computer-vision-based "natural" 3D inputs for multi-dimensional visualization in a semi-immersive desktop VR system. In conclusion, the dissertation provides guidelines for designing research that evaluates UIs and interaction techniques.
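    The abstract names a 7-DOF parameterization (x, y, z + yaw, pitch, roll + scale) but does not describe the bimanual technique itself. The sketch below only illustrates how such a 7-DOF state could be composed into a single 4x4 view transform; the Euler-angle convention and function name are assumptions for illustration, not the dissertation's implementation.

        # Illustrative only: composes the 7 navigation parameters named in the
        # abstract (translation, yaw/pitch/roll rotation, uniform scale) into
        # one homogeneous 4x4 matrix.
        import numpy as np

        def seven_dof_matrix(x, y, z, yaw, pitch, roll, scale):
            """Compose translation, Z-Y-X Euler rotation, and uniform scale."""
            cy, sy = np.cos(yaw), np.sin(yaw)
            cp, sp = np.cos(pitch), np.sin(pitch)
            cr, sr = np.cos(roll), np.sin(roll)
            rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw about Z
            ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about Y
            rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about X
            m = np.eye(4)
            m[:3, :3] = scale * (rz @ ry @ rx)  # rotate, then scale uniformly
            m[:3, 3] = [x, y, z]                # translate
            return m

        # Example: move to (1, 2, 3), turn 45 degrees of yaw, and zoom in 2x.
        view = seven_dof_matrix(1.0, 2.0, 3.0, np.radians(45), 0.0, 0.0, 2.0)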