4 research outputs found

    Exploring Spatial Interaction and Visualization Paradigms for 3D Cadastral Visualization

    Effective visualization of spatial data, especially 3D cadastral visualization, relies on optimal interaction techniques and user interfaces for navigating complex datasets and understanding property delineations. This paper synthesizes findings from diverse studies investigating the efficacy of interaction modalities and user interfaces in 3D visualization across various domains. Drawing parallels to the broader field of 3D visualization, particularly in interaction tasks and user interface paradigms, it examines the potential for advancing 3D cadastral visualization systems. The study identifies fundamental interaction tasks crucial for effective 3D cadastral visualization, including object manipulation, widget manipulation, and data selection and annotation. It evaluates a range of user interfaces, from traditional input methods to emerging technologies such as gesture-based interfaces and virtual reality (VR) headsets, highlighting their respective strengths and limitations. Drawing on comparative analyses of immersive and non-immersive scenarios, the paper shows how immersive environments, such as virtual reality and augmented reality, can enhance user experience and task performance for 3D cadastral visualization. Finally, it addresses key challenges of visualizing 3D cadastral data in immersive environments by proposing a comprehensive framework for evaluating the effectiveness and utility of immersive visualization for 3D cadastral purposes.

    Augmenting Tactile 3D Data Navigation With Pressure Sensing

    We present a pressure-augmented tactile 3D data navigation technique designed for small devices, motivated by the need to support interactive visualization beyond traditional workstations. While touch input has been studied extensively on large screens, current techniques do not scale to small, portable devices. We use phone-based pressure sensing with a binary mapping to separate interaction degrees of freedom (DOF), allowing users to easily select different manipulation schemes (e.g., performing only rotation, then switching to translation with a simple pressure input). We compare our technique to traditional 3D-RST (rotation, scaling, translation) using a docking task in a controlled experiment. The results show that our technique increases interaction accuracy with limited impact on speed. We discuss the implications for 3D interaction design and verify that our results extend to older devices with pseudo-pressure and hold in realistic phone-usage scenarios.
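    The core idea of the abstract above, a binary pressure mapping that switches which manipulation scheme a touch drag controls, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the class name, the 0.5 pressure threshold, and the toggle-on-press behavior are assumptions for demonstration.

    ```python
    # Hypothetical sketch of a binary pressure mapping for DOF separation:
    # a firm press toggles between rotation and translation, so the same
    # 2-DOF drag input drives whichever scheme is currently active.
    ROTATION, TRANSLATION = "rotation", "translation"

    class PressureModeSwitch:
        def __init__(self, threshold=0.5):
            self.threshold = threshold    # normalized pressure in [0, 1] (assumed)
            self.mode = ROTATION          # start in rotation, as in the abstract
            self._was_pressed = False

        def update(self, pressure):
            # Binary mapping: a press crossing the threshold toggles the scheme.
            pressed = pressure >= self.threshold
            if pressed and not self._was_pressed:
                self.mode = TRANSLATION if self.mode == ROTATION else ROTATION
            self._was_pressed = pressed
            return self.mode

        def apply_drag(self, dx, dy):
            # Route the drag to the active manipulation scheme.
            if self.mode == ROTATION:
                return ("rotate", dx, dy)
            return ("translate", dx, dy)

    switch = PressureModeSwitch()
    switch.update(0.2)               # light touch: stay in rotation
    print(switch.apply_drag(5, 3))   # ('rotate', 5, 3)
    switch.update(0.9)               # firm press: switch to translation
    print(switch.apply_drag(5, 3))   # ('translate', 5, 3)
    ```

    Separating modes this way avoids the ambiguity of classic RST gestures, where one drag simultaneously affects several degrees of freedom.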