
    An Introduction to 3D User Interface Design

    3D user interface design is a critical component of any virtual environment (VE) application. In this paper, we present a broad overview of three-dimensional (3D) interaction and user interfaces. We discuss the effect of common VE hardware devices on user interaction, as well as interaction techniques for generic 3D tasks and the use of traditional two-dimensional interaction styles in 3D environments. We divide most user interaction tasks into three categories: navigation, selection/manipulation, and system control. Throughout the paper, our focus is on presenting not only the available techniques, but also practical guidelines for 3D interaction design, along with widely held myths. Finally, we briefly discuss two approaches to 3D interaction design, and some example applications with complex 3D interaction requirements. We also present an annotated online bibliography as a reference companion to this article.

    The design-by-adaptation approach to universal access: learning from videogame technology

    This paper proposes an alternative approach to the design of universally accessible interfaces to that provided by formal design frameworks applied ab initio to the development of new software. This approach, design-by-adaptation, involves the transfer of interface technology and/or design principles from one application domain to another, in situations where the recipient domain is similar to the host domain in terms of modelled systems, tasks and users. Using the example of interaction in 3D virtual environments, the paper explores how principles underlying the design of videogame interfaces may be applied to a broad family of visualization and analysis software which handles geographical data (virtual geographic environments, or VGEs). One of the motivations behind the current study is that VGE technology lags some way behind videogame technology in the modelling of 3D environments, and has a less-developed track record in providing the variety of interaction methods needed by users with varied levels of experience to undertake varied tasks in 3D virtual worlds. The current analysis extracted a set of interaction principles from videogames, which were used to devise a set of 3D task interfaces that have been implemented in a prototype VGE for formal evaluation.

    Stereoscopic bimanual interaction for 3D visualization

    Virtual environments (VEs) have been widely used for several decades in research fields such as 3D visualization, education, training, and games. VEs have the potential to enhance visualization and to act as a general medium for human-computer interaction (HCI). However, limited research has evaluated virtual reality (VR) display technologies, and monocular and binocular depth cues, for human depth perception of volumetric (non-polygonal) datasets. In addition, the lack of standardization of three-dimensional (3D) user interfaces (UIs) makes it challenging to interact with many VE systems. To address these issues, this dissertation first evaluates the effects of stereoscopic and head-coupled displays on depth judgments of volumetric datasets. It then evaluates a two-handed view manipulation technique that supports simultaneous seven-degree-of-freedom (7-DOF) navigation (x, y, z + yaw, pitch, roll + scale) in a multi-scale virtual environment (MSVE). Furthermore, this dissertation evaluates techniques for auto-adjusting stereo view parameters to address stereoscopic fusion problems in an MSVE. Next, it presents a bimanual, hybrid user interface which combines traditional tracking devices with computer-vision-based "natural" 3D inputs for multi-dimensional visualization in a semi-immersive desktop VR system. In conclusion, this dissertation provides a guideline for research design for evaluating UIs and interaction techniques.
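The simultaneous 7-DOF navigation mentioned in the abstract (translation, rotation, and uniform scale) can be illustrated with a minimal transform sketch. The class name, rotation order, and parameters below are assumptions chosen for illustration, not details taken from the dissertation itself.

```python
import math
from dataclasses import dataclass

@dataclass
class Transform7DOF:
    """Illustrative 7-DOF view transform: translation + yaw/pitch/roll + uniform scale."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    yaw: float = 0.0    # rotation about z, radians
    pitch: float = 0.0  # rotation about y, radians
    roll: float = 0.0   # rotation about x, radians
    scale: float = 1.0

    def apply(self, p):
        """Scale, rotate (roll, then pitch, then yaw), then translate a 3D point."""
        px, py, pz = (c * self.scale for c in p)
        # roll about x
        cr, sr = math.cos(self.roll), math.sin(self.roll)
        py, pz = cr * py - sr * pz, sr * py + cr * pz
        # pitch about y
        cp, sp = math.cos(self.pitch), math.sin(self.pitch)
        px, pz = cp * px + sp * pz, -sp * px + cp * pz
        # yaw about z
        cw, sw = math.cos(self.yaw), math.sin(self.yaw)
        px, py = cw * px - sw * py, sw * px + cw * py
        return (px + self.x, py + self.y, pz + self.z)

# Doubling the scale and translating along x moves a unit point predictably.
t = Transform7DOF(x=1.0, scale=2.0)
print(t.apply((1.0, 0.0, 0.0)))  # (3.0, 0.0, 0.0)
```

A bimanual technique would typically let one hand drive the translation/rotation components while the other adjusts scale, with all seven parameters updated in the same frame.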

    Developing virtual watersheds for evaluating the dynamics of land use change


    A multi-modal interface for road planning tasks using vision, haptics and sound

    The planning of transportation infrastructure requires analyzing many different types of geo-spatial information in the form of maps. Displaying too many of these maps at the same time can lead to visual clutter or information overload, which results in sub-optimal effectiveness. Multimodal interfaces (MMIs) try to address this visual overload and improve the user's interaction with large amounts of data by combining several sensory modalities. Previous research into MMIs indicates that, when used properly, multiple sensory modalities lead to more efficient human-computer interactions. This prior work motivated this thesis, which describes a novel GIS system for road planning using vision, haptics and sound. The implementation of this virtual environment is discussed, including some of the design decisions made when determining how to map visual data to the other senses. A user study was performed to see how this type of system could be utilized, and the results of the study are presented.
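The mapping of visual data to other senses described above can be sketched as a simple cross-modal transfer function. The parameter ranges and names here are assumptions for illustration only; the thesis does not specify these values.

```python
def map_to_modalities(value, vmin, vmax):
    """Map a scalar map attribute (e.g. a hypothetical terrain-cost layer) to
    illustrative haptic and auditory parameters. The output ranges below are
    assumptions, not values from the thesis."""
    if vmax <= vmin:
        raise ValueError("vmax must exceed vmin")
    t = (value - vmin) / (vmax - vmin)
    t = min(1.0, max(0.0, t))      # clamp to [0, 1]
    force_n = 0.5 + 2.5 * t        # haptic resistance, 0.5 to 3.0 newtons
    pitch_hz = 220.0 + 660.0 * t   # audio pitch, 220 to 880 Hz
    return force_n, pitch_hz

# A mid-range data value maps to mid-range force and pitch.
print(map_to_modalities(50, 0, 100))  # (1.75, 550.0)
```

Offloading one map layer to haptics and another to sound in this way is one plausible route to reducing the visual clutter the abstract describes, since each sense then carries a distinct channel of information.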

    Game-like 3D visualisation of air quality data

    The data produced by sensor networks for urban air quality monitoring is becoming a valuable asset for informed health-aware human activity planning. However, in order to properly explore and exploit these data, citizens need intuitive and effective ways of interacting with them. This paper presents CityOnStats, a visualisation tool developed to provide users, mainly adults and young adults, with a game-like 3D environment populated with air quality sensing data, as an alternative to traditionally passive visualisation techniques. CityOnStats provides several visual cues of pollution presence with the purpose of meeting each user's preferences. Usability tests with a sample of 30 participants have shown the value of air quality 3D game-based visualisation and have provided empirical support for which visual cues are most adequate for the task at hand.

    Collaborative geographic visualization

    Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, for the Master's degree in Environmental Engineering (Environmental Management and Systems profile). The present document is a review of essential references to take into account when developing ubiquitous Geographical Information Systems (GIS) for collaborative visualization purposes. Its chapters focus, respectively, on general principles of GIS, their multimedia components, and ubiquitous practices; on geo-referenced information visualization and its graphical components of virtual and augmented reality; on collaborative environments, their technological requirements, architectural specificities, and models for collective information management; and on some final considerations about the future and challenges of collaborative visualization of GIS in ubiquitous environments.

    Multimodal and multidimensional geodata interaction and visualization

    This PhD thesis proposes the development of a Science Data Visualization System (SdVS) that analyzes and presents different techniques for visualizing and interacting with geo-data, in order to work with knowledge about geo-data using Google Earth. We then apply archaeological data as a case study and, as a result, develop the Archaeological Visualization System (ArVS), using new visualization paradigms and human-computer interaction techniques based on SdVS. Furthermore, SdVS provides guidelines for developing other visualization and interaction applications in the future, and shows how users can employ the system to enhance the understanding and dissemination of knowledge.

    An Augmented Interaction Strategy For Designing Human-Machine Interfaces For Hydraulic Excavators

    Lack of adequate information feedback and work visibility, and fatigue due to repetition, have been identified as the major usability gaps in the human-machine interface (HMI) design of modern hydraulic excavators; these gaps subject operators to undue mental and physical workload, resulting in poor performance. To address them, this work proposed an innovative interaction strategy, termed "augmented interaction", for enhancing the usability of the hydraulic excavator. Augmented interaction involves the embodiment of heads-up display and coordinated control schemes into an efficient, effective, and safe HMI. Augmented interaction was demonstrated using a framework consisting of three phases: Design, Implementation/Visualization, and Evaluation (D.IV.E). Guided by this framework, two alternative HMI design concepts (Design A: featuring heads-up display and coordinated control; and Design B: featuring heads-up display and joystick controls), in addition to the existing HMI design (Design C: featuring monitor display and joystick controls), were prototyped. A mixed reality seating buck simulator, named the Hydraulic Excavator Augmented Reality Simulator (H.E.A.R.S), was used to implement the designs and simulate a work environment along with a rock excavation task scenario. A usability evaluation was conducted with twenty participants to characterize the impact of the new HMI types using quantitative (task completion time, TCT; and operating error, OER) and qualitative (subjective workload and user preference) metrics. The results indicated that participants had a shorter TCT with Design A. For OER, there was a lower error probability due to collisions (PER1) with Design A, and a lower error probability due to misses (PER2) with Design B. The subjective measures showed a lower overall workload and a high preference for Design B. It was concluded that augmented interaction provides a viable solution for enhancing the usability of the HMI of a hydraulic excavator.
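The error-probability metrics above (PER1 for collisions, PER2 for misses) are, in essence, fractions of error events over opportunities. The function and counts below are a hypothetical sketch of that computation, not the study's actual analysis code or data.

```python
def error_probability(errors, opportunities):
    """Estimate an operating-error probability (e.g. PER1 for collisions,
    PER2 for misses) as the fraction of error events over opportunities."""
    if opportunities <= 0:
        raise ValueError("opportunities must be positive")
    if not 0 <= errors <= opportunities:
        raise ValueError("errors must lie between 0 and opportunities")
    return errors / opportunities

# Hypothetical counts for one participant: 3 collisions in 40 digging cycles.
per1 = error_probability(3, 40)
print(per1)  # 0.075
```

Averaging such per-participant probabilities across the twenty participants, per design, would yield the kind of between-design OER comparison the abstract reports.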