6 research outputs found

    Haptic-GeoZui3D: Exploring the Use of Haptics in AUV Path Planning

    We have developed a desktop virtual reality system that we call Haptic-GeoZui3D, which brings together 3D user interaction and visualization to provide a compelling environment for AUV path planning. A key component in our system is the PHANTOM haptic device (SensAble Technologies, Inc.), which affords a sense of touch and force feedback – haptics – to provide cues and constraints that guide the user's interaction. This paper describes our system and how we use haptics to significantly augment our ability to lay out a vehicle path. We show how our system works well for quickly defining simple waypoint-to-waypoint (e.g. transit) path segments, and illustrate how it could be used to specify more complex, highly segmented (e.g. lawnmower survey) paths.
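    The abstract does not spell out the force model behind these cues, so the following is only a minimal sketch, in Python, of one common way such haptic guidance is implemented: a spring-like pull toward a nearby snap target (for example a candidate waypoint) rendered on a device like the PHANTOM. The snap radius and stiffness values are illustrative assumptions, not values from the paper.

    import math

    def snap_force(tip, waypoint, snap_radius=0.02, stiffness=200.0):
        """Spring-like attraction toward a candidate waypoint.

        tip, waypoint: (x, y, z) positions in metres.
        Returns a force vector (newtons) pulling the stylus toward the
        waypoint when it is within snap_radius, and zero otherwise.
        """
        dx = [w - t for w, t in zip(waypoint, tip)]
        dist = math.sqrt(sum(d * d for d in dx))
        if dist == 0.0 or dist > snap_radius:
            return (0.0, 0.0, 0.0)
        # The pull grows linearly as the stylus penetrates the snap region,
        # gentle at the edge and firm close to the waypoint itself.
        magnitude = stiffness * (snap_radius - dist)
        return tuple(magnitude * d / dist for d in dx)

    # Example: a stylus tip 1 cm from a waypoint feels a pull toward it.
    print(snap_force((0.0, 0.0, 0.0), (0.01, 0.0, 0.0)))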

    Interaction in an immersive virtual Beijing courtyard house

    Courtyard housing has been a standard dwelling type in China for more than 3000 years, integrating tightly with local customs, aesthetics, philosophy, and natural conditions. As the representative of Chinese courtyard housing, Beijing's style has unique features in its structure, plan layout, and urban form. Presenting these features effectively is of great importance for understanding the Beijing courtyard house. The major visualization methods currently used in architecture are physical models, digital imaging, and hand drawing. All of them share two limitations: small dimensions and lack of interaction. As an alternative, VR offers two advantages: immersion and interactivity. In a fully immersive VR environment such as the C6, users can examine virtual buildings at full scale and manipulate models interactively in real time. This project therefore attempts to implement an interactive simulation of a Beijing courtyard house in the C6 and to find out whether architectural knowledge can be presented through this environment. The methodological steps include VR modeling, interaction planning, and C6 implementation. A four-yard house in Beijing was used as the prototype for VR modeling. By generating the model in six versions with different node counts and textures, it was found that the fewer nodes a model has, the faster it renders in the C6. The main interaction mechanism demonstrates the main hall's structure interactively through menu selection, in a sequence based on its constructional process: each menu item is named after a structural component, and clicking a menu item shows the corresponding constructional step in the C6 (a sketch of this mapping follows below). Five viewers were invited to see the simulation and comment on the functionality of full immersion and interactivity in this product. Overall, the results are positive: a fully immersive and interactive VR environment is potentially effective for presenting architectural knowledge. A major suggestion from the viewers was to add more detail to the simulation, such as characters and furniture. Through this project, a method for implementing architectural simulations efficiently in the C6 could be established. In the future, this study could involve more complex interactions, such as virtual inhabitants, as a means of showing Chinese culture vividly.
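    As a rough illustration of the menu-driven demonstration described above, the Python sketch below maps menu items named after structural components to steps in an assumed constructional sequence. The component names and the cumulative-reveal behaviour are assumptions; the abstract does not list the actual components or state whether each selection shows one step or the accumulated state.

    CONSTRUCTION_SEQUENCE = [
        "platform", "columns", "beams", "brackets", "purlins", "roof",
    ]  # hypothetical component names, ordered by constructional process

    def components_to_show(selected_item):
        """Return every component up to and including the selected step."""
        step = CONSTRUCTION_SEQUENCE.index(selected_item)
        return CONSTRUCTION_SEQUENCE[: step + 1]

    print(components_to_show("beams"))  # ['platform', 'columns', 'beams']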

    A study of haptic linear and pie menus in a 3D fish tank VR environment

    New pop-up menu styles such as pie menus and marking menus have proven to be measurably faster and more accurate in mouse and pen-based interfaces. These characteristics suggest that they may also be useful for 3D haptically enhanced environments. This paper reports on our development and evaluation of a pie menu style and two types of linear menu styles, all of which use force to aid the user in option selection and activation. Our evaluation results show that selection using the pie menu is considerably faster and more accurate than with both types of linear menu. Selection using push-through or exceed-border methods was found to be superior to conventional button selection. We conclude that pop-up pie menus, with the right combination of selection method and assistive forces, can provide an excellent solution for presenting menu choices in 3D haptic environments, and that considering speed-accuracy tradeoffs is important in making design decisions.
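    The paper's exact geometry is not given in the abstract, so the following Python sketch only illustrates the general technique: mapping the pointer's offset from the menu centre to a pie sector, together with an assumed "push-through / exceed border" depth threshold for activation. The dead-zone radius, item count, and border depth are illustrative assumptions.

    import math

    def pie_selection(dx, dy, n_items=8, dead_zone=0.01):
        """Map a 2D pointer offset from the menu centre to a pie sector.

        Returns the selected item index, or None while the pointer is
        still inside the central dead zone.
        """
        r = math.hypot(dx, dy)
        if r < dead_zone:
            return None
        angle = math.atan2(dy, dx) % (2 * math.pi)
        sector = 2 * math.pi / n_items
        # Offset by half a sector so item 0 is centred on the +x axis.
        return int(((angle + sector / 2) // sector) % n_items)

    def push_through_activated(depth, border_depth=0.015):
        """'Exceed border' activation: commit the selection once the stylus
        has been pushed past the menu plane by more than border_depth."""
        return depth > border_depth

    print(pie_selection(0.0, 0.05, n_items=8))  # 2 (sector containing +y)
    print(push_through_activated(0.02))         # True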

    Expressive feedback from virtual buttons

    The simple action of pressing a button is a multimodal interaction with an interesting depth of complexity. As computer interfaces to support 3D tasks evolve, there is a need to better understand how users will interact with virtual buttons that generate feedback from multiple sensory modalities. This research examined the effects of visual, auditory, and haptic feedback from virtual buttons on task performance when dialing phone numbers and on the motion of individual button presses. It also presents a theoretical framework for virtual button feedback and a model of virtual button feedback that includes touch feedback hysteresis. The results suggest that although haptic feedback alone was not enough to prevent participants from pressing the button farther than necessary, bimodal and trimodal feedback combinations that included haptic feedback shortened the depth of the presses. However, the shallower presses observed during trimodal feedback may have led to a counterintuitive increase in the number of digits that participants omitted during the task. Even though interaction with virtual buttons may appear simple, it is important to understand the complexities behind this multimodal interaction because users will seek out the multimodal interactions they prefer.
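    The Python sketch below, with illustrative thresholds rather than values from the study, shows what press/release hysteresis of the kind modelled in this work typically looks like: the button activates once the finger travels past a deeper threshold, and does not release until it withdraws above a shallower one, so small tremors near the activation point do not fire repeated events.

    class VirtualButton:
        """Minimal sketch of a virtual button with press/release hysteresis."""

        def __init__(self, press_depth=0.006, release_depth=0.003):
            self.press_depth = press_depth      # metres of travel to activate
            self.release_depth = release_depth  # metres of travel to release
            self.pressed = False

        def update(self, depth):
            """Feed the current press depth; return 'press', 'release', or None."""
            if not self.pressed and depth >= self.press_depth:
                self.pressed = True
                return "press"
            if self.pressed and depth <= self.release_depth:
                self.pressed = False
                return "release"
            return None

    button = VirtualButton()
    for d in (0.002, 0.007, 0.005, 0.004, 0.002):
        print(d, button.update(d))
    # Only one 'press' fires even though the depth fluctuates between samples,
    # and 'release' waits until the finger rises above the shallower threshold.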

    Exploring 3D User Interface Technologies for Improving the Gaming Experience

    3D user interface technologies have the potential to make games more immersive and engaging and thus potentially provide a better user experience to gamers. Although 3D user interface technologies are available for games, it is still unclear how their usage affects game play and whether there are any user performance benefits. A systematic study of these technologies in game environments is required to understand how game play is affected and how their usage can be optimized to achieve a better game play experience. This dissertation seeks to improve the gaming experience by exploring several 3DUI technologies. In this work, we focused on stereoscopic 3D viewing (to improve the viewing experience) coupled with motion-based control, head tracking (to make games more engaging), and faster gesture-based menu selection (to reduce the cognitive burden associated with menu interaction while playing). We first studied each of these technologies in isolation to understand their benefits for games. We present the results of our experiments evaluating the benefits of stereoscopic 3D (when coupled with motion-based control) and head tracking in games. We discuss the reasons behind these findings and provide recommendations for game designers who want to use these technologies to enhance gaming experiences. We also present the results of our experiments with finger-based menu selection techniques, with the aim of finding the fastest technique. Based on these findings, we custom designed an air-combat game prototype that simultaneously uses stereoscopic 3D, head tracking, and finger-count shortcuts to show that these technologies can be useful for games if the game is designed with them in mind. Additionally, to enhance depth discrimination and minimize visual discomfort, the game dynamically optimizes stereoscopic 3D parameters (convergence and separation) based on the user's look direction. We conducted a within-subjects experiment in which we examined performance data and self-reported data on users' perception of the game. Our results indicate that participants performed significantly better when all the 3DUI technologies (stereoscopic 3D, head tracking, and finger-count gestures) were available simultaneously, with head tracking as the dominant factor. We explore the individual contribution of each of these technologies to the overall gaming experience and discuss the reasons behind our findings. Our experiments indicate that 3D user interface technologies can make the gaming experience better if used effectively. Games must be designed to make use of the available 3D user interface technologies in order to provide a better gaming experience to the user. We explored a few such technologies as part of this work and obtained design guidelines for future game designers. We hope that our work will serve as a framework for future explorations of making games better using 3D user interface technologies.
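    The dissertation's actual formulas are not given in the abstract, so the Python sketch below is only an assumed illustration of the idea of driving convergence and separation from the depth of the object under the player's look direction. The separation scaling and clamp values are placeholders.

    def stereo_parameters(gaze_depth, eye_separation=0.064,
                          min_separation=0.01, max_separation=0.064):
        """Place the convergence plane at the gazed-at depth and scale the
        camera separation down for near objects to bound screen disparity.

        gaze_depth: distance (m) from the camera to the object under the
        player's look direction (e.g. from a head-tracking ray cast).
        """
        convergence = gaze_depth
        # Close objects get a narrower camera baseline so disparity
        # (and visual discomfort) stays within a comfortable range.
        separation = min(max_separation,
                         max(min_separation, eye_separation * gaze_depth / 2.0))
        return convergence, separation

    print(stereo_parameters(0.5))   # near object: reduced separation
    print(stereo_parameters(10.0))  # distant object: full separation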

    Using proprioception to support menu item selection in virtual reality

    Dissertation (MIS (Multimedia)), University of Pretoria, 2023. There is an abundance of literature that informs menu system design, specifically in the context of a two-dimensional flat monitor display. The guidelines used to inform menu system design for such displays were reconsidered to identify criteria that can inform the design of a menu system used in a three-dimensional (3D) virtual environment based on immersive virtual reality technology. Given the immersive nature of such technologies, it can be hypothesized that proprioception, the sense used to establish awareness of objects and space in the physical environment, can be transferred into the virtual environment to guide menu item selection. Various properties of menu system design were investigated to identify those that can be used together with proprioception to support menu item selection. Further investigation into the use of proprioception in a 3D virtual environment revealed that spatial awareness and memory need to be established first, so the criteria informing the design of proprioception-supported menu item selection needed to take this into consideration as well. Consequently, a menu system was designed and developed based on the identified criteria to test their feasibility for informing the design of a menu system in a 3D virtual environment that enables users to rely on non-visual senses to guide their selections. The system was built with commercially available hardware and software to ensure that the findings of this study are accessible to the general public. The results show that participants were able to establish spatial awareness and develop familiarity with the 3D virtual environment, enabling them to use proprioception, along with their visual senses and haptic feedback, to improve their ability to select menu items. The results also revealed that participants relied on visual guidance for menu item selection to varying degrees, and that these varying levels of reliance were based on personal preference.
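    The thesis's concrete menu layout is not described in the abstract; the Python sketch below is an assumed illustration of one common way proprioceptive selection is supported: anchoring each menu item at a fixed offset in the user's body frame so that, with practice, items can be reached from muscle memory rather than visual search. The item names, offsets, and reach radius are hypothetical, and torso orientation is ignored for simplicity.

    import math

    MENU_OFFSETS = {            # hypothetical item anchors, metres, torso frame
        "teleport": (0.3, 0.0, 0.2),
        "inventory": (-0.3, 0.0, 0.2),
        "settings": (0.0, -0.3, 0.2),
    }

    def select_item(hand_pos, torso_pos, reach_radius=0.12):
        """Return the menu item whose body-relative anchor is nearest the hand,
        or None if the hand is outside every item's reach radius."""
        rel = tuple(h - t for h, t in zip(hand_pos, torso_pos))
        best, best_dist = None, reach_radius
        for name, offset in MENU_OFFSETS.items():
            dist = math.dist(rel, offset)
            if dist < best_dist:
                best, best_dist = name, dist
        return best

    print(select_item((0.28, 0.02, 0.21), (0.0, 0.0, 0.0)))  # 'teleport'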