
    Constructing sonified haptic line graphs for the blind student: first steps

    Line graphs are an established information visualisation and analysis technique, taught at various levels of difficulty in standard Mathematics curricula. It has been argued that blind individuals cannot use line graphs as a visualisation and analysis tool because they currently exist primarily in the visual medium. The research described in this paper aims to make line graphs accessible to blind students through auditory and haptic media. We describe (1) our design space for representing line graphs, (2) the technology we use to develop our prototypes, and (3) the insights from our preliminary work.
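    As a rough illustration of the auditory side of such a design space, the sketch below maps a line graph's y-values to tone pitch, a common sonification strategy; the function name, frequency range, and synthesis details are assumptions for this example, not the paper's prototype.

```python
# Illustrative sketch only: a minimal pitch-based sonification of a line graph.
# Assumes numpy is available; write the returned samples to a WAV file to listen.
import numpy as np

def sonify_line_graph(y_values, duration=0.25, f_min=220.0, f_max=880.0, sr=44100):
    """Map each data point's y-value to a short tone whose pitch lies between f_min and f_max."""
    y = np.asarray(y_values, dtype=float)
    # Normalise the data to [0, 1] so the lowest point maps to f_min and the highest to f_max.
    y_norm = (y - y.min()) / (y.max() - y.min() + 1e-12)
    freqs = f_min + y_norm * (f_max - f_min)
    t = np.linspace(0.0, duration, int(sr * duration), endpoint=False)
    # One short sine tone per data point, played in sequence.
    tones = [np.sin(2 * np.pi * f * t) for f in freqs]
    return np.concatenate(tones)

# Example: a rising then falling line is heard as rising then falling pitch.
samples = sonify_line_graph([1, 3, 5, 7, 5, 3, 1])
```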

    Enabling audio-haptics

    This thesis deals with possible solutions for facilitating orientation, navigation, and overview of non-visual interfaces and virtual environments with the help of sound in combination with force-feedback haptics. Applications with haptic force-feedback …

    Doctor of Philosophy

    Virtual environments provide a consistent and relatively inexpensive method of training individuals. They often include haptic feedback in the form of forces applied to a manipulandum or thimble to provide a more immersive and educational experience. However, the limited haptic feedback provided in these systems tends to be restrictive and frustrating to use. Providing tactile feedback in addition to this kinesthetic feedback can enhance the user's ability to manipulate and interact with virtual objects while providing a greater level of immersion. This dissertation advances the state of the art by providing a better understanding of tactile feedback and advancing combined tactile-kinesthetic systems. The tactile feedback described within this dissertation is provided by a finger-mounted device called the contact location display (CLD). Rather than displaying the entire contact surface, the device feeds back information only about the center of contact between the user's finger and a virtual surface. In prior work, the CLD used specialized two-dimensional environments to provide smooth tactile feedback. Using polygonal environments would greatly enhance the device's usefulness; however, the surface discontinuities created by the facets on these models are rendered through the CLD regardless of traditional force shading algorithms. To address this issue, a haptic shading algorithm was developed to provide smooth tactile and kinesthetic interaction with general polygonal models. Two experiments were used to evaluate the shading algorithm. To better understand the design requirements of tactile devices, three separate experiments were run to evaluate the perception thresholds for cue localization, backlash, and system delay. These experiments establish quantitative design criteria for tactile devices, which can serve as the maximum (i.e., most demanding) device specifications for tactile-kinesthetic haptic systems where the user experiences tactile feedback as a function of his or her limb motions. Lastly, a revision of the CLD was constructed and evaluated. By taking the newly established design criteria into account, the CLD device became smaller and lighter while providing a full two-degree-of-freedom workspace that covers the bottom hemisphere of the finger. Two simple manipulation experiments were used to evaluate the new CLD device.
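    To make the force shading idea concrete, here is a minimal, generic sketch of normal-interpolated ("force-shaded") contact force rendering over one triangle of a polygonal model; the function, stiffness value, and penalty-force model are illustrative assumptions, not the dissertation's tactile-kinesthetic shading algorithm.

```python
# Illustrative sketch only: classic force shading, i.e. blending vertex normals across a
# triangle so the rendered contact force direction varies smoothly instead of jumping at
# facet edges. Inputs are numpy arrays of shape (3,).
import numpy as np

def shaded_contact_force(p, v0, v1, v2, n0, n1, n2, penetration_depth, stiffness=500.0):
    """Return a penalty contact force whose direction is the barycentric blend of vertex normals."""
    # Barycentric coordinates of the contact point p within triangle (v0, v1, v2).
    a, b, c = v1 - v0, v2 - v0, p - v0
    d00, d01, d11 = a @ a, a @ b, b @ b
    d20, d21 = c @ a, c @ b
    denom = d00 * d11 - d01 * d01
    v = (d11 * d20 - d01 * d21) / denom
    w = (d00 * d21 - d01 * d20) / denom
    u = 1.0 - v - w
    # Interpolated ("shaded") normal removes the per-facet discontinuity.
    n = u * n0 + v * n1 + w * n2
    n /= np.linalg.norm(n)
    # Simple spring-like penalty force along the shaded normal.
    return stiffness * penetration_depth * n
```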

    Sonic Interactions in Virtual Environments

    This open access book tackles the design of 3D spatial interactions from an audio-centered, audio-first perspective, providing the fundamental notions related to the creation and evaluation of immersive sonic experiences. The key elements that enhance the sensation of place in a virtual environment (VE) are: immersive audio (the computational aspects of the acoustical-space properties of Virtual Reality (VR) technologies); sonic interaction (the human-computer interplay through auditory feedback in VEs); and VR systems (which naturally support multimodal integration, impacting different application domains). Sonic Interactions in Virtual Environments features state-of-the-art research on real-time auralization, sonic interaction design in VR, quality of the experience in multimodal scenarios, and applications. Contributors and editors include interdisciplinary experts from the fields of computer science, engineering, acoustics, psychology, design, humanities, and beyond. Their mission is to shape an emerging field of study at the intersection of sonic interaction design and immersive media, embracing an archipelago of existing research spread across different audio communities and raising awareness among VR researchers and practitioners of the importance of sonic elements when designing immersive environments.
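    As a toy illustration of the kind of acoustical-space computation covered under "immersive audio", the sketch below applies inverse-distance attenuation and constant-power stereo panning to a single virtual source; the function and its parameters are assumptions for this example and are not drawn from the book.

```python
# Illustrative sketch only: distance attenuation plus constant-power panning for one
# point source in a virtual environment, expressed in the listener's frame.
import math

def pan_and_attenuate(source_xy, listener_xy, listener_yaw_rad, ref_distance=1.0):
    """Return (left_gain, right_gain) for a point source relative to the listener."""
    dx = source_xy[0] - listener_xy[0]
    dy = source_xy[1] - listener_xy[1]
    distance = max(math.hypot(dx, dy), 1e-6)
    # Inverse-distance attenuation, clamped so very close sources do not blow up.
    gain = min(ref_distance / distance, 1.0)
    # Angle of the source relative to the listener's facing direction (yaw = 0 faces +x).
    angle = math.atan2(dy, dx) - listener_yaw_rad
    # Constant-power pan: +1 = fully left, -1 = fully right in the listener frame.
    pan = max(-1.0, min(1.0, math.sin(angle)))
    left = gain * math.sqrt((1.0 + pan) / 2.0)
    right = gain * math.sqrt((1.0 - pan) / 2.0)
    return left, right

# Example: a source two metres directly to the listener's left.
print(pan_and_attenuate((0.0, 2.0), (0.0, 0.0), 0.0))
```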

    Ambient Intelligence for Next-Generation AR

    Next-generation augmented reality (AR) promises a high degree of context-awareness: a detailed knowledge of the environmental, user, social, and system conditions in which an AR experience takes place. This will facilitate both the closer integration of the real and virtual worlds and the provision of context-specific content or adaptations. However, environmental awareness in particular is challenging to achieve using AR devices alone; not only is these mobile devices' view of an environment spatially and temporally limited, but the data obtained by onboard sensors is frequently inaccurate and incomplete. This, combined with the fact that many aspects of core AR functionality and user experiences are impacted by properties of the real environment, motivates the use of ambient IoT devices (wireless sensors and actuators placed in the surrounding environment) for the measurement and optimization of environment properties. In this book chapter we categorize and examine the wide variety of ways in which these IoT sensors and actuators can support or enhance AR experiences, including quantitative insights and proof-of-concept systems that will inform the development of future solutions. We outline the challenges and opportunities associated with several important research directions which must be addressed to realize the full potential of next-generation AR. Comment: This is a preprint of a book chapter which will appear in the Springer Handbook of the Metaverse.
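    As a small, hypothetical illustration of an ambient sensor feeding an AR adaptation, the sketch below scales virtual-overlay brightness from an ambient-light reading; the function, lux range, and adaptation rule are assumptions for this example, not a system from the chapter.

```python
# Illustrative sketch only: one way an AR client might use an environment-mounted
# light sensor to keep virtual content legible as real-world lighting changes.
def adapt_overlay_brightness(ambient_lux, min_brightness=0.2, max_brightness=1.0):
    """Scale virtual-overlay brightness with ambient illuminance (lux)."""
    # Assume roughly 0 lux (dark room) to 1000 lux (bright office) as the useful range.
    level = max(0.0, min(ambient_lux / 1000.0, 1.0))
    return min_brightness + level * (max_brightness - min_brightness)

# Example: readings streamed from an ambient IoT light sensor.
for lux in (5, 150, 800):
    print(lux, "->", round(adapt_overlay_brightness(lux), 2))
```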

    A white paper: NASA virtual environment research, applications, and technology

    Research support for Virtual Environment technology development has been a part of NASA's human factors research program since 1985. Under the auspices of the Office of Aeronautics and Space Technology (OAST), initial funding was provided to the Aerospace Human Factors Research Division at Ames Research Center, which resulted in the origination of this technology. Since 1985, other Centers have begun using and developing this technology, and at each research and space flight center, NASA missions have been major drivers of the work. This White Paper was the joint effort of all the Centers that have been involved in the development of the technology and its applications to their unique missions. Appendix A lists those who worked to prepare the document, directed by Dr. Cynthia H. Null, Ames Research Center, and Dr. James P. Jenkins, NASA Headquarters. This White Paper describes the technology and its applications in NASA Centers (Chapters 1, 2, and 3), the potential roles it can take in NASA (Chapters 4 and 5), and a roadmap for the next 5 years (FY 1994-1998). The audience for this White Paper consists of managers, engineers, scientists, and the general public with an interest in Virtual Environment technology. Those who read the paper will determine whether this roadmap, or another, is to be followed.

    Designing for Mixed Reality Urban Exploration

    This paper introduces a design framework for mixed reality urban exploration (MRUE), based on a concrete implementation in a historical city. The framework integrates different modalities, such as virtual reality (VR), augmented reality (AR), and haptic-audio interfaces, as well as advanced features such as personalized recommendations, social exploration, and itinerary management. It makes it possible to address a number of concerns regarding information overload, safety, and quality of the experience, which are not sufficiently tackled in traditional non-integrated approaches. This study presents an integrated mobile platform built on top of this framework and reflects on the lessons learned.

    Sensory Communication

    Contains table of contents for Section 2, an introduction, and reports on fourteen research projects.
    National Institutes of Health Grant RO1 DC00117; National Institutes of Health Grant RO1 DC02032; National Institutes of Health/National Institute on Deafness and Other Communication Disorders Grant R01 DC00126; National Institutes of Health Grant R01 DC00270; National Institutes of Health Contract N01 DC52107; U.S. Navy - Office of Naval Research/Naval Air Warfare Center Contract N61339-95-K-0014; U.S. Navy - Office of Naval Research/Naval Air Warfare Center Contract N61339-96-K-0003; U.S. Navy - Office of Naval Research Grant N00014-96-1-0379; U.S. Air Force - Office of Scientific Research Grant F49620-95-1-0176; U.S. Air Force - Office of Scientific Research Grant F49620-96-1-0202; U.S. Navy - Office of Naval Research Subcontract 40167; U.S. Navy - Office of Naval Research/Naval Air Warfare Center Contract N61339-96-K-0002; National Institutes of Health Grant R01-NS33778; U.S. Navy - Office of Naval Research Grant N00014-92-J-184