
    Multimodal “sensory illusions” for improving spatial awareness in virtual environments

    Inaccurate judgement of distances in virtual environments (VEs) restricts their usefulness for engineering development, in which engineers must have a good understanding of the spaces they are designing. Multimodal feedback can improve depth perception in VEs, but this has yet to be implemented and tested in engineering applications with systems which provide haptic feedback to the body. The project reported in this paper will develop a multimodal VE to improve engineers’ understanding of 3D spaces. It will test the concept of “sensory illusions”, where the point of collision in the VE differs from the point of haptic feedback on the body. This will permit the use of fewer vibrotactile devices and therefore the development of a more wearable system. This paper describes related work in multisensory and tactile stimulation which suggests that our perception of a stimulus is not fixed to the point of contact.
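The core mechanism described above, delivering feedback at a nearby actuator rather than at the exact collision site, can be sketched as a nearest-actuator lookup. This is a minimal illustration, not the paper's implementation; the actuator names and positions are assumptions.

```python
import math

# Hypothetical sparse actuator layout: a few vibrotactile motors on the
# torso, positioned in the body's local coordinate frame (metres).
ACTUATORS = {
    "chest":     (0.0, 1.3, 0.10),
    "left_rib":  (-0.15, 1.2, 0.0),
    "right_rib": (0.15, 1.2, 0.0),
    "back":      (0.0, 1.3, -0.10),
}

def nearest_actuator(collision_point):
    """Return the name of the actuator closest to the VE collision point.

    The 'sensory illusion' idea is that feedback delivered at this nearby
    (but not exact) site can still be perceived as coming from the
    collision point, allowing a sparse, more wearable actuator layout.
    """
    return min(ACTUATORS,
               key=lambda name: math.dist(ACTUATORS[name], collision_point))

# A collision slightly left of the chest is routed to the left-rib motor.
print(nearest_actuator((-0.12, 1.25, 0.02)))
```

The fewer actuators the layout contains, the larger the offset between collision point and feedback site that the illusion must bridge.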

    An Introduction to 3D User Interface Design

    3D user interface design is a critical component of any virtual environment (VE) application. In this paper, we present a broad overview of three-dimensional (3D) interaction and user interfaces. We discuss the effect of common VE hardware devices on user interaction, as well as interaction techniques for generic 3D tasks and the use of traditional two-dimensional interaction styles in 3D environments. We divide most user interaction tasks into three categories: navigation, selection/manipulation, and system control. Throughout the paper, our focus is on presenting not only the available techniques, but also practical guidelines for 3D interaction design and widely held myths. Finally, we briefly discuss two approaches to 3D interaction design, and some example applications with complex 3D interaction requirements. We also present an annotated online bibliography as a reference companion to this article.
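The three-category task taxonomy mentioned in the abstract can be made concrete with a small lookup; the technique-to-category assignments below are illustrative examples, not drawn from the paper.

```python
from enum import Enum, auto

class InteractionTask(Enum):
    """The three generic 3D task categories from the overview above."""
    NAVIGATION = auto()              # travel and wayfinding
    SELECTION_MANIPULATION = auto()  # picking and positioning objects
    SYSTEM_CONTROL = auto()          # menus, commands, mode changes

# Toy lookup assigning some well-known 3D interaction techniques to a
# category; the groupings here are the author's illustration.
TECHNIQUE_CATEGORY = {
    "ray-casting": InteractionTask.SELECTION_MANIPULATION,
    "go-go": InteractionTask.SELECTION_MANIPULATION,
    "gaze-directed steering": InteractionTask.NAVIGATION,
    "hand-held menu": InteractionTask.SYSTEM_CONTROL,
}

def categorize(technique: str) -> InteractionTask:
    return TECHNIQUE_CATEGORY[technique]
```

Grouping techniques this way lets a VE toolkit swap one technique for another within the same category without changing the rest of the application.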

    Artificial Tongue-Placed Tactile Biofeedback for perceptual supplementation: application to human disability and biomedical engineering

    The present paper introduces the innovative technologies, based on the concept of "sensory substitution" or "perceptual supplementation", that we are developing in the fields of human disability and biomedical engineering. Specifically, our goal is to design, develop and validate practical assistive biomedical and/or technical devices and rehabilitation procedures for persons with disabilities, using artificial tongue-placed tactile biofeedback systems. The proposed applications deal with: (1) pressure-sore prevention in cases of spinal cord injury (persons with paraplegia or tetraplegia); (2) ankle proprioceptive acuity improvement for driving assistance in older and/or disabled adults; and (3) balance control improvement to prevent falls in older and/or disabled adults. This paper presents the results of three feasibility studies performed on young healthy adults.

    Crossmodal spatial location: initial experiments

    This paper describes an alternative form of interaction for mobile devices using crossmodal output. The aim of our work is to investigate the equivalence of audio and tactile displays so that the same messages can be presented in one form or another. Initial experiments show that spatial location can be perceived as equivalent in both the auditory and tactile modalities. Results show that participants are able to map presented 3D audio positions to tactile body positions on the waist most effectively when mobile, and that significantly more errors are made when using the ankle or wrist. This paper compares the results from both a static and a mobile experiment on crossmodal spatial location and outlines the most effective ways to use this crossmodal output in a mobile context.
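The crossmodal equivalence described above rests on a spatial mapping from sound direction to a body site. A minimal sketch of such a mapping, quantising a 3D-audio azimuth to one of a ring of waist-belt tactors, is shown below; the tactor count and angular convention are assumptions, not taken from the paper.

```python
def azimuth_to_tactor(azimuth_deg: float, n_tactors: int = 8) -> int:
    """Quantise a sound-source azimuth to the nearest waist tactor index.

    Convention assumed here: 0 degrees is straight ahead (tactor 0),
    angles increase clockwise, and n_tactors motors are spaced evenly
    around the waist.
    """
    step = 360 / n_tactors
    return round((azimuth_deg % 360) / step) % n_tactors

# A source just past the right hip (95 degrees) maps to tactor 2 of 8.
print(azimuth_to_tactor(95))
```

With eight tactors each motor covers a 45-degree sector, so localisation error in either modality only matters once it crosses a sector boundary.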

    The Analysis of design and manufacturing tasks using haptic and immersive VR - Some case studies

    The use of virtual reality in interactive design and manufacture has been researched extensively, but the practical application of this technology in industry is still very much in its infancy. This is surprising, as one would have expected that, after some 30 years of research, commercial applications of interactive design or manufacturing planning and analysis would be widespread throughout the product design domain. One of the major but less well known advantages of VR technology is that logging the user gives a great deal of rich data which can be used to automatically generate designs or manufacturing instructions, analyse design and manufacturing tasks, map engineering processes and, tentatively, acquire expert knowledge. The authors feel that the benefits of VR in these areas have not been fully disseminated to the wider industrial community and - with the advent of cheaper PC-based VR solutions - perhaps a wider appreciation of the capabilities of this type of technology may encourage companies to adopt VR solutions for some of their product design processes. With this in mind, this paper will describe in detail applications of haptics in assembly, demonstrating how user task logging can lead to the analysis of design and manufacturing tasks at a level of detail not previously possible, as well as giving usable engineering outputs. The haptic 3D VR study involves the use of a Phantom and 3D system to analyse and compare this technology against real-world user performance. This work demonstrates that the detailed logging of tasks in a virtual environment gives considerable potential for understanding how virtual tasks can be mapped onto their real-world equivalents, as well as showing how haptic process plans can be generated in a similar manner to the conduit design and assembly planning HMD VR tool reported in PART A.
The paper concludes with a view as to how the authors feel that the use of VR systems in product design and manufacturing should evolve in order to enable the industrial adoption of this technology in the future.
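The user-task logging idea above can be illustrated with a minimal event-log structure; the record fields and derived metric below are the author's illustration of the kind of data such a system might capture, not the paper's actual schema.

```python
import time
from dataclasses import dataclass, field

@dataclass
class HapticEvent:
    """One logged user action in a haptic VE assembly session (hypothetical)."""
    timestamp: float
    tool_position: tuple  # (x, y, z) of the haptic stylus tip
    force: tuple          # reaction force vector reported by the device
    action: str           # e.g. "grasp", "move", "release", "collide"

@dataclass
class SessionLog:
    """Accumulates events; analyses like task timing fall out of the log."""
    events: list = field(default_factory=list)

    def record(self, position, force, action):
        self.events.append(HapticEvent(time.time(), position, force, action))

    def task_duration(self) -> float:
        """Elapsed seconds between the first and last logged event."""
        if len(self.events) < 2:
            return 0.0
        return self.events[-1].timestamp - self.events[0].timestamp
```

From a stream of such records one can, as the abstract suggests, reconstruct assembly sequences or compare virtual task timings against their real-world equivalents.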

    Using Wii technology to explore real spaces via virtual environments for people who are blind

    Purpose – Virtual environments (VEs) that represent real spaces (RSs) give people who are blind the opportunity to build a cognitive map in advance, which they can use when arriving at the RS. Design – In this research study, Nintendo Wii-based technology was used for exploring VEs via the Wiici application. The Wiimote allows the user to interact with VEs by simulating walking and scanning the space. Findings – Through haptic and auditory feedback, users learned to explore new spaces. We examined the participants' abilities to explore new simple and complex places, construct a cognitive map, and perform orientation tasks in the RS. Originality – To our knowledge, this work presents the first virtual environment for people who are blind that allows participants to scan the environment and thereby construct map-model spatial representations.

    Owning an overweight or underweight body: distinguishing the physical, experienced and virtual body

    Our bodies are the most intimately familiar objects we encounter in our perceptual environment. Virtual reality provides a unique method to allow us to experience having a very different body from our own, thereby providing a valuable method to explore the plasticity of body representation. In this paper, we show that women can experience ownership over a whole virtual body that is considerably smaller or larger than their physical body. In order to gain a better understanding of the mechanisms underlying body ownership, we use an embodiment questionnaire, and introduce two new behavioral response measures: an affordance estimation task (indirect measure of body size) and a body size estimation task (direct measure of body size). Interestingly, after viewing the virtual body from a first-person perspective, both the affordance and the body size estimation tasks indicate a change in the perception of the size of the participant’s experienced body. The change is biased by the size of the virtual body (overweight or underweight). Another novel aspect of our study is that we distinguish between the physical, experienced and virtual bodies, by asking participants to provide affordance and body size estimations for each of the three bodies separately. This methodological point is important for virtual reality experiments investigating body ownership of a virtual body, because it offers a better understanding of which cues (e.g. visual, proprioceptive, memory, or a combination thereof) influence body perception, and whether the impact of these cues can vary between different setups.