
    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2022, held in Hamburg, Germany, in May 2022. The 36 regular papers included in this book were carefully reviewed and selected from 129 submissions. They are organized in topical sections as follows: haptic science; haptic technology; and haptic applications.

    Engineering data compendium. Human perception and performance. User's guide

    The concept underlying the Engineering Data Compendium was the product of a research and development program (Integrated Perceptual Information for Designers project) aimed at facilitating the application of basic research findings in human performance to the design of military crew systems. The principal objective was to develop a workable strategy for: (1) identifying and distilling information of potential value to system design from the existing research literature, and (2) presenting this technical information in a way that would aid its accessibility, interpretability, and applicability for system designers. The present four volumes of the Engineering Data Compendium represent the first implementation of this strategy. This is the first volume, the User's Guide, containing a description of the program and instructions for its use.

    Electronic systems for the restoration of the sense of touch in upper limb prosthetics

    In the last few years, research on active prostheses for upper limbs has focused on improving functionality and control. New methods have been proposed for measuring the user's muscle activity and translating it into prosthesis control commands. Developing the feed-forward interface so that the prosthesis better follows the intention of the user is an important step towards improving the quality of life of people with limb amputation. However, prosthesis users can neither feel whether something or someone is touching them through the prosthesis nor perceive the temperature or roughness of objects. Sight provides them with most of their information: they can identify an object by looking at it, but cannot detect anything about it otherwise. Therefore, to foster prosthesis embodiment and utility, it is necessary to have a prosthetic system that not only responds to the control signals provided by the user, but also transmits back to the user information about the current state of the prosthesis. This thesis presents an electronic skin system to close the loop in prostheses towards the restoration of the sense of touch in prosthesis users. The proposed electronic skin system includes advanced distributed sensing (electronic skin), a system for (i) signal conditioning, (ii) data acquisition, and (iii) data processing, and a stimulation system. The idea is to integrate all these components into a myoelectric prosthesis. Embedding the electronic system and the sensing materials is a critical issue in the development of new prostheses. In particular, processing the data originating from the electronic skin into low- or high-level information is the key issue to be addressed by the embedded electronic system. Recently, Machine Learning has proved to be a promising approach for processing tactile sensor information, and many studies have shown its effectiveness in the classification of input touch modalities. More specifically, this thesis focuses on the stimulation system, which communicates a mechanical interaction from the electronic skin to prosthesis users, and on the dedicated implementation of algorithms for processing tactile data originating from the electronic skin. On the system level, the thesis provides the design of the experimental setup, the experimental protocol, and the algorithms to process tactile data. On the architectural level, the thesis proposes a design flow for the implementation of digital circuits for both FPGAs and integrated circuits, and techniques for the power management of embedded systems running Machine Learning algorithms.
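    The classification of touch modalities from electronic-skin data that this abstract describes can be made concrete with a small sketch. The following is a minimal illustration of the general technique only, not the thesis's actual pipeline: the sensor layout (an 8x8 taxel patch), the hand-crafted features, and the press/slide/tap labels are all assumptions, and the data are synthetic stand-ins.

```python
# Hypothetical sketch: classifying touch modalities (press, slide, tap)
# from electronic-skin pressure recordings. Layout, features, and labels
# are assumptions for illustration; the data below are synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
N_SAMPLES, N_FRAMES, N_TAXELS = 300, 50, 64  # 50 time steps, 8x8 taxel patch

# Synthetic stand-in for recorded tactile data: (samples, frames, taxels)
X_raw = rng.normal(size=(N_SAMPLES, N_FRAMES, N_TAXELS))
y = rng.integers(0, 3, size=N_SAMPLES)  # 0=press, 1=slide, 2=tap (assumed)

# Simple hand-crafted features per taxel: mean and peak response over time
features = np.concatenate([X_raw.mean(axis=1), X_raw.max(axis=1)], axis=1)

X_train, X_test, y_train, y_test = train_test_split(
    features, y, test_size=0.25, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

    In an embedded setting like the one the thesis targets, a compact classifier of this kind would run on the prosthesis's own hardware, which is why the power-management and FPGA design-flow questions mentioned above matter.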

    Integration of sight, hearing and touch in human cerebral cortex

    While each individual sensory modality provides us with information about a specific aspect of our environment, our senses must be integrated for us to interact with the environment in a meaningful way. My thesis describes studies of the interactions between somatosensation, vision, and audition, using functional Magnetic Resonance Imaging (fMRI) of normal human subjects as the primary method. In order to study somatosensation with fMRI, we first built an MRI-compatible tactile-stimulation apparatus. This apparatus was then used for four separate studies. In the first study, we investigated tactile responses in the lateral occipital lobe, a brain region traditionally considered "visual" cortex. We found that visual area MST, but not visual area MT, responded to tactile stimulation. In the second study, we investigated a possible homologue of a macaque multisensory area that integrates visual, auditory, and tactile information, called the Superior Temporal Polysensory area (STP). We found responses to tactile stimuli co-localized with auditory and visual responses in the posterior superior temporal sulcus; this is likely to be a human homologue of macaque STP. In the third study, we used Multi Voxel Pattern Analysis (MVPA) to demonstrate that this homologue of macaque STP (along with traditional "somatosensory" areas) can predict the location of tactile stimulation from fMRI data. In the fourth study, we used psychophysical techniques to analyze the effects of auditory stimuli on tactile perception. We found that auditory stimuli can influence detection, frequency perception, and the perception of the spatial location of vibrotactile stimuli. Two additional projects are also briefly described: the results of an effort to develop an MRI-compatible Transcranial Magnetic Stimulation (TMS) device, and a project from my summer internship in which I debugged a system capable of simultaneously stimulating and recording from cortical tissue.
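    The MVPA approach used in the third study can be illustrated with a minimal sketch. This is only the general idea, training a classifier on per-trial voxel patterns and checking for above-chance decoding; the voxel counts, the left/right-hand labels, and the data are invented stand-ins, not the study's actual analysis.

```python
# Minimal MVPA-style sketch: predict the site of tactile stimulation from
# per-trial fMRI voxel patterns. All numbers and labels here are assumed.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
N_TRIALS, N_VOXELS = 120, 500

y = rng.integers(0, 2, size=N_TRIALS)      # 0 = left hand, 1 = right hand (assumed)
X = rng.normal(size=(N_TRIALS, N_VOXELS))  # voxel activation pattern per trial
X[y == 1, :50] += 0.5                      # inject a weak class-specific signal

# Cross-validated decoding accuracy: reliably above-chance accuracy implies
# the region's activity pattern carries information about stimulus location.
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
print(f"decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```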

    Tactile Displays for Pedestrian Navigation

    Existing pedestrian navigation systems are mainly visual-based, sometimes with the addition of audio guidance. However, previous research has reported that visual-based navigation systems require a high level of cognitive effort, contributing to errors and delays. Furthermore, in many situations a person's visual and auditory channels may be compromised by environmental factors or occupied by other important tasks. Some research has suggested that the tactile sense can effectively be used in interfaces to support navigation tasks. However, many fundamental design and usability issues with pedestrian tactile navigation displays are yet to be investigated. This dissertation investigates the human-computer interaction aspects of designing tactile pedestrian navigation systems. More specifically, it addresses the following questions: What are appropriate forms of wearable devices? What types of spatial information should such systems provide to pedestrians? How do people use spatial information for different navigation purposes? How can we effectively represent such information via tactile stimuli? And how do tactile navigation systems perform? A series of empirical studies was carried out to (1) investigate the effects of tactile signal properties and manipulation on human perception of spatial data, (2) find effective forms of wearable displays for navigation tasks, and (3) explore a number of potential tactile representation techniques for spatial data, specifically for representing directions and landmarks. Questionnaires and interviews were used to gather information on the use of landmarks among people navigating urban environments for different purposes. Analysis of the results of these studies provided implications for the design of tactile pedestrian navigation systems, which we incorporated into a prototype. Finally, field trials were carried out to evaluate the design and to address usability issues and performance-related benefits and challenges. The thesis develops an understanding of how to represent spatial information via the tactile channel and provides suggestions for the design and implementation of tactile pedestrian navigation systems. In addition, the thesis classifies the use of various types of landmarks for different navigation purposes. These contributions are developed throughout the thesis, building upon an integrated series of empirical studies. (EThOS - Electronic Theses Online Service, United Kingdom)
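    One common direction-encoding technique in this design space is mapping the bearing to the next waypoint onto a ring of vibration motors worn as a belt. The sketch below illustrates that general idea only; the motor count, numbering, and function names are assumptions, not the dissertation's actual design.

```python
# Hypothetical direction encoding for a tactile belt: activate the motor
# that points toward the next waypoint. Eight motors are assumed, evenly
# spaced clockwise around the waist with motor 0 at the front.
import math

N_MOTORS = 8

def motor_for_bearing(user_heading_deg: float, target_bearing_deg: float) -> int:
    """Return the index of the motor closest to the direction of the target."""
    relative = (target_bearing_deg - user_heading_deg) % 360.0
    return round(relative / (360.0 / N_MOTORS)) % N_MOTORS

# Example: user faces north (0 deg), next waypoint lies due east (90 deg);
# under the assumed clockwise numbering this is motor 2, on the right side.
print(motor_for_bearing(0.0, 90.0))  # -> 2
```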

    Construction of a Cognitive Model for Tactile Sensation Generation Using a Glove-Type Vibrotactile Interface


    Tactile and Crossmodal Change Blindness and its Implications for Display Design.

    Data overload, especially in the visual channel, and associated breakdowns in monitoring already represent a major challenge in data-rich environments. One promising means of overcoming data overload is the introduction of multimodal displays, i.e., displays which distribute information across various sensory channels (including vision, audition, and touch). This approach has been shown to be effective in offloading the overburdened visual channel and thus reducing data overload. However, the effectiveness of these displays may be compromised if their design does not take into consideration the limitations of human perception and cognition. One important question is the extent to which the tactile modality is susceptible to change blindness. Change blindness refers to the failure to detect even large and expected changes when these changes coincide with a "transient" stimulus. To date, the phenomenon has been studied primarily in vision, but there is limited empirical evidence that the tactile modality may also be subject to change blindness. If confirmed, this raises concerns about the robustness of multimodal displays and their use. A series of research activities described in this dissertation sought to answer the following questions: (1) to what extent, and under what circumstances, is the sense of touch susceptible to change blindness; (2) does change blindness occur crossmodally between vision and touch; and (3) how effective are three different display types at overcoming these phenomena? The effects of transient type, transient duration, and task demands were also investigated in the context of Unmanned Aerial Vehicle (UAV) control, the selected domain of application. The findings confirmed the occurrence of intramodal tactile change blindness, but not of crossmodal change blindness. Subsequently, three countermeasures to intramodal tactile change blindness were developed and evaluated. The design of these countermeasures focused on supporting four of the five steps required for change detection, and was found to significantly improve performance compared to when no countermeasure was in place. Overall, this research adds to the knowledge base in multimodal and redundant information processing and can inform the design of multimodal displays not only for UAV control, but also for other complex, data-rich domains. (PhD dissertation, Industrial & Operations Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/108870/1/salu_1.pd)

    Engineering Data Compendium. Human Perception and Performance, Volume 1

    The concept underlying the Engineering Data Compendium was the product of an R and D program (Integrated Perceptual Information for Designers project) aimed at facilitating the application of basic research findings in human performance to the design of military crew systems. The principal objective was to develop a workable strategy for: (1) identifying and distilling information of potential value to system design from the existing research literature, and (2) presenting this technical information in a way that would aid its accessibility, interpretability, and applicability for system designers. The present four volumes of the Engineering Data Compendium represent the first implementation of this strategy. This is Volume 1, which contains sections on Visual Acquisition of Information, Auditory Acquisition of Information, and Acquisition of Information by Other Senses.

    Multi-Sensory Interaction for Blind and Visually Impaired People

    This book conveys the visual elements of artwork to the visually impaired through various sensory elements, opening a new perspective for appreciating visual artwork. In addition, a technique for expressing a color code by integrating patterns, temperatures, scents, music, and vibrations is explored, and future research topics are presented. A holistic experience using multi-sensory interaction is provided to people with visual impairment to convey the meaning and contents of a work through rich multi-sensory appreciation. A method that allows people with visual impairments to engage with artwork using a variety of senses, including touch, temperature, tactile pattern, and sound, helps them to appreciate artwork at a deeper level than can be achieved with hearing or touch alone. The development of such art-appreciation aids for the visually impaired will ultimately improve their cultural enjoyment and strengthen their access to culture and the arts. The development of these new-concept aids ultimately expands opportunities for the non-visually impaired as well as the visually impaired to enjoy works of art, and breaks down the boundaries between the disabled and the non-disabled in the field of culture and the arts through continuous efforts to enhance accessibility. In addition, the multi-sensory expression and delivery tool developed here can be used as an educational tool to increase product and artwork accessibility and usability through multi-modal interaction. Training with the multi-sensory experiences introduced in this book may lead to more vivid visual imagery, or seeing with the mind's eye.
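    The cross-sensory color code described above can be illustrated with a small sketch. The specific mappings below (hue to vibration frequency, lightness to temperature, saturation to pattern density) are invented for illustration and are not the book's actual encoding.

```python
# Hypothetical cross-sensory color code: map an sRGB color onto assumed
# vibration, temperature, and tactile-pattern parameters.
import colorsys

def color_to_haptics(r: int, g: int, b: int) -> dict:
    """Translate an sRGB color into illustrative multi-sensory parameters."""
    h, lightness, s = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
    return {
        "vibration_hz": 50 + h * 250,           # hue -> 50-300 Hz vibration
        "temperature_c": 20 + lightness * 15,   # darker/cooler to lighter/warmer
        "pattern_density": s,                   # saturation -> tactile pattern density
    }

print(color_to_haptics(255, 0, 0))      # a saturated red
print(color_to_haptics(120, 120, 220))  # a muted blue
```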