
    Tactile Language for a Head-Mounted Sensory Augmentation Device

    Sensory augmentation is one of the most exciting domains for research in human-machine biohybridicity. This paper presents the design of a 2nd generation vibrotactile helmet, a sensory augmentation prototype being developed to help users navigate in low visibility environments. The paper outlines a study in which the user navigates along a virtual wall whilst the position and orientation of the user’s head are tracked by a motion capture system. Vibrotactile feedback is presented according to the user’s distance from the virtual wall and their head orientation. The research builds on our previous work by developing a simplified “tactile language” for communicating navigation commands. A key goal is to identify language tokens suited to a head-mounted tactile interface that are maximally informative and intuitive, minimize information overload, and have the potential to become ‘experientially transparent’.
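The command-selection step described above can be sketched as a small state-to-token mapping. The thresholds, target distance, and token names below are illustrative assumptions, not the paper's actual parameters:

```python
# Hypothetical sketch: pick a tactile "language token" from the tracked head
# state (distance to a virtual wall, head yaw relative to the wall).
# All thresholds and token names are illustrative assumptions.

def select_token(distance_m: float, yaw_deg: float,
                 target_m: float = 1.0, band_m: float = 0.25) -> str:
    """Map tracked head state to a navigation command token."""
    if abs(yaw_deg) > 30:                 # head turned too far: correct heading first
        return "turn_right" if yaw_deg > 0 else "turn_left"
    if distance_m < target_m - band_m:    # too close to the wall
        return "veer_away"
    if distance_m > target_m + band_m:    # drifting away from the wall
        return "veer_toward"
    return "forward"                      # inside the comfort band

print(select_token(1.6, 5.0))   # drifted away from the wall -> "veer_toward"
```

A real system would smooth the tracked pose before thresholding, so that sensor jitter does not flip the token on every frame.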

    Low-fi skin vision: A case study in rapid prototyping a sensory substitution system

    We describe the design process we have used to develop a minimal, twenty-vibration-motor Tactile Vision Sensory Substitution (TVSS) system which enables blindfolded subjects to successfully track and bat a rolling ball and thereby experience 'skin vision'. We have employed a low-fi rapid prototyping approach to build this system and argue that this methodology is particularly effective for building embedded interactive systems. We support this argument in two ways: first, by drawing on theoretical insights from robotics, a discipline that also has to deal with the challenge of building complex embedded systems that interact with their environments; second, by using the development of our TVSS as a case study, describing the series of prototypes that led to our successful design and highlighting what we learnt at each stage.
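The core "skin vision" mapping can be sketched as downsampling an image to one cell per motor and driving each motor in proportion to local brightness. The grid size matches the system's twenty motors; the averaging scheme itself is an illustrative assumption:

```python
# Minimal sketch of a camera-to-skin mapping: downsample a frame to a 4x5
# grid (one cell per vibration motor) and set each of the 20 motor duty
# cycles from the mean brightness of its cell. Illustrative assumption,
# not the paper's actual transduction pipeline.

def frame_to_motors(frame, rows=4, cols=5, max_duty=255):
    """frame: 2D list of pixel brightness in [0, 1]; returns 20 motor duties."""
    h, w = len(frame), len(frame[0])
    rh, cw = h // rows, w // cols
    duties = []
    for r in range(rows):
        for c in range(cols):
            block = [frame[y][x]
                     for y in range(r * rh, (r + 1) * rh)
                     for x in range(c * cw, (c + 1) * cw)]
            duties.append(round(max_duty * sum(block) / len(block)))
    return duties

# A bright "ball" in the top-left corner of an 8x10 frame:
frame = [[1.0 if y < 2 and x < 2 else 0.0 for x in range(10)] for y in range(8)]
duties = frame_to_motors(frame)
print(duties[0], duties[1])  # only the top-left motor is fully driven
```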

    Head-mounted Sensory Augmentation Device: Comparing Haptic and Audio Modality

    This paper investigates and compares the effectiveness of the haptic and audio modalities for navigation in low visibility environments using a sensory augmentation device. A second generation head-mounted vibrotactile interface was developed as a sensory augmentation prototype to help users navigate in such environments. In our experiment, a subject navigates along a wall relying on haptic or audio feedback as navigation commands. Haptic/audio feedback is presented to the subjects according to wall distances measured by a set of 12 ultrasound sensors placed around a helmet and classified by a multilayer perceptron neural network. Results showed that the haptic modality leads to significantly lower route deviation in navigation compared to auditory feedback. Furthermore, the NASA TLX questionnaire showed that subjects reported lower cognitive workload with the haptic modality, although both modalities were able to guide users along the wall.
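The classification step described above, ultrasound ranges in, navigation command out, can be sketched as a small multilayer perceptron forward pass. The weights below are random placeholders; an actual system would use weights trained on labelled wall-following data, and the command set is an assumption:

```python
import math, random

# Sketch of the MLP classification stage: 12 ultrasound range readings pass
# through one hidden layer, and the highest-scoring output is taken as the
# navigation command. Weights are random placeholders, not trained values.

random.seed(0)
COMMANDS = ["forward", "turn_left", "turn_right", "stop"]  # assumed command set
N_IN, N_HID = 12, 8

W1 = [[random.uniform(-1, 1) for _ in range(N_IN)] for _ in range(N_HID)]
W2 = [[random.uniform(-1, 1) for _ in range(N_HID)] for _ in range(len(COMMANDS))]

def mlp_command(ranges_m):
    """Forward pass: 12 sensor ranges (metres) -> highest-activation command."""
    hidden = [math.tanh(sum(w * x for w, x in zip(row, ranges_m))) for row in W1]
    scores = [sum(w * h for w, h in zip(row, hidden)) for row in W2]
    return COMMANDS[scores.index(max(scores))]

readings = [1.2, 1.1, 0.9, 0.8, 0.9, 1.0, 1.5, 2.0, 2.2, 2.1, 1.8, 1.4]
print(mlp_command(readings))  # one of the four commands
```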

    Sensory augmentation with distal touch: The tactile helmet project

    The Tactile Helmet is designed to augment a wearer's senses with a long range sense of touch. Tactile specialist animals such as rats and mice are capable of rapidly acquiring detailed information about their environment from their whiskers by using task-sensitive strategies. Providing similar information about the nearby environment, in tactile form, to a human operator could prove invaluable for search and rescue operations, or for partially-sighted people. Two key aspects of the Tactile Helmet are sensory augmentation, and active sensing. A haptic display is used to provide the user with ultrasonic range information. This can be interpreted in addition to, rather than instead of, visual or auditory information. Active sensing systems "are purposive and information-seeking sensory systems, involving task specific control of the sensory apparatus" [1]. The integration of an accelerometer allows the device to actively gate the delivery of sensory information to the user, depending on their movement. Here we describe the hardware, sensory transduction and characterisation of the Tactile Helmet device, before outlining potential use cases and benefits of the system. © 2013 Springer-Verlag Berlin Heidelberg

    Head-mounted Sensory Augmentation Device: Designing a Tactile Language

    Sensory augmentation operates by synthesizing new information and displaying it through an existing sensory channel; it can be used to help people with impaired sensing or to assist in tasks where sensory information is limited or sparse, for example, when navigating in a low visibility environment. This paper presents the design of a 2nd generation head-mounted vibrotactile interface, a sensory augmentation prototype designed to present navigation commands that are intuitive, informative and minimize information overload. We describe an experiment in a structured environment in which the user navigates along a virtual wall whilst the position and orientation of the user’s head are tracked in real time by a motion capture system. Navigation commands in the form of vibrotactile feedback are presented according to the user’s distance from the virtual wall and their head orientation. We test the four possible combinations of two command presentation modes (continuous, discrete) and two command types (recurring, single). We evaluated the effectiveness of this ‘tactile language’ according to the users’ walking speed and the smoothness of their trajectory parallel to the virtual wall. Results showed that recurring continuous commands allowed users to navigate with the lowest route deviation and highest walking speed. In addition, subjects preferred recurring continuous commands over the other commands.
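The two presentation modes can be sketched as two ways of scheduling a sweep across the tactors: continuous overlaps neighbouring activations (supporting apparent motion), discrete separates them with a gap. Motor count matches the 1-by-7 display; the durations are assumptions, not the study's measured parameters:

```python
# Illustrative sketch of the two presentation modes on a 7-tactor display:
# "continuous" overlaps successive motor activations (apparent motion),
# "discrete" fires motors one after another with a gap between them.
# Durations are assumed values, not the experiment's parameters.

def sweep(mode, motors=7, duration_ms=120, gap_ms=60):
    """Return (motor_index, onset_ms, duration_ms) events for one left-to-right sweep."""
    events, t = [], 0
    for m in range(motors):
        events.append((m, t, duration_ms))
        if mode == "continuous":
            t += duration_ms // 2          # 50% overlap between neighbours
        else:                              # discrete: motor finishes, then a gap
            t += duration_ms + gap_ms
    return events

print(sweep("continuous")[:2])  # [(0, 0, 120), (1, 60, 120)]
print(sweep("discrete")[:2])    # [(0, 0, 120), (1, 180, 120)]
```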

    Head-mounted Sensory Augmentation System for Navigation in Low Visibility Environments

    Sensory augmentation can be used to assist in tasks where sensory information is limited or sparse. This thesis focuses on the design and investigation of a head-mounted vibrotactile sensory augmentation interface to assist navigation in low visibility environments, for example for firefighters or as a travel aid for visually impaired people. A novel head-mounted vibrotactile interface comprising a 1-by-7 vibrotactile display worn on the forehead is developed. A series of psychophysical studies is carried out with this display to (1) determine the vibrotactile absolute threshold, (2) investigate the accuracy of vibrotactile localization, and (3) evaluate the funneling illusion and apparent motion as sensory phenomena that could be used to communicate navigation signals. The results of these studies provide guidelines for the design of head-mounted interfaces. A 2nd generation head-mounted sensory augmentation interface called the Mark-II Tactile Helmet is developed for the application of firefighters’ navigation. It consists of a ring of ultrasound sensors mounted on the outside of a helmet, a microcontroller, two batteries and a refined vibrotactile display composed of seven vibration motors, based on the results of the aforementioned psychophysical studies. A ‘tactile language’, that is, a set of distinguishable vibrotactile patterns, is developed for communicating navigation commands through the Mark-II Tactile Helmet. Four possible combinations of two command presentation modes (continuous, discrete) and two command types (recurring, single) are evaluated for their effectiveness in guiding users along a virtual wall in a structured environment. Continuous and discrete presentation modes use spatiotemporal patterns that induce the experience of apparent movement and discrete movement on the forehead, respectively.
The recurring command type presents the tactile command repeatedly, with an interval between patterns of 500 ms, while the single command type presents the tactile command just once, when there is a change in the command. The effectiveness of this tactile language is evaluated according to the objective measures of the users’ walking speed and the smoothness of their trajectory parallel to the virtual wall, and subjective measures of utility and comfort employing Likert-type rating scales. The Recurring Continuous (RC) commands, which exploit the phenomenon of apparent motion, are most effective in generating efficient routes and fast travel, and are most preferred. Finally, the optimal tactile language (RC) is compared with audio guidance using verbal instructions to investigate its effectiveness in delivering navigation commands. The results show that haptic guidance leads to better performance as well as lower cognitive workload compared to auditory feedback. This research demonstrates that a head-mounted sensory augmentation interface can enhance spatial awareness in low visibility environments and could help firefighters’ navigation by providing them with supplementary sensory information.
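The recurring/single distinction amounts to two playback policies over a stream of command updates: recurring replays the unchanged pattern every 500 ms, single plays only on a change. The 500 ms interval is from the thesis; the command stream below is an illustrative assumption:

```python
# Sketch of the two command types: "recurring" replays the current tactile
# pattern every 500 ms, "single" plays it only when the command changes.
# The 500 ms interval matches the thesis; the stream is an assumed input.

RECUR_INTERVAL_MS = 500

def schedule(command_stream, mode):
    """command_stream: (time_ms, command) samples; returns pattern playback times."""
    playbacks, last_cmd, next_due = [], None, 0
    for t, cmd in command_stream:
        if cmd != last_cmd:                      # command changed: play at once
            playbacks.append((t, cmd))
            last_cmd, next_due = cmd, t + RECUR_INTERVAL_MS
        elif mode == "recurring" and t >= next_due:
            playbacks.append((t, cmd))           # repeat the unchanged command
            next_due = t + RECUR_INTERVAL_MS
    return playbacks

stream = [(0, "forward"), (250, "forward"), (500, "forward"), (750, "turn_left")]
print(schedule(stream, "single"))     # [(0, 'forward'), (750, 'turn_left')]
print(schedule(stream, "recurring"))  # [(0, 'forward'), (500, 'forward'), (750, 'turn_left')]
```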

    Neuromorphic vibrotactile stimulation of fingertips for encoding object stiffness in telepresence sensory substitution and augmentation applications

    We present a tactile telepresence system for real-time transmission of information about object stiffness to the human fingertips. Experimental tests were performed across two laboratories (Italy and Ireland). In the Italian laboratory, a mechatronic sensing platform indented different rubber samples. Information about rubber stiffness was converted into on-off events using a neuronal spiking model and sent to a vibrotactile glove in the Irish laboratory. Participants discriminated variations in the stiffness of the stimuli according to a two-alternative forced choice protocol. Stiffness discrimination was based on the variation of the temporal pattern of spikes generated during the indentation of the rubber samples. The results suggest that vibrotactile stimulation can effectively simulate surface stiffness when using neuronal spiking models to trigger vibrations in the haptic interface. Specifically, fractional variations of stiffness down to 0.67 were significantly discriminated with the developed neuromorphic haptic interface. This performance is comparable to, though slightly worse than, the threshold obtained in a benchmark experiment evaluating the same set of stimuli naturally with the participants’ own hand. Our paper presents a bioinspired method for delivering sensory feedback about object properties to human skin based on contingency-mimetic neuronal models, which can be useful for the design of high performance haptic devices.

    Haptic wearables as sensory replacement, sensory augmentation and trainer - a review

    Sensory impairments decrease quality of life and can slow or hinder rehabilitation. Small, computationally powerful electronics have enabled the recent development of wearable systems aimed at improving function for individuals with sensory impairments. The purpose of this review is to synthesize current haptic wearable research for clinical applications involving sensory impairments. We define haptic wearables as untethered, ungrounded, body-worn devices that interact with skin directly or through clothing and can be used in natural environments outside a laboratory. Results of this review are categorized by degree of sensory impairment. Total impairment, such as in an amputee, blind, or deaf individual, involves haptics acting as sensory replacement; partial impairment, as is common in rehabilitation, involves haptics as sensory augmentation; and no impairment involves haptics as trainer. This review found that wearable haptic devices improved function for a variety of clinical applications, including rehabilitation, prosthetics, vestibular loss, osteoarthritis, vision loss and hearing loss. Future haptic wearable development should focus on clinical needs, intuitive and multimodal haptic displays, low energy demands, and biomechanical compliance for long-term usage.

    Real virtuality: emerging technology for virtually recreating reality
