
    Head-mounted Sensory Augmentation Device: Comparing Haptic and Audio Modality

    This paper investigates and compares the effectiveness of the haptic and audio modalities for navigation in low-visibility environments using a sensory augmentation device. A second-generation head-mounted vibrotactile interface was developed as a sensory augmentation prototype to help users navigate in such environments. In our experiment, a subject navigates along a wall relying on haptic or audio feedback as navigation commands. Haptic or audio feedback is presented to the subjects based on wall measurements from a set of 12 ultrasound sensors placed around a helmet, classified by a multilayer perceptron neural network. Results showed that the haptic modality leads to significantly lower route deviation in navigation compared to auditory feedback. Furthermore, the NASA TLX questionnaire showed that subjects reported lower cognitive workload with the haptic modality, although both modalities were able to guide users along the wall.
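    As a rough illustration of the sensing-to-command pipeline this abstract describes, the hypothetical Python sketch below trains a small multilayer perceptron to map one frame of 12 ultrasound range readings to a navigation command. The command set, network size, and placeholder training data are assumptions made for illustration, not the authors' implementation.

        # Sketch: classify 12 ultrasound range readings into navigation commands
        # with a multilayer perceptron (scikit-learn). Labels and data are assumed.
        import numpy as np
        from sklearn.neural_network import MLPClassifier

        COMMANDS = ["go_straight", "turn_left", "turn_right", "stop"]  # assumed set

        # X: one row per time step, 12 range readings (metres) from the helmet.
        # y: the command label assigned to that frame (placeholder random data).
        rng = np.random.default_rng(0)
        X_train = rng.uniform(0.2, 3.0, size=(500, 12))
        y_train = rng.integers(0, len(COMMANDS), size=500)

        clf = MLPClassifier(hidden_layer_sizes=(24,), max_iter=1000, random_state=0)
        clf.fit(X_train, y_train)

        reading = rng.uniform(0.2, 3.0, size=(1, 12))  # one new sensor frame
        print(COMMANDS[clf.predict(reading)[0]])       # command for the haptic/audio display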

    Head-mounted Sensory Augmentation Device: Designing a Tactile Language

    Sensory augmentation operates by synthesizing new information and displaying it through an existing sensory channel. It can be used to help people with impaired sensing or to assist in tasks where sensory information is limited or sparse, for example, when navigating in a low-visibility environment. This paper presents the design of a second-generation head-mounted vibrotactile interface as a sensory augmentation prototype designed to present navigation commands that are intuitive and informative and that minimize information overload. We describe an experiment in a structured environment in which the user navigates along a virtual wall while the position and orientation of the user’s head are tracked in real time by a motion capture system. Navigation commands in the form of vibrotactile feedback are presented according to the user’s distance from the virtual wall and their head orientation. We test the four possible combinations of two command presentation modes (continuous, discrete) and two command types (recurring, single), and evaluate the effectiveness of this ‘tactile language’ according to the users’ walking speed and the smoothness of their trajectory parallel to the virtual wall. Results showed that recurring continuous commands allowed users to navigate with the lowest route deviation and highest walking speed. In addition, subjects preferred recurring continuous commands over the other commands.
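    To make the four combinations concrete, the sketch below renders a normalized wall-deviation error into vibration pulses under each presentation mode and command type. The pulse durations, repetition rate, and intensity scaling are assumed values for illustration, not the parameters studied in the paper.

        # Sketch: the four 'tactile language' variants, i.e. presentation mode
        # (continuous, discrete) x command type (recurring, single).
        from dataclasses import dataclass

        @dataclass
        class Pulse:
            start: float      # seconds from command onset
            duration: float   # seconds
            intensity: float  # vibration amplitude, 0..1

        def render_command(error, mode, ctype, horizon=3.0):
            """Turn a normalized wall-deviation error (0..1) into vibration pulses."""
            level = min(max(error, 0.0), 1.0)
            if mode == "continuous":
                pulse = Pulse(0.0, 0.5, level)  # intensity tracks deviation size
            else:  # "discrete": fixed-strength, on/off pulse
                pulse = Pulse(0.0, 0.2, 1.0)
            if ctype == "single":
                return [pulse]
            # "recurring": repeat the pulse once per second until the horizon
            return [Pulse(float(t), pulse.duration, pulse.intensity)
                    for t in range(int(horizon))]

        # Recurring continuous commands performed best in the study:
        for p in render_command(error=0.6, mode="continuous", ctype="recurring"):
            print(p)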

    Towards a wearable interface for immersive telepresence in robotics

    In this paper we present an architecture for the study of telepresence, immersion, and human-robot interaction. The architecture is built around a wearable interface that provides the human user with visual, audio, and tactile feedback from a remote location. We have chosen to interface the system with the iCub humanoid robot, as it mimics many human sensory modalities, including vision (with gaze control) and tactile feedback, which offers a richly immersive experience for the human user. Our wearable interface allows human participants to observe and explore a remote location while also communicating verbally with others in the remote environment. Our approach has been tested over a variety of distances and locations, including university and business premises, and over wired, wireless, and Internet-based connections, using data compression to maintain the quality of the experience for the user. Initial testing has shown the wearable interface to be a robust system for immersive teleoperation, with a myriad of potential applications, particularly in social networking, gaming, and entertainment.
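    As a minimal sketch of one design choice mentioned above, compressing the feedback stream, the snippet below shows a length-prefixed, zlib-compressed frame protocol over a plain TCP socket. This framing scheme is an assumption for illustration; it is not the actual iCub/wearable-interface protocol.

        # Sketch: stream compressed sensor/video frames over a TCP socket.
        import socket
        import struct
        import zlib

        def send_frame(sock, payload):
            """Compress one frame and prefix it with its 4-byte length."""
            packed = zlib.compress(payload, 6)
            sock.sendall(struct.pack("!I", len(packed)) + packed)

        def _recv_exact(sock, n):
            """Read exactly n bytes from the socket."""
            buf = b""
            while len(buf) < n:
                chunk = sock.recv(n - len(buf))
                if not chunk:
                    raise ConnectionError("socket closed mid-frame")
                buf += chunk
            return buf

        def recv_frame(sock):
            """Read one length-prefixed frame and decompress it."""
            (length,) = struct.unpack("!I", _recv_exact(sock, 4))
            return zlib.decompress(_recv_exact(sock, length))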

    Tactile Language for a Head-Mounted Sensory Augmentation Device

    Sensory augmentation is one of the most exciting domains for research in human-machine biohybridicity. The current paper presents the design of a second-generation vibrotactile helmet as a sensory augmentation prototype that is being developed to help users navigate in low-visibility environments. The paper outlines a study in which the user navigates along a virtual wall while the position and orientation of the user’s head are tracked by a motion capture system. Vibrotactile feedback is presented according to the user’s distance from the virtual wall and their head orientation. The research builds on our previous work by developing a simplified “tactile language” for communicating navigation commands. A key goal is to identify language tokens suited to a head-mounted tactile interface that are maximally informative and intuitive, minimize information overload, and have the potential to become ‘experientially transparent’.
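    As a hypothetical sketch of how such tokens might be selected at run time, the snippet below picks one navigation token from the user's distance to the virtual wall and their head yaw; each token would then map to a distinct vibrotactile pattern on the helmet. The thresholds and token names are illustrative assumptions, not the language evaluated in the paper.

        # Sketch: choose a tactile-language token from wall distance and head
        # yaw. Thresholds and token names are assumed for illustration.
        TARGET_DISTANCE = 1.0  # metres from the virtual wall (assumed setpoint)
        TOLERANCE = 0.15       # dead zone in which no correction is sent

        def choose_token(wall_distance, head_yaw_deg):
            """Pick one navigation token; heading corrections take priority."""
            if abs(head_yaw_deg) > 20.0:
                return "turn_left" if head_yaw_deg > 0 else "turn_right"
            if wall_distance < TARGET_DISTANCE - TOLERANCE:
                return "move_away"
            if wall_distance > TARGET_DISTANCE + TOLERANCE:
                return "move_closer"
            return "go_straight"

        print(choose_token(wall_distance=0.7, head_yaw_deg=5.0))  # -> "move_away"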