
    Tactile to vibrotactile sensory feedback interface for prosthetic hand users

    The motivation of this research work is to provide a sense of embodiment to prosthetic users by supplementing their devices with sensory feedback to the residual upper arm. This sensory feedback replicates the tactile sensory system of the glabrous skin that covers the palm and the flexor surfaces of the fingers. In this work, we produced vibration patterns to be perceived at the upper arm, driven by signals obtained from a prosthetic finger sliding across fabricated textured surfaces. This was done by transforming the signals into 'on' and 'off' pulses in the LabVIEW environment and forwarding them to a data acquisition board, which supplied voltage signals to a vibration actuator. We implemented a novel frequency measurement procedure to maintain a vibration frequency of 250 Hz, the optimum frequency at which the mechanoreceptors beneath the skin of the upper arm detect vibration. The outcome of this work points to the optimistic possibility that a touch sensation that was previously lost could be restored to different parts of the body. This should increase users' acceptance of the device as a part of their body due to its 'lifelike' quality.
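A minimal sketch of the pulse-generation idea this abstract describes: threshold the finger's texture signal into 'on'/'off' pulses, then use that envelope to gate a 250 Hz carrier driving the actuator. The original work used LabVIEW and a data acquisition board; the NumPy implementation, threshold and sample rate below are illustrative assumptions.

```python
import numpy as np

def signal_to_pulses(sensor, threshold, carrier_hz=250.0, sample_rate_hz=1000.0):
    """Convert a texture signal into a 250 Hz pulse-gated drive waveform.

    1. Threshold the rectified sensor signal into an on/off envelope.
    2. Generate a 0/1 square-wave carrier at 250 Hz, the frequency the
       abstract identifies as optimal for the skin's mechanoreceptors.
    3. Gate the carrier with the envelope to get the actuator drive signal.
    """
    envelope = (np.abs(sensor) > threshold).astype(float)            # 'on'/'off' pulses
    t = np.arange(len(sensor)) / sample_rate_hz
    carrier = (np.sign(np.sin(2 * np.pi * carrier_hz * t)) + 1) / 2  # 0/1 square wave
    return envelope * carrier                                        # drive voltage pattern
```

In a real system the returned samples would be written to the DAQ output channel driving the vibration motor.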

    Designing Tactile Interfaces for Abstract Interpersonal Communication, Pedestrian Navigation and Motorcyclists Navigation

    The tactile medium of communication is appropriate for displaying information in situations where the auditory and visual channels are saturated. There are situations where a subject's ability to receive information through either of those channels is severely restricted by the environment or by physical impairments. In this project, we have focused on two groups of users who need sustained visual and auditory focus in their tasks: Soldiers on the battlefield and motorcyclists. Soldiers on the battlefield use their visual and auditory capabilities to maintain awareness of their environment and guard themselves from enemy assault. One of the major challenges to coordination in a hazardous environment is maintaining communication between team members while mitigating cognitive load. A compromise in communication between team members may result in mistakes that adversely affect the outcome of a mission. We have built two vibrotactile displays, Tactor I and Tactor II, each with nine actuators arranged in a three-by-three matrix with differing contact areas, which can represent a total of 511 shapes. We used two dimensions of the tactile medium, shapes and waveforms, to represent verb phrases, and evaluated users' ability to perceive verb phrases from the tactile code. We evaluated the effectiveness of communicating verb phrases while the users performed two tasks simultaneously. The results showed that performing an additional visual task did not affect the accuracy or the time taken to perceive tactile codes. Another challenge in coordinating Soldiers on a battlefield is navigating them to their respective assembly areas. We have developed HaptiGo, a lightweight haptic vest that provides pedestrians with both navigational intelligence and obstacle detection capabilities.
HaptiGo consists of optimally-placed vibrotactile actuators that utilize natural, small-form-factor interaction cues, thus emulating the sensation of being passively guided towards the intended direction. We evaluated HaptiGo and found that it successfully navigated users, with timely alerts of incoming obstacles, without increasing cognitive load, thereby increasing their environmental awareness. Additionally, we show that users are able to respond to directional information without training. The needs of motorcyclists are different from those of Soldiers. Motorcyclists' need to maintain visual and auditory situational awareness at all times is crucial, since they are highly exposed on the road. Route guidance systems, such as the Garmin, have been well tested with automobile drivers, but remain much less safe for motorcyclists. Audio/visual routing systems decrease motorcyclists' situational awareness and vehicle control, and thus increase the chances of an accident. To enable motorcyclists to take advantage of route guidance while maintaining situational awareness, we created HaptiMoto, a wearable haptic route guidance system. HaptiMoto uses tactile signals to encode the distance and direction of approaching turns, thus avoiding interference with audio/visual awareness. Evaluations show that HaptiMoto is intuitive for motorcyclists and a safer alternative to existing solutions.

    Somatic ABC's: A Theoretical Framework for Designing, Developing and Evaluating the Building Blocks of Touch-Based Information Delivery

    Situations of sensory overload are steadily becoming more frequent as ubiquitous technology approaches reality, particularly with the advent of socio-communicative smartphone applications and pervasive, high-speed wireless networks. Although the ease of accessing information has improved our communication effectiveness and efficiency, our visual and auditory modalities, the modalities that today's computerized devices and displays largely engage, have become overloaded, creating possibilities for distraction, delay and high cognitive load, which in turn can lead to a loss of situational awareness and increase the chances of life-threatening situations such as texting while driving. Surprisingly, alternative modalities for information delivery have seen little exploration. Touch, in particular, is a promising candidate, given that the skin is our largest sensory organ, with impressive spatial and temporal acuity. Although some approaches have been proposed for touch-based information delivery, they are not without limitations, including high learning curves, limited applicability and/or limited expressiveness. This is largely due to the lack of a versatile, comprehensive design theory, specifically, a theory that addresses the design of touch-based building blocks for expandable, efficient, rich and robust touch languages that are easy to learn and use. Moreover, beyond design, there is a lack of implementation and evaluation theories for such languages. To overcome these limitations, a unified theoretical framework, inspired by natural, spoken language, is proposed: Somatic ABC's, for Articulating (designing), Building (developing) and Confirming (evaluating) touch-based languages. To evaluate the usefulness of Somatic ABC's, its design, implementation and evaluation theories were applied to create communication languages for two very different application areas: audio-described movies and motor learning.
These applications were chosen because they presented opportunities for complementing communication by offloading information, typically conveyed visually and/or aurally, to the skin. For both studies, it was found that Somatic ABC's aided the design, development and evaluation of rich somatic languages with distinct and natural communication units.

    Crossmodal audio and tactile interaction with mobile touchscreens

    Touchscreen mobile devices often use cut-down versions of desktop user interfaces, placing high demands on the visual sense, which may prove awkward in mobile settings. The research in this thesis addresses the problems encountered by situationally impaired mobile users by using crossmodal interaction to exploit the abundant similarities between the audio and tactile modalities. By making information available to both senses, users can receive the information in the most suitable way, without having to abandon their primary task to look at the device. This thesis begins with a literature review of related work, followed by a definition of crossmodal icons. Two icons may be considered to be crossmodal if and only if they provide a common representation of data which is accessible interchangeably via different modalities. Two experiments investigated possible parameters for use in crossmodal icons, with results showing that rhythm, texture and spatial location are effective. A third experiment focused on learning multi-dimensional crossmodal icons and the extent to which this learning transfers between modalities. The results showed identification rates of 92% for three-dimensional audio crossmodal icons when trained in the tactile equivalents, and identification rates of 89% for tactile crossmodal icons when trained in the audio equivalents. Crossmodal icons were then incorporated into a mobile touchscreen QWERTY keyboard. Experiments showed that keyboards with audio or tactile feedback produce fewer errors and greater speeds of text entry compared to standard touchscreen keyboards. The next study examined how environmental variables affect user performance with the same keyboard. The data showed that each modality performs differently with varying levels of background noise or vibration, and the exact levels at which these performance decreases occur were established.
The final study involved a longitudinal evaluation of a touchscreen application, CrossTrainer, focusing on longitudinal effects on performance with audio and tactile feedback, the impact of context on performance, and personal modality preference. The results show that crossmodal audio and tactile icons are a valid method of presenting information to situationally impaired mobile touchscreen users, with recognition rates of 100% over time. This thesis concludes with a set of guidelines on the design and application of crossmodal audio and tactile feedback to enable application and interface designers to employ such feedback in all systems.
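The defining property of a crossmodal icon, per the definition above, is one shared data representation rendered interchangeably through different modalities. A small sketch of that idea, using the three parameters the thesis found effective (rhythm, texture, spatial location); the renderer mappings and value sets here are illustrative assumptions, not the thesis's implementation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CrossmodalIcon:
    """One abstract icon shared by both renderers, so the same data
    reaches the user via sound or vibration interchangeably."""
    rhythm: tuple   # pulse/note durations in ms (assumed unit)
    texture: str    # e.g. "smooth" / "rough"
    location: str   # e.g. "left" / "centre" / "right"

def render_audio(icon):
    # Hypothetical mapping: rhythm -> tone onsets, texture -> timbre,
    # spatial location -> stereo pan.
    return {"onsets_ms": icon.rhythm, "timbre": icon.texture, "pan": icon.location}

def render_tactile(icon):
    # Same icon, mapped to vibration pulses, waveform roughness and
    # actuator position, preserving the common representation.
    return {"pulses_ms": icon.rhythm, "waveform": icon.texture, "actuator": icon.location}
```

Because both renderers read the same fields, training in one modality can transfer to the other, which is what the 92%/89% cross-training results above measure.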

    Using wrist vibrations to guide hand movement and whole body navigation

    In the absence of vision, mobility and orientation are challenging. Audio and tactile feedback can be used to guide visually impaired people. In this paper, we present two complementary studies on the use of vibrational cues for hand guidance during the exploration of itineraries on a map, and for whole-body guidance in a virtual environment. Concretely, we designed wearable Arduino bracelets integrating a vibratory motor that produces multiple patterns of pulses. In the first study, the bracelet was used to guide the hand along unknown routes on an interactive tactile map. A wizard-of-Oz study with six blindfolded participants showed that tactons (vibrational patterns) may be more efficient than audio cues for indicating directions. In the second study, the bracelet was used by blindfolded participants to navigate in a virtual environment. The results presented here show that it is possible to significantly decrease travel distance with vibrational cues. To sum up, these preliminary but complementary studies suggest the value of vibrational feedback in assistive technology for the mobility and orientation of blind people.
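A sketch of how direction-indicating tactons of the kind described above might be defined and played on a single-motor bracelet. The pattern set, the `set_motor` hardware hook and the millisecond timings are assumptions for illustration; the paper does not specify its actual patterns.

```python
import time

# Hypothetical tacton set: each pattern is a list of (on_ms, off_ms)
# pulse pairs driving one vibration motor.
TACTONS = {
    "left":     [(100, 100)] * 2,        # two short pulses
    "right":    [(400, 0)],              # one long pulse
    "straight": [(100, 100), (400, 0)],  # short pulse, then long
}

def play_tacton(name, set_motor, sleep=time.sleep):
    """Play a named tacton through set_motor(bool), which would toggle
    the bracelet's motor pin on real hardware (binding not shown)."""
    for on_ms, off_ms in TACTONS[name]:
        set_motor(True)
        sleep(on_ms / 1000)
        set_motor(False)
        sleep(off_ms / 1000)
```

Injecting `sleep` makes the timing testable without real delays, and `set_motor` would be bound to a GPIO or serial call to the Arduino in practice.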

    State of the art review on walking support system for visually impaired people

    Technology for terrain detection and walking support for blind people has improved rapidly over the last couple of decades, although efforts to assist visually impaired people started long ago. Currently, a variety of portable or wearable navigation systems are available on the market to help blind people navigate in local or remote areas. These systems can be grouped as electronic travel aids (ETAs), electronic orientation aids (EOAs) and position locator devices (PLDs); here, we focus mainly on electronic travel aids (ETAs). This paper presents a comparative survey of the various portable or wearable walking support systems, with an informative description of each system's working principle, advantages and disadvantages, so that researchers can easily grasp the current state of assistive technology for the blind, along with the requirements for optimising the design of walking support systems for their users.

    Kinesthetic Cues that Lead the Way


    Vibrotactile Warnings Design for Improving Risks Awareness in Construction Environment

    Construction workers have difficulty identifying potential risks in harsh environments because traditional visual and acoustic alerts are inefficient. This study investigated a new communication method using a wearable tactile-based system to improve workers' hazard perception. Three experiments are reported in relation to this system. The first experiment used VR as an experimental tool to compare auditory and vibrotactile warning signals, as well as their combination, in a simulated construction working environment. Findings demonstrated that the vibrotactile cues induced faster response times and higher affective ratings than auditory alarms, and that their combination provided the shortest reaction time. The second experiment compared seven different vibrotactile patterns varying in intensity, duration and interval, to identify configurations that led to a higher degree of awareness. The third experiment validated the effectiveness of three selected tactons for delivering information on three hazard levels, finding that subjects could identify the three-parameter signals with relatively low error. Our findings provide guidelines for designing tactile warning signals, which could help improve hazard recognition and risk perception, especially on construction sites.
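A sketch of the three-parameter encoding this abstract describes: each hazard level maps to a tacton defined by intensity, pulse duration and inter-pulse interval. The study varied exactly these three parameters, but it does not report the chosen values, so the numbers below are placeholders (higher urgency = stronger, longer, more rapid pulses, a common convention in warning design):

```python
# Placeholder three-parameter tactons for the three hazard levels.
HAZARD_TACTONS = {
    "low":    {"intensity": 0.4, "duration_ms": 100, "interval_ms": 400},
    "medium": {"intensity": 0.7, "duration_ms": 200, "interval_ms": 200},
    "high":   {"intensity": 1.0, "duration_ms": 400, "interval_ms": 100},
}

def tacton_for(level):
    """Look up the vibration parameters for a detected hazard level."""
    return HAZARD_TACTONS[level]
```

A site-monitoring system would call `tacton_for` with the classified hazard level and forward the parameters to the wearable's motor driver.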