
    Obstacle detection display for visually impaired: Coding of direction, distance, and height on a vibrotactile waist band

    Electronic travel aids (ETAs) can potentially increase the safety and comfort of blind users by detecting and displaying obstacles outside the range of the white cane. In a series of experiments, we aim to balance the amount of information displayed against the comprehensibility of that information, taking into account the risk of information overload. In Experiment 1, we investigate perception of compound signals displayed on a tactile vest while walking. The results confirm that the threat of information overload is clear and present: tactile coding parameters that are sufficiently discriminable in isolation may not be so in compound signals and while walking and using the white cane. Horizontal tactor location is a strong coding parameter, and temporal pattern is the preferred secondary coding parameter. Vertical location is also possible as a coding parameter, but it requires additional tactors, which makes the display hardware more complex, more expensive, and less user friendly. In Experiment 2, we investigate how we can off-load the tactile modality by delegating part of the information to an auditory display. Off-loading the tactile modality through auditory presentation is possible, but it is limited and may result in a new threat of auditory overload. In addition, taxing the auditory channel may in turn interfere with other auditory cues from the environment. In Experiment 3, we off-load the tactile sense by reducing the amount of displayed information using several filter rules. The resulting design was evaluated in Experiment 4 with visually impaired users. Although they acknowledge the potential of the display, the added value of the ETA as a whole also depends on its sensor and object recognition capabilities. We recommend using no more than two coding parameters in a tactile compound message and applying filter rules to reduce the number of obstacles to be displayed in an obstacle avoidance ETA.
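    The abstract does not spell out the filter rules themselves. A minimal sketch of the kind of rule set an obstacle-avoidance ETA might apply; the range limit, corridor width, and obstacle cap below are hypothetical, not values from the paper:

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float   # range from the user
    bearing_deg: float  # 0 = straight ahead, negative = left
    height: str         # "ground", "waist", or "head"

# Hypothetical filter rules; the paper's actual rules may differ.
def filter_obstacles(obstacles, max_range_m=4.0, corridor_deg=45.0, max_shown=2):
    """Reduce the detected obstacles to the few most relevant ones."""
    relevant = [
        o for o in obstacles
        if o.distance_m <= max_range_m           # rule 1: ignore far obstacles
        and abs(o.bearing_deg) <= corridor_deg   # rule 2: ignore obstacles outside the walking corridor
    ]
    # rule 3: display at most `max_shown` obstacles, nearest first
    return sorted(relevant, key=lambda o: o.distance_m)[:max_shown]

obstacles = [
    Obstacle(1.5, -10.0, "waist"),
    Obstacle(6.0, 0.0, "ground"),   # filtered: too far
    Obstacle(2.0, 80.0, "head"),    # filtered: outside corridor
    Obstacle(3.0, 20.0, "ground"),
]
shown = filter_obstacles(obstacles)
```

    Capping the number of simultaneously displayed obstacles is one direct way to honor the paper's recommendation of limiting compound-message complexity.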

    Haptic wearables as sensory replacement, sensory augmentation and trainer - a review

    Sensory impairments decrease quality of life and can slow or hinder rehabilitation. Small, computationally powerful electronics have enabled the recent development of wearable systems aimed at improving function for individuals with sensory impairments. The purpose of this review is to synthesize current haptic wearable research for clinical applications involving sensory impairments. We define haptic wearables as untethered, ungrounded body-worn devices that interact with the skin directly or through clothing and can be used in natural environments outside a laboratory. Results of this review are categorized by degree of sensory impairment. Total impairment, such as in an amputee, blind, or deaf individual, involves haptics acting as sensory replacement; partial impairment, as is common in rehabilitation, involves haptics as sensory augmentation; and no impairment involves haptics as trainer. This review found that wearable haptic devices improved function for a variety of clinical applications including rehabilitation, prosthetics, vestibular loss, osteoarthritis, vision loss, and hearing loss. Future haptic wearables development should focus on clinical needs, intuitive and multimodal haptic displays, low energy demands, and biomechanical compliance for long-term usage.

    HapticHead - Augmenting Reality via Tactile Cues

    Information overload is increasingly becoming a challenge in today's world. Humans have only a limited amount of attention to allocate between sensory channels and tend to miss or misjudge critical sensory information when multiple activities are going on at the same time. For example, people may miss the sound of an approaching car when walking across the street while looking at their smartphones. Some sensory channels may also be impaired due to congenital or acquired conditions. Among sensory channels, touch is often experienced as obtrusive, especially when it occurs unexpectedly. Since tactile actuators can simulate touch, targeted tactile stimuli can provide users of virtual reality and augmented reality environments with important information for navigation, guidance, alerts, and notifications. In this dissertation, a tactile user interface around the head, called HapticHead, is presented to relieve or replace a potentially impaired visual channel. It is a high-resolution, omnidirectional, vibrotactile display that presents general, 3D directional, and distance information through dynamic tactile patterns. The head is well suited for tactile feedback because it is sensitive to mechanical stimuli and provides a large spherical surface area that enables the display of precise 3D information and allows the user to intuitively rotate the head in the direction of a stimulus based on natural mapping. Basic research on tactile perception on the head and studies on various use cases of head-based tactile feedback are presented in this thesis.
    Several investigations and user studies have been conducted on (a) the funneling illusion and localization accuracy of tactile stimuli around the head, (b) the ability of people to discriminate between different tactile patterns on the head, (c) approaches to designing tactile patterns for complex arrays of actuators, (d) increasing the immersion and presence level of virtual reality applications, and (e) assisting people with visual impairments in guidance and micro-navigation. In summary, tactile feedback around the head was found to be highly valuable as an additional information channel in various application scenarios. Most notable is the navigation of visually impaired individuals through a micro-navigation obstacle course, which is an order of magnitude more accurate than the previous state of the art, which used a tactile belt as the feedback modality. The HapticHead tactile user interface's ability to safely navigate people with visual impairments around obstacles and on stairs, with a mean deviation from the optimal path of less than 6 cm, may ultimately improve the quality of life for many people with visual impairments.
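    A common way to render 3D directional information on a distributed tactor array is to drive the actuator whose axis best matches the target direction, with vibration intensity coding distance. A sketch under assumed tactor placements; the actual HapticHead device uses a much denser spherical layout, and the linear intensity ramp is an illustrative assumption:

```python
import math

# Hypothetical tactor layout: unit vectors from the head center
# (x = front, y = left, z = up). HapticHead itself has many more actuators.
TACTORS = {
    "front": (1, 0, 0), "back": (-1, 0, 0),
    "left": (0, 1, 0),  "right": (0, -1, 0),
    "top": (0, 0, 1),
}

def nearest_tactor(direction):
    """Pick the tactor whose axis is most aligned with the target direction."""
    norm = math.sqrt(sum(c * c for c in direction))
    d = tuple(c / norm for c in direction)
    # cosine similarity between unit vectors reduces to a dot product
    return max(TACTORS, key=lambda name: sum(a * b for a, b in zip(TACTORS[name], d)))

def intensity_for_distance(distance_m, max_m=3.0):
    """Closer targets vibrate more strongly (hypothetical linear ramp)."""
    return max(0.0, min(1.0, 1.0 - distance_m / max_m))
```

    Natural mapping then lets the wearer turn the head toward the vibrating site to face the target.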

    Head-mounted Sensory Augmentation Device: Designing a Tactile Language

    Sensory augmentation operates by synthesizing new information and then displaying it through an existing sensory channel; it can be used to help people with impaired sensing or to assist in tasks where sensory information is limited or sparse, for example, when navigating in a low-visibility environment. This paper presents the design of a second-generation head-mounted vibrotactile interface, a sensory augmentation prototype designed to present navigation commands that are intuitive and informative while minimizing information overload. We describe an experiment in a structured environment in which the user navigates along a virtual wall whilst the position and orientation of the user's head is tracked in real time by a motion capture system. Navigation commands in the form of vibrotactile feedback are presented according to the user's distance from the virtual wall and their head orientation. We test the four possible combinations of two command presentation modes (continuous, discrete) and two command types (recurring, single). We evaluated the effectiveness of this 'tactile language' according to the users' walking speed and the smoothness of their trajectory parallel to the virtual wall. Results showed that recurring continuous commands allowed users to navigate with the lowest route deviation and highest walking speed. In addition, subjects preferred recurring continuous commands over the other commands.
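    The four conditions cross presentation mode (continuous vs. discrete) with command type (recurring vs. single). A sketch of how such a tactile language might be generated; the 1 m full scale, the quantization levels, and the 1 s repeat period are hypothetical values, not parameters from the paper:

```python
def command_value(distance_error_m, mode):
    """Map distance from the virtual wall to a vibration intensity in [0, 1]."""
    if mode == "continuous":
        # intensity scales smoothly with the error (hypothetical 1 m full scale)
        return max(0.0, min(1.0, abs(distance_error_m) / 1.0))
    if mode == "discrete":
        # quantize the error into three fixed levels
        e = abs(distance_error_m)
        return 0.0 if e < 0.1 else (0.5 if e < 0.5 else 1.0)
    raise ValueError(mode)

def should_send(t_s, last_sent_s, value, last_value, command_type, period_s=1.0):
    """Recurring commands repeat every period; single commands fire only on change."""
    if command_type == "recurring":
        return t_s - last_sent_s >= period_s
    return value != last_value  # "single"
```

    Under this framing, the winning condition (recurring continuous) corresponds to re-sending the smoothly scaled intensity at every period.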

    Somatic ABC's: A Theoretical Framework for Designing, Developing and Evaluating the Building Blocks of Touch-Based Information Delivery

    Situations of sensory overload are steadily becoming more frequent as the ubiquity of technology approaches reality--particularly with the advent of socio-communicative smartphone applications and pervasive, high-speed wireless networks. Although the ease of accessing information has improved our communication effectiveness and efficiency, our visual and auditory modalities--those modalities that today's computerized devices and displays largely engage--have become overloaded, creating possibilities for distractions, delays, and high cognitive load, which in turn can lead to a loss of situational awareness, increasing chances for life-threatening situations such as texting while driving. Surprisingly, alternative modalities for information delivery have seen little exploration. Touch, in particular, is a promising candidate given that it is our largest sensory organ with impressive spatial and temporal acuity. Although some approaches have been proposed for touch-based information delivery, they are not without limitations, including high learning curves, limited applicability, and/or limited expression. This is largely due to the lack of a versatile, comprehensive design theory--specifically, a theory that addresses the design of touch-based building blocks for expandable, efficient, rich, and robust touch languages that are easy to learn and use. Moreover, beyond design, there is a lack of implementation and evaluation theories for such languages. To overcome these limitations, a unified theoretical framework, inspired by natural spoken language, is proposed, called Somatic ABC's, for Articulating (designing), Building (developing) and Confirming (evaluating) touch-based languages. To evaluate the usefulness of Somatic ABC's, its design, implementation and evaluation theories were applied to create communication languages for two very distinct application areas: audio-described movies and motor learning.
    These applications were chosen because they presented opportunities to complement communication by offloading information, typically conveyed visually and/or aurally, to the skin. For both studies, it was found that Somatic ABC's aided the design, development and evaluation of rich somatic languages with distinct and natural communication units.

    Navigation behavior design and representations for a people aware mobile robot system

    There are millions of robots in operation around the world today, and almost all of them operate on factory floors in isolation from people. However, it is now becoming clear that robots can provide much more value assisting people in daily tasks in human environments. Perhaps the most fundamental capability for a mobile robot is navigating from one location to another. Advances in mapping and motion planning research in the past decades have made indoor navigation a commodity for mobile robots. Yet questions remain on how robots should move around humans. This thesis advocates the use of semantic maps and spatial rules of engagement to enable non-expert users to effortlessly interact with and control a mobile robot. A core concept explored in this thesis is the Tour Scenario, where the task is to familiarize a mobile robot with a new environment after it is first shipped and unpacked in a home or office setting. During the tour, the robot follows the user and creates a semantic representation of the environment. The user labels objects, landmarks, and locations by performing pointing gestures and using the robot's user interface. The spatial semantic information is meaningful to humans, as it allows providing commands to the robot such as "bring me a cup from the kitchen table". While the robot is navigating towards the goal, it should not treat nearby humans as obstacles and should move in a socially acceptable manner. Three main navigation behaviors are studied in this work. The first behavior is point-to-point navigation. The navigation planner presented in this thesis borrows ideas from human-human spatial interactions, and takes into account personal spaces as well as reactions of people who are in close proximity to the trajectory of the robot. The second navigation behavior is person following. After the description of a basic following behavior, a user study on person following for telepresence robots is presented.
    Additionally, situation awareness for person following is demonstrated, where the robot facilitates tasks by predicting the intent of the user and utilizing the semantic map. The third behavior is person guidance. A tour-guide robot is presented with a particular application for visually impaired users.
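    Planners that account for personal space typically add a people-aware cost term to the navigation costmap. A sketch of one proxemics-inspired formulation; the isotropic Gaussian model and its parameters below are illustrative assumptions, not the thesis's actual cost function (which may be anisotropic or velocity-dependent):

```python
import math

def personal_space_cost(x, y, people, sigma=0.8, peak=100.0):
    """Extra traversal cost at (x, y) from nearby people's personal spaces.

    Hypothetical model: an isotropic Gaussian centered on each person,
    so cost is highest at a person's position and decays with distance.
    """
    cost = 0.0
    for (px, py) in people:
        d2 = (x - px) ** 2 + (y - py) ** 2
        cost += peak * math.exp(-d2 / (2 * sigma ** 2))
    return cost
```

    Adding such a term to the static-obstacle costmap makes paths that brush past people more expensive than slightly longer detours, which is one way a planner avoids treating humans as mere obstacles.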

    Relative vibrotactile spatial acuity of the torso

    While tactile acuity for pressure has been extensively investigated, far less is known about acuity for vibrotactile stimulation. Vibrotactile acuity is important, however, as such stimulation is used in many applications, including sensory substitution devices. We tested discrimination of vibrotactile stimulation from eccentric rotating mass motors with in-plane vibration. In three experiments, we tested gradually decreasing center-to-center (c/c) distances, from 30 mm (experiment 1) to 13 mm (experiment 3). Observers judged whether a second vibrating stimulator ('tactor') was to the left of, to the right of, or in the same place as a first one that came on 250 ms before the onset of the second (with a 50-ms inter-stimulus interval). The results show that while accuracy tends to decrease the closer the tactors are, discrimination accuracy is still well above chance for the smallest distance, which places the threshold for vibrotactile stimulation well below 13 mm, lower than recent estimates. The results cast new light on vibrotactile sensitivity and can furthermore be of use in the design of devices that convey information through vibrotactile stimulation.
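    With three response options (left, right, or same place), chance performance is 1/3. A sketch of the kind of exact one-sided binomial check one might use to verify that discrimination at a given c/c distance is "well above chance"; the trial counts in the usage below are hypothetical, not the study's data:

```python
from math import comb

def binom_p_above_chance(k_correct, n_trials, p_chance=1/3):
    """One-sided exact binomial test: P(X >= k_correct) under chance responding.

    A small p-value means the observed accuracy is unlikely if the
    observer were merely guessing among left/right/same (chance = 1/3).
    """
    return sum(
        comb(n_trials, k) * p_chance**k * (1 - p_chance)**(n_trials - k)
        for k in range(k_correct, n_trials + 1)
    )

# Hypothetical example: 8 correct out of 10 trials at the smallest c/c distance
p = binom_p_above_chance(8, 10)
```

    Per-condition tests like this are a standard way to locate the spacing at which performance first drops to chance, i.e. the spatial acuity threshold.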
