
    Design of a vibrotactile stimulus paradigm for a biofeedback device to improve gait rehabilitation of lower limb amputees

    Integrated master's dissertation in Biomedical Engineering (specialisation in Biomaterials, Rehabilitation and Biomechanics). A lower limb amputation affects not only locomotion but also the amputee's somatosensory system, body perception, and mental health, and naturally the fear of falling becomes more pronounced. Consequently, the patient faces the challenge of developing motor strategies for carrying out daily activities, since the prosthesis does not fully compensate for the deficits acquired with a prosthetic gait, such as asymmetry and variation in the duration of gait events. Given the absence of effective treatments that restore locomotor function, the BioWalk project proposes a rehabilitation solution: a biofeedback system that assists amputees during gait training sessions. The system applies a vibrotactile stimulus to the skin of the affected leg. This stimulus can be activated at different moments of the prosthetic gait, giving the patient better perception and awareness of their body and locomotion, so that abnormal motor behaviours can be detected during rehabilitation sessions and, in the future, an adequate and healthy gait pattern can be established. This requires analysing muscular and kinematic data from amputee gait to determine which events are critical in prosthetic gait, which muscles are activated or most demanded during gait, how the centre of mass behaves in an amputee's gait, and other parameters. The main goal of this dissertation is therefore to investigate and propose the best way (i.e., a paradigm) to apply a vibrotactile stimulus for use in a biofeedback device during rehabilitation sessions.
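
    To make the event-triggered idea concrete, the sketch below shows one way a vibrotactile cue could be tied to a single gait event (heel strike detected from a shank accelerometer). The sampling rate, threshold, pulse duration, and function names are illustrative assumptions, not the paradigm proposed in the dissertation.

```python
# Minimal sketch (hypothetical thresholds and device interface): trigger a
# vibrotactile cue at heel strike, one candidate gait event for the stimulus paradigm.
import numpy as np

FS = 100          # sampling rate of the shank accelerometer, Hz (assumed)
THRESHOLD = 1.8   # vertical acceleration threshold in g marking heel strike (assumed)

def detect_heel_strikes(acc_z: np.ndarray) -> np.ndarray:
    """Return sample indices where acceleration crosses the threshold upward."""
    above = acc_z > THRESHOLD
    return np.where(~above[:-1] & above[1:])[0] + 1

def schedule_vibration(event_samples: np.ndarray, pulse_ms: int = 150):
    """Convert event indices to (onset_s, duration_s) vibration commands."""
    return [(i / FS, pulse_ms / 1000.0) for i in event_samples]

if __name__ == "__main__":
    t = np.arange(0, 10, 1 / FS)
    acc_z = 1.0 + 1.2 * (np.sin(2 * np.pi * 1.0 * t) > 0.95)  # toy gait-like signal
    pulses = schedule_vibration(detect_heel_strikes(acc_z))
    print(pulses[:3])
```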

    Obstacle detection display for visually impaired: Coding of direction, distance, and height on a vibrotactile waist band

    Electronic travel aids (ETAs) can potentially increase the safety and comfort of blind users by detecting and displaying obstacles outside the range of the white cane. In a series of experiments, we aim to balance the amount of information displayed against the comprehensibility of the information, taking into account the risk of information overload. In Experiment 1, we investigate perception of compound signals displayed on a tactile vest while walking. The results confirm that the threat of information overload is clear and present. Tactile coding parameters that are sufficiently discriminable in isolation may not be so in compound signals and while walking and using the white cane. Horizontal tactor location is a strong coding parameter, and temporal pattern is the preferred secondary coding parameter. Vertical location is also possible as a coding parameter, but it requires additional tactors and makes the display hardware more complex, more expensive, and less user friendly. In Experiment 2, we investigate how we can off-load the tactile modality by moving part of the information to an auditory display. Off-loading the tactile modality through auditory presentation is possible, but this off-loading is limited and may create a new threat of auditory overload. In addition, taxing the auditory channel may in turn interfere with other auditory cues from the environment. In Experiment 3, we off-load the tactile sense by reducing the amount of displayed information using several filter rules. The resulting design was evaluated in Experiment 4 with visually impaired users. Although they acknowledge the potential of the display, the added value of the ETA as a whole also depends on its sensor and object recognition capabilities. We recommend using no more than two coding parameters in a tactile compound message and applying filter rules to reduce the number of obstacles to be displayed in an obstacle avoidance ETA.
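
    A minimal sketch of the kind of coding and filtering described above (assumed tactor count, range limit, and distance-to-pattern mapping; not the authors' implementation): obstacle bearing selects a waist-band tactor, distance selects a temporal pattern, and filter rules keep only the nearest obstacle per direction.

```python
# Illustrative encoding of obstacles onto a vibrotactile waist band.
from dataclasses import dataclass

N_TACTORS = 8           # assumed number of tactors around the waist band
MAX_RANGE_M = 4.0       # assumed display range beyond the white cane

@dataclass
class Obstacle:
    bearing_deg: float  # 0 = straight ahead, positive = clockwise
    distance_m: float

def tactor_index(bearing_deg: float) -> int:
    """Horizontal location coding: nearest tactor to the obstacle bearing."""
    sector = 360.0 / N_TACTORS
    return int(((bearing_deg % 360.0) + sector / 2) // sector) % N_TACTORS

def temporal_pattern(distance_m: float) -> str:
    """Secondary coding: faster pulsing for nearer obstacles (assumed mapping)."""
    return "fast_pulse" if distance_m < 1.5 else "slow_pulse"

def filter_and_encode(obstacles):
    nearest = {}
    for o in obstacles:
        if o.distance_m > MAX_RANGE_M:
            continue                      # filter rule: ignore far obstacles
        i = tactor_index(o.bearing_deg)
        if i not in nearest or o.distance_m < nearest[i].distance_m:
            nearest[i] = o                # filter rule: keep nearest per direction
    return {i: temporal_pattern(o.distance_m) for i, o in nearest.items()}

print(filter_and_encode([Obstacle(-10, 1.0), Obstacle(5, 2.5), Obstacle(90, 6.0)]))
```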

    Testing a Shape-Changing Haptic Navigation Device With Vision-Impaired and Sighted Audiences in an Immersive Theater Setting

    Flatland was an immersive “in-the-wild” experimental theater and technology project, undertaken with the goal of developing systems that could assist “real-world” pedestrian navigation for both vision-impaired (VI) and sighted individuals, while also exploring inclusive and equivalent cultural experiences for VI and sighted audiences. A novel shape-changing handheld haptic navigation device, the “Animotus,” was developed. The device can modify its form in the user's grasp to communicate heading and proximity to navigational targets. Flatland provided a unique opportunity to comparatively study the use of novel navigation devices with a large group of individuals (79 sighted, 15 VI) who were primarily attending a theater production rather than an experimental study. In this paper, we present our findings on comparing the navigation performance (measured in terms of efficiency, average pace, and time facing targets) and opinions of VI and sighted users of the Animotus as they negotiated the 112 m² production environment. Differences in navigation performance were nonsignificant across VI and sighted individuals, and a similar range of opinions on device function and engagement spanned both groups. We believe that more structured device familiarization, particularly for VI users, could improve performance and correct mistaken technology expectations (such as an assumed obstacle avoidance capability), which influenced overall opinion. This paper is intended to aid the development of future inclusive technologies and cultural experiences.
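
    The sketch below illustrates the kind of mapping such a shape-changing device implies: heading error drives a rotation of the device's top half and proximity drives an extension. The actuation ranges and function names are assumptions for illustration, not the Animotus firmware.

```python
# Hedged sketch: turn user pose and target position into a (rotation, extension) command.
import math

def heading_error_deg(user_xy, user_heading_deg, target_xy):
    """Signed angle from the user's facing direction to the target, in degrees."""
    dx, dy = target_xy[0] - user_xy[0], target_xy[1] - user_xy[1]
    bearing = math.degrees(math.atan2(dy, dx))
    return (bearing - user_heading_deg + 180.0) % 360.0 - 180.0

def animotus_command(user_xy, user_heading_deg, target_xy,
                     max_rotation_deg=30.0, max_extension_mm=10.0, max_range_m=20.0):
    err = heading_error_deg(user_xy, user_heading_deg, target_xy)
    dist = math.hypot(target_xy[0] - user_xy[0], target_xy[1] - user_xy[1])
    rotation = max(-max_rotation_deg, min(max_rotation_deg, err))        # heading cue
    extension = max_extension_mm * min(dist, max_range_m) / max_range_m  # proximity cue
    return rotation, extension

print(animotus_command((0.0, 0.0), 90.0, (3.0, 4.0)))
```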

    Designing Tactile Interfaces for Abstract Interpersonal Communication, Pedestrian Navigation and Motorcyclists Navigation

    The tactile medium of communication with users is appropriate for displaying information in situations where the auditory and visual mediums are saturated. There are situations where a subject's ability to receive information through either of these channels is severely restricted by the environment they are in or by any physical impairments that the subject may have. In this project, we have focused on two groups of users who need sustained visual and auditory focus in their task: Soldiers on the battlefield and motorcyclists. Soldiers on the battlefield use their visual and auditory capabilities to maintain awareness of their environment and to guard themselves from enemy assault. One of the major challenges to coordination in a hazardous environment is maintaining communication between team members while mitigating cognitive load. Compromise in communication between team members may result in mistakes that can adversely affect the outcome of a mission. We have built two vibrotactile displays, Tactor I and Tactor II, each with nine actuators arranged in a three-by-three matrix with differing contact areas, which can represent a total of 511 shapes. We used two dimensions of the tactile medium, shapes and waveforms, to represent verb phrases and evaluated the ability of users to perceive verb phrases from the tactile code. We evaluated the effectiveness of communicating verb phrases while the users were performing two tasks simultaneously. The results showed that performing an additional visual task did not affect the accuracy or the time taken to perceive tactile codes. Another challenge in coordinating Soldiers on a battlefield is navigating them to their respective assembly areas. We have developed HaptiGo, a lightweight haptic vest that provides pedestrians with both navigational intelligence and obstacle detection capabilities. HaptiGo consists of optimally placed vibro-tactile sensors that utilize natural and small form factor interaction cues, thus emulating the sensation of being passively guided towards the intended direction. We evaluated HaptiGo and found that it was able to successfully navigate users with timely alerts of incoming obstacles without increasing cognitive load, thereby increasing their environmental awareness. Additionally, we show that users are able to respond to directional information without training. The needs of motorcyclists are different from those of Soldiers. Motorcyclists' need to maintain visual and auditory situational awareness at all times is crucial since they are highly exposed on the road. Route guidance systems, such as the Garmin, have been well tested on automobilists, but remain much less safe for use by motorcyclists. Audio/visual routing systems decrease motorcyclists' situational awareness and vehicle control, and thus increase the chances of an accident. To enable motorcyclists to take advantage of route guidance while maintaining situational awareness, we created HaptiMoto, a wearable haptic route guidance system. HaptiMoto uses tactile signals to encode the distance and direction of approaching turns, thus avoiding interference with audio/visual awareness. Evaluations show that HaptiMoto is intuitive for motorcyclists, and a safer alternative to existing solutions.
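
    As a small illustration of the shape dimension, the sketch below enumerates the 511 possible shapes of a three-by-three actuator matrix (all non-empty subsets of nine actuators) and binds verb phrases to (shape, waveform) pairs. The phrase table is invented for illustration, not the codes used in the study.

```python
# Why a 3-by-3 tactor matrix yields 511 shapes, plus a hypothetical verb-phrase code.
from itertools import combinations

ACTUATORS = range(9)                      # 3x3 matrix, row-major indices 0..8
shapes = [frozenset(c) for n in range(1, 10) for c in combinations(ACTUATORS, n)]
assert len(shapes) == 2**9 - 1 == 511     # all non-empty actuator subsets

WAVEFORMS = ("continuous", "pulsed")      # second coding dimension (assumed set)

# Hypothetical tactile codes for two verb phrases.
TACTILE_CODE = {
    "move forward": (frozenset({1, 4, 7}), "continuous"),   # middle column
    "take cover":   (frozenset({0, 2, 6, 8}), "pulsed"),    # four corners
}

def render(phrase: str):
    shape, waveform = TACTILE_CODE[phrase]
    return sorted(shape), waveform

print(render("move forward"))
```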

    Relative vibrotactile spatial acuity of the torso

    While tactile acuity for pressure has been extensively investigated, far less is known about acuity for vibrotactile stimulation. Vibrotactile acuity is important, however, as such stimulation is used in many applications, including sensory substitution devices. We tested discrimination of vibrotactile stimulation from eccentric rotating mass motors with in-plane vibration. In 3 experiments, we tested gradually decreasing center-to-center (c/c) distances from 30 mm (experiment 1) to 13 mm (experiment 3). Observers judged whether a second vibrating stimulator (‘tactor’) was to the left or right of, or in the same place as, a first one that came on 250 ms before the onset of the second (with a 50-ms inter-stimulus interval). The results show that while accuracy tends to decrease the closer the tactors are, discrimination accuracy is still well above chance for the smallest distance, placing the threshold for vibrotactile stimulation well below 13 mm, lower than recent estimates. The results cast new light on vibrotactile sensitivity and can furthermore be of use in the design of devices that convey information through vibrotactile stimulation.
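
    The above-chance claim can be made concrete with an exact binomial test against the one-in-three guessing rate of the left/right/same task; the trial counts in the sketch below are invented for illustration.

```python
# Exact one-sided binomial test: does accuracy at the smallest spacing beat chance?
from math import comb

def binomial_p_above_chance(correct: int, trials: int, chance: float = 1 / 3) -> float:
    """P(X >= correct) under Binomial(trials, chance)."""
    return sum(comb(trials, k) * chance**k * (1 - chance)**(trials - k)
               for k in range(correct, trials + 1))

# e.g. 42 correct of 90 trials at 13 mm centre-to-centre spacing (hypothetical data)
print(binomial_p_above_chance(42, 90))
```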

    Neuromorphic vibrotactile stimulation of fingertips for encoding object stiffness in telepresence sensory substitution and augmentation applications

    We present a tactile telepresence system for real-time transmission of information about object stiffness to the human fingertips. Experimental tests were performed across two laboratories (Italy and Ireland). In the Italian laboratory, a mechatronic sensing platform indented different rubber samples. Information about rubber stiffness was converted into on-off events using a neuronal spiking model and sent to a vibrotactile glove in the Irish laboratory. Participants discriminated the variation of the stiffness of stimuli according to a two-alternative forced choice protocol. Stiffness discrimination was based on the variation of the temporal pattern of spikes generated during the indentation of the rubber samples. The results suggest that vibrotactile stimulation can effectively simulate surface stiffness when using neuronal spiking models to trigger vibrations in the haptic interface. Specifically, fractional variations of stiffness down to 0.67 were significantly discriminated with the developed neuromorphic haptic interface. This performance is comparable to, though slightly worse than, the threshold obtained in a benchmark experiment in which the same set of stimuli was evaluated naturally with the participants' own hands. Our paper presents a bioinspired method for delivering sensory feedback about object properties to human skin based on contingency-mimetic neuronal models, and can be useful for the design of high-performance haptic devices.
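
    As a hedged illustration of the neuromorphic encoding, the sketch below uses a leaky integrate-and-fire unit (one possible spiking model; the paper's exact model and parameters are not reproduced here) to turn an indentation-force ramp into on-off spike events that could trigger vibrotactile pulses on the glove. All constants are illustrative.

```python
# Leaky integrate-and-fire sketch: stiffer samples produce steeper force ramps,
# which in turn produce more and earlier spike events.
import numpy as np

def lif_spikes(force: np.ndarray, dt=1e-3, tau=0.02, gain=20.0, threshold=1.0):
    """Return spike times (s) for a leaky integrate-and-fire unit driven by force."""
    v, spikes = 0.0, []
    for i, f in enumerate(force):
        v += dt * (-v / tau + gain * f)   # leaky integration of the input force
        if v >= threshold:
            spikes.append(i * dt)
            v = 0.0                       # reset after each spike (on-off event)
    return spikes

t = np.arange(0, 0.5, 1e-3)
stiff_spikes = lif_spikes(12.0 * t)       # stiffer sample: force rises faster
soft_spikes = lif_spikes(6.0 * t)         # softer sample: slower force rise
print(len(stiff_spikes), len(soft_spikes))
```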

    Augmenting the Spatial Perception Capabilities of Users Who Are Blind

    People who are blind face a series of challenges and limitations resulting from their inability to see, forcing them either to seek the assistance of a sighted individual or to work around the challenge by way of an inefficient adaptation (e.g., following the walls of a room in order to reach a door rather than walking in a straight line to the door). These challenges are directly related to blind users' lack of the spatial perception capabilities normally provided by the human vision system. In order to overcome these spatial perception related challenges, modern technologies can be used to convey spatial perception data through sensory substitution interfaces. This work is the culmination of several projects which address varying spatial perception problems for blind users. First, we consider the development of non-visual natural user interfaces for interacting with large displays. This work explores the haptic interaction space in order to find useful and efficient haptic encodings for the spatial layout of items on large displays. Multiple interaction techniques are presented which build on prior research (Folmer et al. 2012), and the efficiency and usability of the most efficient of these encodings is evaluated with blind children. Next, we evaluate the use of wearable technology in aiding navigation of blind individuals through large open spaces lacking the tactile landmarks used during traditional white cane navigation. We explore the design of a computer vision application with an unobtrusive aural interface to minimize veering of the user while crossing a large open space. Together, these projects represent an exploration into the use of modern technology in augmenting the spatial perception capabilities of blind users.
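
    For the veering-minimisation idea, here is a sketch of one plausible mapping (assumed geometry and cue names, not the dissertation's code): the signed lateral deviation from the intended straight crossing line selects a left- or right-ear audio cue.

```python
# Map lateral deviation from a straight crossing path to a corrective audio cue.
import math

def lateral_deviation_m(start_xy, goal_xy, user_xy):
    """Signed distance of the user from the line start->goal (negative = left of path)."""
    sx, sy = start_xy; gx, gy = goal_xy; ux, uy = user_xy
    dx, dy = gx - sx, gy - sy
    return ((ux - sx) * dy - (uy - sy) * dx) / math.hypot(dx, dy)

def aural_cue(deviation_m, dead_zone_m=0.3):
    if abs(deviation_m) < dead_zone_m:
        return "silence"                      # on course: keep the auditory channel quiet
    # veered right (positive) -> cue in the left ear to steer the walker back, and vice versa
    return "tone_right_ear" if deviation_m < 0 else "tone_left_ear"

print(aural_cue(lateral_deviation_m((0, 0), (0, 20), (0.8, 5))))
```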

    A survey on hardware and software solutions for multimodal wearable assistive devices targeting the visually impaired

    The market penetration of user-centric assistive devices has rapidly increased in the past decades. Growth in computational power, accessibility, and cognitive device capabilities has been accompanied by significant reductions in weight, size, and price, as a result of which mobile and wearable equipment is becoming part of our everyday life. In this context, a key focus of development has been on rehabilitation engineering and on developing assistive technologies targeting people with various disabilities, including hearing loss, visual impairments and others. Applications range from simple health monitoring such as sport activity trackers, through medical applications including sensory (e.g. hearing) aids and real-time monitoring of life functions, to task-oriented tools such as navigational devices for the blind. This paper provides an overview of recent trends in software and hardware-based signal processing relevant to the development of wearable assistive solutions.

    HapticHead - Augmenting Reality via Tactile Cues

    Information overload is increasingly becoming a challenge in today's world. Humans have only a limited amount of attention to allocate between sensory channels and tend to miss or misjudge critical sensory information when multiple activities are going on at the same time. For example, people may miss the sound of an approaching car when walking across the street while looking at their smartphones. Some sensory channels may also be impaired due to congenital or acquired conditions. Among the sensory channels, touch is often experienced as obtrusive, especially when it occurs unexpectedly. Since tactile actuators can simulate touch, targeted tactile stimuli can provide users of virtual reality and augmented reality environments with important information for navigation, guidance, alerts, and notifications. In this dissertation, a tactile user interface around the head, called HapticHead, is presented to relieve or replace a potentially impaired visual channel. It is a high-resolution, omnidirectional, vibrotactile display that presents general, 3D directional, and distance information through dynamic tactile patterns. The head is well suited for tactile feedback because it is sensitive to mechanical stimuli and provides a large spherical surface area that enables the display of precise 3D information and allows the user to intuitively rotate the head in the direction of a stimulus based on natural mapping. Basic research on tactile perception on the head and studies on various use cases of head-based tactile feedback are presented in this thesis. Several investigations and user studies have been conducted on (a) the funneling illusion and the localization accuracy of tactile stimuli around the head, (b) the ability of people to discriminate between different tactile patterns on the head, (c) approaches to designing tactile patterns for complex arrays of actuators, (d) increasing the immersion and presence level of virtual reality applications, and (e) assisting people with visual impairments in guidance and micro-navigation. In summary, tactile feedback around the head was found to be highly valuable as an additional information channel in various application scenarios. Most notable is the navigation of visually impaired individuals through a micro-navigation obstacle course, which is an order of magnitude more accurate than the previous state of the art, which used a tactile belt as the feedback modality. The ability of the HapticHead tactile user interface to safely navigate people with visual impairments around obstacles and on stairs, with a mean deviation from the optimal path of less than 6 cm, may ultimately improve the quality of life for many people with visual impairments.
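
    A hedged sketch of how a head-worn array can present a 3D direction by exploiting the funneling illusion: intensity is shared between the actuators best aligned with the target direction so that a phantom sensation appears between them. The actuator layout and weighting rule are illustrative assumptions, not the HapticHead implementation.

```python
# Distribute intensity over the actuators nearest to a target direction (funneling).
import numpy as np

# A few actuator positions as unit vectors on the head (front, right, back, left, top).
ACTUATORS = np.array([[1, 0, 0], [0, 1, 0], [-1, 0, 0], [0, -1, 0], [0, 0, 1]], float)

def actuator_intensities(direction, k=2):
    """Share intensity between the k actuators closest to the target direction."""
    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    similarity = ACTUATORS @ d                   # cosine similarity to each actuator
    nearest = np.argsort(similarity)[-k:]        # indices of the k best-aligned actuators
    weights = np.clip(similarity[nearest], 0, None)
    out = np.zeros(len(ACTUATORS))
    if weights.sum() > 0:
        out[nearest] = weights / weights.sum()   # funneling: phantom sensation between them
    return out

print(actuator_intensities([1, 1, 0]))           # target between front and right actuators
```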

    Head-mounted Sensory Augmentation System for Navigation in Low Visibility Environments

    Sensory augmentation can be used to assist in some tasks where sensory information is limited or sparse. This thesis focuses on the design and investigation of a head-mounted vibrotactile sensory augmentation interface to assist navigation in low visibility environments, such as firefighters' navigation or travel aids for visually impaired people. A novel head-mounted vibrotactile interface comprising a 1-by-7 vibrotactile display worn on the forehead is developed. A series of psychophysical studies is carried out with this display to (1) determine the vibrotactile absolute threshold, (2) investigate the accuracy of vibrotactile localization, and (3) evaluate the funneling illusion and apparent motion as sensory phenomena that could be used to communicate navigation signals. The results of these studies provide guidelines for the design of head-mounted interfaces. A second-generation head-mounted sensory augmentation interface, the Mark-II Tactile Helmet, is developed for the application of firefighters' navigation. It consists of a ring of ultrasound sensors mounted on the outside of a helmet, a microcontroller, two batteries, and a refined vibrotactile display composed of seven vibration motors, based on the results of the aforementioned psychophysical studies. A ‘tactile language’, that is, a set of distinguishable vibrotactile patterns, is developed for communicating navigation commands through the Mark-II Tactile Helmet. Four possible combinations of two command presentation modes (continuous, discrete) and two command types (recurring, single) are evaluated for their effectiveness in guiding users along a virtual wall in a structured environment. The continuous and discrete presentation modes use spatiotemporal patterns that induce the experience of apparent movement and discrete movement on the forehead, respectively. The recurring command type presents the tactile command repeatedly, with an interval of 500 ms between patterns, while the single command type presents the tactile command just once, when there is a change in the command. The effectiveness of this tactile language is evaluated using the objective measures of the users' walking speed and the smoothness of their trajectory parallel to the virtual wall, and the subjective measures of utility and comfort employing Likert-type rating scales. The Recurring Continuous (RC) commands, which exploit the phenomenon of apparent motion, are most effective in generating efficient routes and fast travel, and are most preferred. Finally, the optimal tactile language (RC) is compared with audio guidance using verbal instructions to investigate effectiveness in delivering navigation commands. The results show that haptic guidance leads to better performance as well as lower cognitive workload compared to auditory feedback. This research demonstrates that a head-mounted sensory augmentation interface can enhance spatial awareness in low visibility environments and could help firefighters navigate by providing them with supplementary sensory information.
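
    The preferred Recurring Continuous (RC) mode can be sketched as a spatiotemporal sweep across the seven forehead motors that repeats after a 500 ms gap; the step and overlap durations below are assumptions, not the study's exact parameters.

```python
# Generate a recurring-continuous sweep for a 1-by-7 forehead display.
def rc_turn_command(direction: str, step_ms=100, overlap_ms=40, gap_ms=500, repeats=3):
    """Return (motor, onset_ms, duration_ms) triples for a recurring apparent-motion sweep."""
    motors = range(7) if direction == "right" else range(6, -1, -1)
    pattern, t = [], 0
    for _ in range(repeats):
        for m in motors:
            pattern.append((m, t, step_ms + overlap_ms))  # overlapping onsets -> apparent motion
            t += step_ms
        t += gap_ms                                       # pause before the command recurs
    return pattern

for cmd in rc_turn_command("left", repeats=1)[:3]:
    print(cmd)
```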