
    Exploitation of haptic renderings to communicate risk levels of falling

    Falls are a major cause of injury and can lead to death. This is even more pronounced among the elderly: aging brings deteriorations (gait disturbances, balance disorders, and sensorimotor impairments) that may lead to falls. The research project presented in this thesis addresses the problem of reducing the risk of falling. It proposes a solution for communicating haptic information to reduce that risk, as part of the design of a haptic communication system in a controlled environment. This new system introduces the notion of haptic perception through the communication of information by touch via the foot, a channel the literature rarely mentions. To design this system, we first studied the use of tactile stimuli to evaluate whether a risk level can be communicated through the haptic modality. Then, having hypothesized that certain factors could influence the communication of stimuli representing fall-risk levels, we conducted a second study to evaluate the effect of auditory disturbances during the communication of these stimuli. Third, to determine whether the user has the time needed to act after perceiving the risk level, we analyzed the variation of simple reaction time while walking on different types of ground. These results encouraged us to conduct a fourth assessment of reaction time using a new device, coupled with a smartphone, that can be positioned at different locations on the body. Several experiments were carried out to validate each of these steps. With this work, we can now communicate a fall-risk level to users through the haptic channel using an active device and easily differentiable stimuli; we can evaluate auditory factors during such haptic perception; and we can evaluate users' physiological characteristics (response time) while seated and while walking on different types of ground.
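    The abstract does not spell out the stimulus encoding, but the idea of easily differentiable foot-based cues can be illustrated. Below is a minimal sketch, assuming three discrete risk levels and an invented pulse-count/duration encoding; the level names and timing values are illustrative, not the thesis's actual parameters.

```python
from dataclasses import dataclass
from enum import Enum


class RiskLevel(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3


@dataclass
class VibrationPattern:
    pulses: int    # number of discrete bursts
    pulse_ms: int  # duration of each burst
    gap_ms: int    # silence between bursts


# Hypothetical encoding: higher risk -> more, longer pulses, chosen so the
# three patterns stay easy to tell apart under the foot.
PATTERNS = {
    RiskLevel.LOW: VibrationPattern(pulses=1, pulse_ms=200, gap_ms=0),
    RiskLevel.MEDIUM: VibrationPattern(pulses=2, pulse_ms=300, gap_ms=150),
    RiskLevel.HIGH: VibrationPattern(pulses=4, pulse_ms=400, gap_ms=100),
}


def render(level: RiskLevel) -> VibrationPattern:
    """Return the drive pattern a foot-worn actuator would play for a level."""
    return PATTERNS[level]


if __name__ == "__main__":
    for level in RiskLevel:
        print(level.name, render(level))
```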

    Developing an interactive overview for non-visual exploration of tabular numerical information

    This thesis investigates the problem of obtaining overview information from complex tabular numerical data sets non-visually. Blind and visually impaired people need to access and analyse numerical data, both in education and in professional occupations. Obtaining an overview is a necessary first step in data analysis, for which current non-visual data accessibility methods offer little support. This thesis describes a new interactive parametric sonification technique called High-Density Sonification (HDS), which facilitates extracting overview information from the data easily and efficiently by rendering multiple data points as single auditory events. Beyond obtaining an overview of the data, experimental studies showed that the capabilities of human auditory perception and cognition to extract meaning from HDS representations could be used to reliably estimate relative arithmetic mean values within large tabular data sets. Following a user-centred design methodology, HDS was implemented as the primary form of overview information display in a multimodal interface called TableVis. This interface supports the active process of interactive, non-visual data exploration, making use of proprioception to maintain contextual information during exploration (non-visual focus+context), vibrotactile data annotations (EMA-Tactons) that can be used as external memory aids to prevent high mental workload, and speech synthesis to access detailed information on demand. A series of empirical studies quantified the performance attained in exploring tabular data sets for overview information using TableVis, comparing HDS with the main current non-visual accessibility technique (speech synthesis) and quantifying the effect of data set size on user performance; HDS resulted in better performance than speech, and that performance was not heavily dependent on the size of the data set. In addition, levels of subjective workload during exploration tasks using TableVis were investigated, resulting in the proposal of EMA-Tactons, vibrotactile annotations that the user can add to the data in order to prevent working-memory saturation in the most demanding data exploration scenarios. An experimental evaluation found that EMA-Tactons significantly reduced mental workload in data exploration tasks. Thus, the work described in this thesis provides a basis for interactive non-visual exploration of numerical data tables across a broad range of sizes, offering techniques to extract overview information quickly, perform perceptual estimations of data descriptors (relative arithmetic mean), and manage mental workload through vibrotactile data annotations, while linking seamlessly with explorations at different levels of detail and preserving spatial data representation metaphors to support collaboration with sighted users.
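    The core idea of HDS, rendering many data points as a single auditory event, lends itself to a small worked example. The sketch below is an assumed simplification, not the TableVis implementation: it maps the arithmetic mean of each table row to a pitch on a logarithmic scale, so comparing two rows by their relative means becomes comparing two tones.

```python
def hds_pitch(row, data_min, data_max, f_low=220.0, f_high=880.0):
    """Render a whole row as one auditory event: its mean mapped to a pitch.

    The mean is normalised against the table's global range and placed on a
    logarithmic frequency scale, which matches how pitch is perceived.
    """
    mean = sum(row) / len(row)
    t = (mean - data_min) / (data_max - data_min)  # 0..1 within the table
    return f_low * (f_high / f_low) ** t           # log-scale interpolation


table = [
    [12, 15, 14, 13],  # row A
    [40, 42, 39, 41],  # row B
]
lo = min(v for row in table for v in row)
hi = max(v for row in table for v in row)
for name, row in zip("AB", table):
    print(f"row {name}: {hds_pitch(row, lo, hi):.1f} Hz")
```

    Row B's clearly higher tone reflects its higher mean, which is the kind of perceptual relative-mean judgement the empirical studies measured.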

    A vibrotactile display design for the feedback of external prosthesis sensory information to the amputee wearer

    Thesis (S.M.)--Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2006. By Andrea W. Chew. Includes bibliographical references (p. 60-64). This thesis documents the development of a vibrotactile display to be incorporated into a powered ankle-foot prosthesis. Although existing devices have addressed the need for tactile and proprioceptive feedback in external prostheses, there has not yet been an attempt to develop and clinically evaluate a comprehensive vibrotactile display and signaling scheme for use with an active myoelectric prosthesis. The development and evaluation of two hardware solutions are presented: an array of vibrating pancake motors embedded into the exterior of a carbon-fiber prosthetic socket, and an array of vibrating pancake motors embedded into a silicone socket liner. Three haptic mappings were designed based on previous work in psychophysics, haptics, and HCI: a spatial discrimination pattern, an amplitude-modulated pattern, and a gap detection pattern. To assess the effectiveness of the system, lower-limb amputees were asked to learn the three haptic mappings and use the feedback system to drive a virtual ankle to a desired position using a physical knob interface. Results show an overall recognition rate of 85% across the three haptic mappings and error response averages ranging from 8.2 s to 11.6 s. The high recognition rates and lack of variance between the mappings suggest that the three vibration parameters of spatial discrimination, amplitude modulation, and gap detection may be successfully used to represent different ankle parameters. However, the overall successful integration of the vibrotactile display ultimately depends on the interaction between the components of the whole prosthetic system.
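    The three haptic mappings are named but not parameterised in this abstract. As a hypothetical illustration only, the sketch below drives a five-motor array from a normalised ankle angle under each scheme; the motor count and timing constants are assumptions.

```python
def spatial(angle, n_motors=5):
    """Spatial discrimination: activate the one motor whose position encodes angle."""
    idx = min(int(angle * n_motors), n_motors - 1)
    return [1.0 if i == idx else 0.0 for i in range(n_motors)]


def amplitude(angle, n_motors=5):
    """Amplitude modulation: all motors vibrate; drive intensity encodes angle."""
    return [angle] * n_motors


def gap(angle, base_gap_ms=600):
    """Gap detection: the pause between pulses shrinks as the angle grows."""
    return max(50, int(base_gap_ms * (1.0 - angle)))


angle = 0.7  # normalised ankle position in [0, 1]
print(spatial(angle))    # [0.0, 0.0, 0.0, 1.0, 0.0]
print(amplitude(angle))  # [0.7, 0.7, 0.7, 0.7, 0.7]
print(gap(angle))        # 180
```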

    Somatic ABC's: A Theoretical Framework for Designing, Developing and Evaluating the Building Blocks of Touch-Based Information Delivery

    Situations of sensory overload are steadily becoming more frequent as the ubiquity of technology approaches reality, particularly with the advent of socio-communicative smartphone applications and pervasive, high-speed wireless networks. Although the ease of accessing information has improved our communication effectiveness and efficiency, our visual and auditory modalities, the modalities that today's computerized devices and displays largely engage, have become overloaded, creating possibilities for distraction, delay and high cognitive load, which in turn can lead to a loss of situational awareness and increase the chances of life-threatening situations such as texting while driving. Surprisingly, alternative modalities for information delivery have seen little exploration. Touch, in particular, is a promising candidate, given that the skin is our largest sensory organ and has impressive spatial and temporal acuity. Although some approaches have been proposed for touch-based information delivery, they are not without limitations, including high learning curves, limited applicability and/or limited expressiveness. This is largely due to the lack of a versatile, comprehensive design theory: specifically, a theory that addresses the design of touch-based building blocks for expandable, efficient, rich and robust touch languages that are easy to learn and use. Moreover, beyond design, there is a lack of implementation and evaluation theories for such languages. To overcome these limitations, a unified theoretical framework inspired by natural spoken language is proposed, called Somatic ABC's, for Articulating (designing), Building (developing) and Confirming (evaluating) touch-based languages. To evaluate the usefulness of Somatic ABC's, its design, implementation and evaluation theories were applied to create communication languages for two very different application areas: audio-described movies and motor learning. These applications were chosen because they presented opportunities for complementing communication by offloading information, typically conveyed visually and/or aurally, to the skin. For both studies, it was found that Somatic ABC's aided the design, development and evaluation of rich somatic languages with distinct and natural communication units. Dissertation/Thesis, Ph.D. Computer Science, 201
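    Somatic ABC's is a design theory rather than code, but its spoken-language analogy suggests composing tactile primitives into larger units, much as phonemes compose words. The sketch below is purely hypothetical: the primitive names, body sites and two-entry lexicon are invented to illustrate the analogy.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Phoneme:
    """A hypothetical touch-based building block: one actuator event."""
    location: str    # body site of the actuator
    duration_ms: int
    intensity: float  # 0..1 drive level


# Invented primitives standing in for the framework's building blocks.
TAP = Phoneme("wrist", 80, 1.0)
BUZZ = Phoneme("wrist", 400, 0.6)
PAUSE = Phoneme("wrist", 200, 0.0)

# A "word" in the touch language is an ordered sequence of primitives,
# mirroring how spoken words are sequences of phonemes.
LEXICON = {
    "attend": [TAP, TAP],
    "slow-down": [BUZZ, PAUSE, BUZZ],
}


def articulate(word: str):
    """Look up the tactile sequence that expresses a word."""
    return LEXICON[word]


print(articulate("slow-down"))
```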

    Effects of modality, urgency and situation on responses to multimodal warnings for drivers

    Signifying road-related events with warnings can be highly beneficial, especially when immediate attention is needed. This thesis describes how modality, urgency and situation influence driver responses to multimodal displays used as warnings. These displays utilise all combinations of the audio, visual and tactile modalities, reflecting different urgency levels. In this way, a new, rich set of cues is designed that conveys information multimodally to enhance reactions during driving, which is a highly visual task. The importance of the signified events to driving is reflected in the warnings, and safety-critical or non-critical situations are communicated through the cues. Novel warning designs are considered, using both abstract displays, with no semantic association to the signified event, and language-based ones using speech. These two cue designs are compared to discover their strengths and weaknesses as car alerts. The situations in which the new cues are delivered are varied by simulating both critical and non-critical events and both manual and autonomous car scenarios. Finally, a novel set of guidelines for using multimodal driver displays is provided, considering the modalities utilised, the urgency signified and the situation simulated.
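    Since the displays use all combinations of the three channels, the cue space can be enumerated mechanically. The sketch below pairs each of the seven non-empty modality subsets with an assumed three-level urgency scale; the urgency labels are assumptions, not necessarily the thesis's exact levels.

```python
from itertools import combinations

MODALITIES = ("audio", "visual", "tactile")
URGENCY = ("low", "medium", "high")  # assumed three-level scale

# Every non-empty subset of the three channels: 7 combinations in total.
cues = [frozenset(c)
        for r in range(1, len(MODALITIES) + 1)
        for c in combinations(MODALITIES, r)]

# One candidate warning per (combination, urgency) pair: 21 cues.
warnings = [(sorted(cue), level) for cue in cues for level in URGENCY]
print(len(warnings))  # 21
for cue, level in warnings[:3]:
    print(cue, level)
```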

    DOKY: A Multi-Modal User Interface for Non-Visual Presentation, Navigation and Manipulation of Structured Documents on Mobile and Wearable Devices

    There are a large number of highly structured documents available on the Internet. The logical document structure is very important for readers to handle the document content efficiently. In graphical user interfaces, each logical structure element is represented by a specific visualisation, a graphical icon. This representation allows sighted readers to recognise the structure at a glance, and it enables direct navigation and manipulation. Blind and visually impaired persons are unable to use graphical user interfaces, and for the emerging category of mobile and wearable devices, where only small visual displays are available or no visual display at all, a non-visual alternative is required too. A multimodal user interface for non-visual presentation, navigation and manipulation of structured documents on mobile and wearable devices such as smartphones, smart watches or smart tablets has been developed as a result of inductive research among 205 blind and visually impaired participants. It enables the user to get a fast overview of the document structure and to efficiently skim and scan the document content by identifying the type, level, position, length, relationship and content text of each element, as well as to focus, select, activate, move, remove and insert structure elements or text. These interactions are presented non-visually using Earcons, Tactons and synthetic speech utterances, serving the auditory and tactile human senses. Navigation and manipulation are provided through the multitouch, motion (linear acceleration and rotation) or speech-recognition input modalities. It is a complete solution for reading, creating and editing structured documents in a non-visual way, with no special hardware required. The name DOKY is derived from short forms of the terms document and accessibility. A flexible, platform-independent and event-driven software architecture implementing the DOKY user interface, as well as the automated structured-observation research method employed for the investigation into its effectiveness, is presented. Because it is platform- and language-neutral, it can be used in a wide variety of platforms, environments and applications for mobile and wearable devices. Each component is defined by interfaces and abstract classes only, so that it can be easily changed or extended, and is grouped in a semantically self-contained package. An investigation into the effectiveness of the proposed DOKY user interface was carried out to see whether the proposed user interface and interaction design concepts are effective means for non-visual presentation, navigation and manipulation of structured documents on mobile and wearable devices, through automated structured observations of 876 blind and visually impaired research subjects performing 19 exercises on a highly structured example document using the DOKY Structured Observation App on their own mobile or wearable devices, remotely over the Internet. The results showed that the proposed user interface design concepts for presentation and navigation and the user interaction design concepts for manipulation are effective, and that their effectiveness depends on the input modality and hardware device employed as well as on the use of screen readers.
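    The architecture described, components defined only by interfaces and abstract classes, with events fanned out to swappable output channels, can be sketched compactly. The class and method names below are assumptions, not DOKY's actual API.

```python
from abc import ABC, abstractmethod


class OutputModality(ABC):
    """Abstract presentation channel; concrete channels are swappable."""

    @abstractmethod
    def present(self, element: dict) -> None: ...


class EarconOutput(OutputModality):
    def present(self, element):
        print(f"[earcon] type={element['type']} level={element['level']}")


class SpeechOutput(OutputModality):
    def present(self, element):
        print(f"[speech] {element['type']}: {element['text']}")


class DocumentPresenter:
    """Event-driven core: fans structural events out to registered channels."""

    def __init__(self):
        self._channels: list[OutputModality] = []

    def register(self, channel: OutputModality):
        self._channels.append(channel)

    def on_focus(self, element: dict):
        for channel in self._channels:
            channel.present(element)


presenter = DocumentPresenter()
presenter.register(EarconOutput())
presenter.register(SpeechOutput())
presenter.on_focus({"type": "heading", "level": 2, "text": "Results"})
```

    Because the presenter depends only on the abstract channel, a platform port swaps in new concrete channels without touching the event-driven core, which is the property the abstract emphasises.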

    Interactive maps for visually impaired people: design, usability and spatial cognition

    Knowing the geography of one's urban environment is crucial for visually impaired people. Tactile relief maps are generally used, but they have significant limitations (a limited amount of information, reliance on a braille legend, etc.). Recent technological progress allows the development of innovative solutions that overcome these limitations. In this thesis, we present the design of an accessible interactive map through a participatory design process. The map is composed of a multi-touch screen with a tactile map overlay and speech output, and it provides auditory information when the user double-taps on map elements. We demonstrated experimentally that our prototype was more effective and more satisfying for visually impaired users than a simple raised-line map. We also explored and tested different types of advanced non-visual interaction for exploring the map. This thesis demonstrates the importance of interactive tactile maps for visually impaired people and their spatial cognition.
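    The interaction loop, a double tap on a raised map feature triggering its spoken description, reduces to a hit test plus a speech call. The sketch below assumes invented coordinates, element names and a stubbed speak function; it is not the prototype's code.

```python
from dataclasses import dataclass


@dataclass
class MapElement:
    name: str
    x: float
    y: float
    radius: float  # touch-target size around the raised feature


ELEMENTS = [
    MapElement("bus stop", x=120, y=80, radius=25),
    MapElement("town hall", x=300, y=210, radius=40),
]


def speak(text: str):
    """Stub for the speech output; a real system would call a TTS engine."""
    print(f"(speech) {text}")


def on_double_tap(x: float, y: float):
    """Find the raised-line feature under the finger and announce it."""
    for el in ELEMENTS:
        if (x - el.x) ** 2 + (y - el.y) ** 2 <= el.radius ** 2:
            speak(el.name)
            return
    speak("no map element here")


on_double_tap(125, 83)  # -> (speech) bus stop
```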

    The application of multiple modalities to improve home care and reminder systems

    Existing home care technology tends to consist of pre-programmed systems limited to one or two interaction modalities, which can make it inaccessible to people with sensory impairments and unable to cope with a dynamic and heterogeneous environment such as the home. This thesis presents research that considers how home care technology can be improved by employing multiple visual, aural, tactile and even olfactory interaction methods. A wide range of modalities was tested to gain a better insight into their properties and merits. That information was used to design and construct Dyna-Cue, a prototype multimodal reminder system. Dyna-Cue was designed to use multiple modalities and to switch between them in real time to maintain higher levels of effectiveness and acceptability. The Dyna-Cue prototype was evaluated against other models of reminder delivery and was shown to be an effective and appropriate tool that can help people to manage their time and activities.
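    Dyna-Cue's defining behaviour is choosing a delivery modality at reminder time rather than in advance. The rule below is a hypothetical illustration of such a policy; the context fields and the preference order are assumptions, not the evaluated prototype's logic.

```python
def choose_modality(context: dict) -> str:
    """Pick the reminder channel most likely to be perceived right now.

    Falls through an assumed preference order, skipping channels that the
    current home context or the user's sensory profile rules out.
    """
    if context["user_in_room"] and not context["visually_impaired"]:
        return "visual"    # on-screen reminder where the user is
    if not context["noisy"] and not context["hearing_impaired"]:
        return "aural"     # spoken or tone reminder
    if context["wearing_band"]:
        return "tactile"   # vibration on a worn device
    return "olfactory"     # last-resort ambient scent cue


print(choose_modality({
    "user_in_room": False, "visually_impaired": False,
    "noisy": True, "hearing_impaired": False, "wearing_band": True,
}))  # -> tactile
```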

    Purring Wheel: Thermal and Vibrotactile Notifications on the Steering Wheel

    Haptic feedback can improve safety and driving behaviour. While vibration has been widely studied, other haptic modalities have been neglected. To address this, we present two studies investigating the use of unimodal and bimodal vibrotactile and thermal cues on the steering wheel. First, notifications with three levels of urgency were subjectively rated and then identified during simulated driving. Bimodal feedback showed increased identification time over unimodal vibrotactile cues. Thermal feedback was consistently rated less urgent, showing its suitability for less time-critical notifications, where vibration would be unnecessarily attention-grabbing. The second study investigated more complex thermal and bimodal haptic notifications comprising two different types of information (the nature and the importance of an incoming message). Results showed that both types could be identified with high recognition rates of up to 92% for both together and up to 99% for a single type, opening up a novel design space for haptic in-car feedback.
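    For the second study's bimodal messages, one assumed realisation of the two-dimensional encoding is sketched below: message nature selects the thermal direction and importance selects the vibrotactile pulse count. The specific assignments are illustrative, not the paper's exact design.

```python
# Assumed encoding: message nature -> thermal direction on the wheel rim,
# importance -> vibrotactile pulse count. Values are illustrative only.
NATURE_TO_THERMAL = {
    "social": "warm",      # rim warms by a few degrees
    "navigation": "cool",  # rim cools by a few degrees
}
IMPORTANCE_TO_PULSES = {
    "low": 1,
    "high": 3,
}


def encode(nature: str, importance: str) -> dict:
    """Compose the bimodal steering-wheel cue for an incoming message."""
    return {
        "thermal": NATURE_TO_THERMAL[nature],
        "vibration_pulses": IMPORTANCE_TO_PULSES[importance],
    }


print(encode("navigation", "high"))  # {'thermal': 'cool', 'vibration_pulses': 3}
```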

    Voice and Touch Diagrams (VATagrams): Diagrams for the Visually Impaired

    If a picture is worth a thousand words, would you rather read two pages of text or simply view the image? Most would choose to view the image; however, for the visually impaired this isn’t always an option. Diagrams assist people in visualizing relationships between objects, and most often act as a source for quickly referencing information about those relationships. Diagrams are highly visual, and as such there are few tools to support diagram creation for visually impaired individuals. To give the visually impaired the same advantages in school and work as their sighted colleagues, an accessible diagram tool is needed. A suitable tool for the visually impaired to create diagrams should allow these individuals to: 1. easily define the type of relationship-based diagram to be created; 2. easily create the components of a relationship-based diagram; 3. easily modify the components of a relationship-based diagram; 4. quickly understand the structure of a relationship-based diagram; 5. create a visual representation which can be used by the sighted; and 6. easily access reference points for tracking diagram components. To this end, a series of tool prototypes was developed that allows visually impaired users to read, create, modify and share relationship-based diagrams using sound and gestural touches. This was accomplished by creating a series of applications that run on an iPad with an overlay that restricts the areas in which a user can perform gestures. The prototypes were tested for usability, using measures of efficiency, effectiveness and satisfaction, with visually impaired, blindfolded and sighted participants. The results of the evaluation indicate that the prototypes contain the main building blocks needed to complete a fully functioning iPad application.
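    The capabilities listed above amount to a small relationship-diagram data model with spoken descriptions attached. The sketch below is an assumed model, not the prototype's code; the class names and the spoken wording are invented.

```python
from dataclasses import dataclass, field


@dataclass
class Node:
    label: str


@dataclass
class Diagram:
    """A relationship-based diagram: labelled nodes plus directed links."""
    nodes: list[Node] = field(default_factory=list)
    edges: list[tuple[int, int]] = field(default_factory=list)

    def add_node(self, label: str) -> int:
        self.nodes.append(Node(label))
        return len(self.nodes) - 1

    def relate(self, a: int, b: int):
        self.edges.append((a, b))

    def describe(self, index: int) -> str:
        """Text a speech output would read when a node is touched."""
        outgoing = [self.nodes[b].label for a, b in self.edges if a == index]
        return f"{self.nodes[index].label}, linked to: {', '.join(outgoing) or 'nothing'}"


d = Diagram()
cat = d.add_node("Cat")
mammal = d.add_node("Mammal")
d.relate(cat, mammal)
print(d.describe(cat))  # Cat, linked to: Mammal
```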