
    Tactons: structured tactile messages for non-visual information display

    Tactile displays are now becoming available in a form that can be easily used in a user interface. This paper describes a new form of tactile output. Tactons, or tactile icons, are structured, abstract messages that can be used to communicate information non-visually. A range of different parameters can be used for Tacton construction, including the frequency, amplitude and duration of a tactile pulse, plus other parameters such as rhythm and location. Tactons have the potential to improve interaction in a range of different areas, particularly where the visual display is overloaded, limited in size or not available, such as interfaces for blind people or in mobile and wearable devices. This paper describes Tactons, the parameters used to construct them and some possible ways to design them. Examples of where Tactons might prove useful in user interfaces are given.
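The Tacton parameters listed above (frequency, amplitude, duration, rhythm, location) can be sketched as a small data structure. The class fields, parameter values and the `render` helper below are illustrative assumptions, not an implementation from the paper.

```python
import math
from dataclasses import dataclass

@dataclass
class Tacton:
    """Hypothetical container for the Tacton parameters named in the abstract."""
    frequency_hz: float   # carrier frequency of the tactile pulse
    amplitude: float      # drive level in [0, 1]
    rhythm: list          # (on_ms, off_ms) pairs defining the rhythm
    location: str         # body site of the actuator

def render(tacton, sample_rate=8000):
    """Render the rhythm as a list of drive samples in [-1, 1]."""
    samples = []
    for on_ms, off_ms in tacton.rhythm:
        n_on = int(sample_rate * on_ms / 1000)
        n_off = int(sample_rate * off_ms / 1000)
        samples += [tacton.amplitude *
                    math.sin(2 * math.pi * tacton.frequency_hz * i / sample_rate)
                    for i in range(n_on)]
        samples += [0.0] * n_off
    return samples

# An example Tacton: two short pulses then a long one, delivered at the wrist.
alert = Tacton(frequency_hz=250, amplitude=0.8,
               rhythm=[(100, 50), (100, 50), (200, 0)], location="wrist")
buf = render(alert)  # 500 ms of drive signal for one actuator
```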

    A first investigation into the effectiveness of Tactons

    This paper reports two experiments relating to the design of Tactons (or tactile icons). The first experiment investigated the perception of vibrotactile "roughness" (created using amplitude-modulated sinusoids), and the results indicated that roughness could be used as a parameter for constructing Tactons. The second experiment is the first full evaluation of Tactons, using three values of roughness identified in the first experiment along with three rhythms to create a set of Tactons. The results of this experiment showed that Tactons can be a successful means of communicating information in user interfaces, with an overall recognition rate of 71%, and recognition rates of 93% for rhythm and 80% for roughness.
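The "roughness" stimuli described above were amplitude-modulated sinusoids. A minimal sketch of such a waveform follows; the carrier and modulation values are illustrative, not the settings used in the experiments.

```python
import math

def am_sample(t, carrier_hz=250.0, mod_hz=30.0, depth=1.0):
    """One sample of an amplitude-modulated sinusoid at time t (seconds).
    depth=0 yields a smooth carrier; increasing depth increases the perceived
    roughness. All parameter values here are illustrative assumptions."""
    envelope = (1.0 + depth * math.sin(2 * math.pi * mod_hz * t)) / (1.0 + depth)
    return envelope * math.sin(2 * math.pi * carrier_hz * t)

# One second of the modulated waveform at an 8 kHz sample rate.
wave = [am_sample(i / 8000.0) for i in range(8000)]
```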

    Principles and Guidelines for Advancement of Touchscreen-Based Non-visual Access to 2D Spatial Information

    Graphical materials such as graphs and maps are often inaccessible to millions of blind and visually-impaired (BVI) people, which negatively impacts their educational prospects, ability to travel, and vocational opportunities. To address this longstanding issue, a three-phase research program was conducted that builds on and extends previous work establishing touchscreen-based haptic cuing as a viable alternative for conveying digital graphics to BVI users. Although promising, this approach poses unique challenges that can only be addressed by schematizing the underlying graphical information based on perceptual and spatio-cognitive characteristics pertinent to touchscreen-based haptic access. Towards this end, this dissertation empirically identified a set of design parameters and guidelines through a logical progression of seven experiments. Phase I investigated perceptual characteristics related to touchscreen-based graphical access using vibrotactile stimuli, with results establishing three core perceptual guidelines: (1) a minimum line width of 1mm should be maintained for accurate line-detection (Exp-1), (2) a minimum interline gap of 4mm should be used for accurate discrimination of parallel vibrotactile lines (Exp-2), and (3) a minimum angular separation of 4mm should be used for accurate discrimination of oriented vibrotactile lines (Exp-3). 
Building on these parameters, Phase II studied the core spatio-cognitive characteristics pertinent to touchscreen-based non-visual learning of graphical information, with results leading to the specification of three design guidelines: (1) a minimum width of 4mm should be used for supporting tasks that require tracing of vibrotactile lines and judging their orientation (Exp-4), (2) a minimum width of 4mm should be maintained for accurate line tracing and learning of complex spatial path patterns (Exp-5), and (3) vibrotactile feedback should be used as a guiding cue to support the most accurate line-tracing performance (Exp-6). Finally, Phase III demonstrated that schematizing line-based maps according to these design guidelines leads to the development of an accurate cognitive map. Results from Experiment-7 provide theoretical evidence that learning from vision and touch leads to the development of functionally equivalent amodal spatial representations in memory. Findings from all seven experiments contribute to new theories of haptic information processing that can guide the development of new touchscreen-based non-visual graphical access solutions.
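The numeric guidelines from Phases I and II can be collected into a simple conformance check for a candidate map rendering. Only the millimeter thresholds below come from the abstract; the data layout and function are hypothetical.

```python
# Minimum dimensions (mm) reported in the abstract; dict structure is a sketch.
GUIDELINES_MM = {
    "min_line_width": 1.0,      # accurate line detection (Exp-1)
    "min_interline_gap": 4.0,   # discrimination of parallel lines (Exp-2)
    "min_tracing_width": 4.0,   # line tracing and path learning (Exp-4/5)
}

def check_map(lines):
    """lines: list of dicts with 'width_mm' and optional 'gap_mm' to the
    nearest neighboring line. Returns violation messages (empty if conforming)."""
    issues = []
    for i, ln in enumerate(lines):
        if ln["width_mm"] < GUIDELINES_MM["min_tracing_width"]:
            issues.append(f"line {i}: width {ln['width_mm']}mm < 4mm tracing minimum")
        gap = ln.get("gap_mm")
        if gap is not None and gap < GUIDELINES_MM["min_interline_gap"]:
            issues.append(f"line {i}: gap {gap}mm < 4mm interline minimum")
    return issues
```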

    Multimodal perception of histological images for persons blind or visually impaired

    Currently there is no suitable substitute technology that enables blind or visually impaired (BVI) people to interpret, in real time, the visual scientific data commonly generated during lab experimentation, such as performing light microscopy, spectrometry, and observing chemical reactions. This reliance upon visual interpretation of scientific data impedes students and scientists who are BVI from advancing in careers in medicine, biology, chemistry, and other scientific fields. To address this challenge, a real-time multimodal image perception system was developed that transforms standard laboratory blood smear images into a form that persons who are BVI can perceive, employing a combination of auditory, haptic, and vibrotactile feedback. These sensory channels convey visual information through alternative perceptual routes, creating a palette of multimodal, sensorial information. A Bayesian network was developed to characterize images through two groups of features of interest, primary and peripheral, with causal links established between the two groups. A method was then conceived for optimal matching between primary features and sensory modalities. Experimental results confirmed that this real-time approach achieves higher accuracy in recognizing and analyzing objects within images than tactile images do.
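The matching of primary features to sensory modalities could be sketched, at its simplest, as an assignment problem. The feature names, modalities and suitability scores below are invented for illustration (the paper derives such relations from a Bayesian network), and a greedy assignment stands in for the paper's optimal matching method.

```python
# Invented suitability scores for (feature, modality) pairs; purely illustrative.
SUITABILITY = {
    ("cell_count", "auditory"): 0.9, ("cell_count", "haptic"): 0.4,
    ("cell_count", "vibrotactile"): 0.2,
    ("cell_shape", "auditory"): 0.3, ("cell_shape", "haptic"): 0.8,
    ("cell_shape", "vibrotactile"): 0.4,
    ("texture", "auditory"): 0.5, ("texture", "haptic"): 0.3,
    ("texture", "vibrotactile"): 0.7,
}

def match(features, modalities):
    """Greedy one-to-one assignment of features to modalities by suitability."""
    pairs = sorted(SUITABILITY.items(), key=lambda kv: -kv[1])
    used_f, used_m, assignment = set(), set(), {}
    for (f, m), score in pairs:
        if f in features and m in modalities and f not in used_f and m not in used_m:
            assignment[f] = m
            used_f.add(f)
            used_m.add(m)
    return assignment
```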

    Wearable assistive tactile communication interface based on integrated touch sensors and actuators

    This paper presents the design and fabrication of a wearable tactile communication interface with vibrotactile feedback for assistive communication. The interface is based on finger Braille, a simple and efficient tactile communication method used by deafblind people. It consists of a flexible piezoresistive sensor and a vibrotactile actuator integrated together and positioned at the index, middle and ring fingers of both hands to represent the six dots of Braille. The sensors were made from flexible piezoresistive material, whereas the actuator operates on an electromagnetic principle by means of a flexible coil and a tiny NdFeB permanent magnet. Both were integrated to realize a Bluetooth-enabled tactile communication glove that enables deafblind people to communicate using Braille codes. Evaluation of the tactile interface with 20 end-users (10 deafblind and 10 sighted and hearing persons) under standardized conditions demonstrated that users can feel and distinguish vibration at frequencies ranging from 10 Hz to 200 Hz, which is within the perceivable frequency range of the FA-II receptors. The results show that non-experts in Braille could send and receive words like “BEST” and “JOURNAL” within 25 s and 55 s, with accuracies of ~75% and 68%, respectively.
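Finger Braille maps the six Braille dots onto the index, middle and ring fingers of both hands, so a word can be rendered as a per-letter set of actuators to pulse. The Braille dot patterns below are standard; the dot-to-finger convention and the helper function are assumptions for illustration.

```python
# Standard Braille dot numbers (1-6) for a few letters.
BRAILLE = {"B": {1, 2}, "E": {1, 5}, "S": {2, 3, 4}, "T": {2, 3, 4, 5}}

# One common finger-Braille convention (an assumption here):
# dots 1-3 on the left index/middle/ring, dots 4-6 on the right.
FINGER = {1: "L-index", 2: "L-middle", 3: "L-ring",
          4: "R-index", 5: "R-middle", 6: "R-ring"}

def actuators_for(word):
    """Which glove actuators to pulse, letter by letter."""
    return [sorted(FINGER[d] for d in BRAILLE[ch]) for ch in word]

signals = actuators_for("BEST")
# signals[0] is the actuator set for 'B' (dots 1 and 2)
```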

    Sensory Communication

    Contains table of contents for Section 2, an introduction, and reports on fourteen research projects. Supported by:
    National Institutes of Health Grant RO1 DC00117
    National Institutes of Health Grant RO1 DC02032
    National Institutes of Health/National Institute on Deafness and Other Communication Disorders Grant R01 DC00126
    National Institutes of Health Grant R01 DC00270
    National Institutes of Health Contract N01 DC52107
    U.S. Navy - Office of Naval Research/Naval Air Warfare Center Contract N61339-95-K-0014
    U.S. Navy - Office of Naval Research/Naval Air Warfare Center Contract N61339-96-K-0003
    U.S. Navy - Office of Naval Research Grant N00014-96-1-0379
    U.S. Air Force - Office of Scientific Research Grant F49620-95-1-0176
    U.S. Air Force - Office of Scientific Research Grant F49620-96-1-0202
    U.S. Navy - Office of Naval Research Subcontract 40167
    U.S. Navy - Office of Naval Research/Naval Air Warfare Center Contract N61339-96-K-0002
    National Institutes of Health Grant R01-NS33778
    U.S. Navy - Office of Naval Research Grant N00014-92-J-184

    Relative vibrotactile spatial acuity of the torso

    While tactile acuity for pressure has been extensively investigated, far less is known about acuity for vibrotactile stimulation. Vibrotactile acuity is important, however, as such stimulation is used in many applications, including sensory substitution devices. We tested discrimination of vibrotactile stimulation from eccentric rotating mass motors with in-plane vibration. In three experiments, we tested gradually decreasing center-to-center (c/c) distances, from 30 mm (experiment 1) to 13 mm (experiment 3). Observers judged whether a second vibrating stimulator (‘tactor’) was to the left of, to the right of, or in the same place as a first one whose onset preceded the second's by 250 ms (with a 50-ms inter-stimulus interval). The results show that while accuracy tends to decrease as the tactors get closer, discrimination accuracy is still well above chance at the smallest distance, placing the threshold for vibrotactile stimulation well below 13 mm, lower than recent estimates. The results cast new light on vibrotactile sensitivity and can furthermore be of use in the design of devices that convey information through vibrotactile stimulation.
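The trial timing described above (second onset 250 ms after the first, with a 50-ms inter-stimulus interval) implies a 200 ms first stimulus. A sketch of one trial's schedule follows; the second stimulus duration and the event layout are assumptions.

```python
# Timing implied by the procedure: onset-to-onset 250 ms, 50 ms gap between
# first offset and second onset, hence a 200 ms first stimulus.
SOA_MS = 250                 # onset-to-onset asynchrony
ISI_MS = 50                  # inter-stimulus interval
DUR_MS = SOA_MS - ISI_MS     # implied first-stimulus duration (200 ms)

def trial(first_pos_mm, offset_mm):
    """One left/right/same trial: (events, correct_response).
    Each event is (name, onset_ms, duration_ms, position_mm); the second
    stimulus is assumed to share the first's duration."""
    events = [
        ("tactor_1_on", 0, DUR_MS, first_pos_mm),
        ("tactor_2_on", SOA_MS, DUR_MS, first_pos_mm + offset_mm),
    ]
    if offset_mm == 0:
        correct = "same"
    else:
        correct = "right" if offset_mm > 0 else "left"
    return events, correct
```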

    Head-mounted Sensory Augmentation System for Navigation in Low Visibility Environments

    Sensory augmentation can be used to assist in tasks where sensory information is limited or sparse. This thesis focuses on the design and investigation of a head-mounted vibrotactile sensory augmentation interface to assist navigation in low visibility environments, for example by firefighters or as a travel aid for visually impaired people. A novel head-mounted vibrotactile interface comprising a 1-by-7 vibrotactile display worn on the forehead is developed. A series of psychophysical studies is carried out with this display to (1) determine the vibrotactile absolute threshold, (2) investigate the accuracy of vibrotactile localization, and (3) evaluate the funneling illusion and apparent motion as sensory phenomena that could be used to communicate navigation signals. The results of these studies provide guidelines for the design of head-mounted interfaces. A second-generation head-mounted sensory augmentation interface, the Mark-II Tactile Helmet, is developed for firefighters’ navigation. It consists of a ring of ultrasound sensors mounted on the outside of a helmet, a microcontroller, two batteries and a refined vibrotactile display composed of seven vibration motors, based on the results of the aforementioned psychophysical studies. A ‘tactile language’, that is, a set of distinguishable vibrotactile patterns, is developed for communicating navigation commands through the Mark-II Tactile Helmet. Four possible combinations of two command presentation modes (continuous, discrete) and two command types (recurring, single) are evaluated for their effectiveness in guiding users along a virtual wall in a structured environment. Continuous and discrete presentation modes use spatiotemporal patterns that induce the experience of apparent movement and discrete movement on the forehead, respectively.
The recurring command type presents the tactile command repeatedly, with a 500 ms interval between patterns, while the single command type presents the tactile command just once, when the command changes. The effectiveness of this tactile language is evaluated using objective measures of the users’ walking speed and the smoothness of their trajectory parallel to the virtual wall, and subjective measures of utility and comfort employing Likert-type rating scales. The Recurring Continuous (RC) commands, which exploit the phenomenon of apparent motion, are most effective in generating efficient routes and fast travel, and are most preferred. Finally, the optimal tactile language (RC) is compared with audio guidance using verbal instructions to investigate effectiveness in delivering navigation commands. The results show that haptic guidance leads to better performance as well as lower cognitive workload compared to auditory feedback. This research demonstrates that a head-mounted sensory augmentation interface can enhance spatial awareness in low visibility environments and could help firefighters’ navigation by providing them with supplementary sensory information.
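A Recurring Continuous command of the kind described above can be sketched as a repeating spatiotemporal sweep across the seven tactors. Only the 500 ms repetition interval comes from the text; the step interval, tactor indexing and function shape are illustrative assumptions.

```python
def rc_command(direction, n_tactors=7, step_ms=60, repeat_gap_ms=500, repeats=2):
    """Return (time_ms, tactor_index) activations for a Recurring Continuous
    command: a sweep across the forehead display that induces apparent motion,
    repeated with a 500 ms gap between pattern repetitions."""
    order = range(n_tactors) if direction == "right" else range(n_tactors - 1, -1, -1)
    t = 0
    schedule = []
    for _ in range(repeats):
        for i in order:
            schedule.append((t, i))
            t += step_ms
        t += repeat_gap_ms
    return schedule

sweep = rc_command("right", repeats=1)
# seven activations, 60 ms apart, sweeping tactor 0 -> 6 across the forehead
```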

    Augmenting the Spatial Perception Capabilities of Users Who Are Blind

    People who are blind face a series of challenges and limitations resulting from their inability to see, forcing them either to seek the assistance of a sighted individual or to work around the challenge by way of an inefficient adaptation (e.g. following the walls of a room in order to reach a door rather than walking in a straight line to it). These challenges are directly related to blind users' lack of the spatial perception capabilities normally provided by the human vision system. To overcome these spatial perception related challenges, modern technologies can be used to convey spatial perception data through sensory substitution interfaces. This work is the culmination of several projects that address varying spatial perception problems for blind users. First, we consider the development of non-visual natural user interfaces for interacting with large displays. This work explores the haptic interaction space in order to find useful and efficient haptic encodings for the spatial layout of items on large displays. Multiple interaction techniques are presented which build on prior research (Folmer et al. 2012), and the efficiency and usability of the most efficient of these encodings is evaluated with blind children. Next, we evaluate the use of wearable technology in aiding the navigation of blind individuals through large open spaces that lack the tactile landmarks used during traditional white cane navigation. We explore the design of a computer vision application with an unobtrusive aural interface to minimize veering of the user while crossing a large open space. Together, these projects represent an exploration into the use of modern technology in augmenting the spatial perception capabilities of blind users.