Personalising Vibrotactile Displays through Perceptual Sensitivity Adjustment
Haptic displays are commonly limited to transmitting a discrete set of tactile motifs. In this paper, we explore the transmission of real-valued information through vibrotactile displays. We simulate spatial continuity with three perceptual models commonly used to create phantom sensations: the linear, logarithmic and power models. We show that these generic models lead to limited decoding precision, and propose a personalisation method that adjusts the models to idiosyncratic and spatial variations in perceptual sensitivity. We evaluate this approach using two haptic display layouts: circular, worn around the wrist and the upper arm, and straight, worn along the forearm. Results of a user study measuring continuous value decoding precision show that users were able to decode continuous values with relatively high accuracy (4.4% mean error), that circular layouts performed particularly well, and that personalisation through sensitivity adjustment increased decoding precision
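The generic pan laws mentioned above can be sketched for a single pair of adjacent vibrotactile actuators. This is a minimal illustration of the common linear and power (energy) formulations, assuming a normalised phantom position `b` in [0, 1] and overall intensity `A`; it is not the paper's code, and the logarithmic variant (a similar compressive mapping) is omitted here.

```python
import math

def pan_linear(b: float, A: float = 1.0) -> tuple[float, float]:
    """Linear phantom-sensation model: the two actuator amplitudes sum to A.
    b = 0 drives only actuator 1, b = 1 only actuator 2."""
    return (1.0 - b) * A, b * A

def pan_power(b: float, A: float = 1.0) -> tuple[float, float]:
    """Power (energy) model: the squared amplitudes sum to A**2, keeping
    perceived vibration energy roughly constant along the pan."""
    return math.sqrt(1.0 - b) * A, math.sqrt(b) * A
```

Decoding then runs the other way: from the amplitude ratio felt at the two sites, the user (or a model) infers the continuous value `b`.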
Touch-Screen Technology for the Dynamic Display of 2D Spatial Information Without Vision: Promise and Progress
Many developers wish to capitalize on touch-screen technology for developing aids for the blind, particularly by incorporating vibrotactile stimulation to convey patterns on their surfaces, which otherwise are featureless. Our belief is that they will need to take into account basic research on haptic perception in designing these graphics interfaces. We point out constraints and limitations in haptic processing that affect the use of these devices. We also suggest ways to use sound to augment basic information from touch, and we include evaluation data from users of a touch-screen device with vibrotactile and auditory feedback that we have been developing, called a vibro-audio interface
A new dynamic tactile display for reconfigurable braille: implementation and tests
Different tactile interfaces have been proposed to represent either text (braille) or, in a few cases, tactile large-area screens as replacements for visual displays. None of the implementations so far can be customized to match users' preferences, perceptual differences and skills. Optimal choices in these respects are still debated; we approach a solution by designing a flexible device that allows the user to choose key parameters of tactile transduction. We present here a new dynamic tactile display, an 8 × 8 matrix of plastic pins based on well-established and reliable piezoelectric technology, offering high resolution (pin gap 0.7 mm), tunable strength of pin displacement, and a refresh rate of up to 50 s⁻¹. It can reproduce arbitrary patterns, allowing it to serve a dual purpose depending on contingent user needs: tactile rendering of non-character information, and reconfigurable braille rendering. Given the relevance of the latter functionality for the expected average user, we considered testing braille encoding by volunteers a benchmark of primary importance. Tests were performed to assess acceptance and usability with minimal training, and to check whether the offered flexibility was indeed perceived by the subjects as an added value compared to conventional braille devices. Different mappings between braille dots and actual tactile pins were implemented to match user needs. Performance of eight experienced braille readers was defined as the fraction of correct identifications of rendered content. Different information contents were tested (median performance on random strings, words and sentences was about 75%, 85% and 98%, respectively, a significant increase, p < 0.01), with statistically significant improvements in performance during the tests (p < 0.05). Experimental results, together with qualitative ratings provided by the subjects, show good acceptance and the effectiveness of the proposed solution
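A configurable dot-to-pin mapping of the kind described above can be sketched as follows. The function names, grid placement, and the single `gap` spacing parameter are illustrative assumptions, not the device's actual firmware; the point is that a standard 6-dot braille cell (a 2-column by 3-row grid, dots 1-3 left, 4-6 right) can be laid out on the 8 × 8 pin matrix with user-chosen spacing.

```python
GRID = 8  # pins per side of the display

def render_cell(dots: set[int], row0: int = 0, col0: int = 0,
                gap: int = 1) -> list[list[bool]]:
    """Return an 8x8 boolean frame with one braille cell raised.

    dots: subset of {1..6}; e.g. {1} is the letter 'a'.
    gap:  pin spacing between adjacent dots (1 = adjacent pins),
          one way to expose the flexibility the display offers.
    """
    frame = [[False] * GRID for _ in range(GRID)]
    for d in dots:
        col = col0 + ((d - 1) // 3) * gap   # dots 1-3 left column, 4-6 right
        row = row0 + ((d - 1) % 3) * gap    # top-to-bottom within a column
        if 0 <= row < GRID and 0 <= col < GRID:
            frame[row][col] = True
    return frame
```

For example, `render_cell({1, 2, 4}, gap=2)` renders the same cell as `gap=1` but spread over a wider area, for readers who prefer coarser spacing.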
Detecting Deceptive Dark-Pattern Web Advertisements for Blind Screen-Reader Users
Advertisements have become commonplace on modern websites. While ads are typically designed for visual consumption, it is unclear how they affect blind users who interact with the ads using a screen reader. Existing research studies on non-visual web interaction predominantly focus on general web browsing; the specific impact of extraneous ad content on blind users' experience remains largely unexplored. To fill this gap, we conducted an interview study with 18 blind participants; we found that blind users are often deceived by ads that contextually blend in with the surrounding web page content. While ad blockers can address this problem via a blanket filtering operation, many websites are increasingly denying access if an ad blocker is active. Moreover, ad blockers often do not filter out internal ads injected by the websites themselves. Therefore, we devised an algorithm to automatically identify contextually deceptive ads on a web page. Specifically, we built a detection model that leverages a multi-modal combination of handcrafted and automatically extracted features to determine if a particular ad is contextually deceptive. Evaluations of the model on a representative test dataset and 'in-the-wild' random websites yielded F1 scores of 0.86 and 0.88, respectively
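The F1 scores quoted above are the harmonic mean of precision and recall, which penalises a classifier that trades one for the other. A minimal sketch of the computation, with illustrative counts (not the paper's data):

```python
def f1_score(tp: int, fp: int, fn: int) -> float:
    """F1 from raw counts: harmonic mean of precision and recall.
    tp = deceptive ads correctly flagged, fp = benign content flagged,
    fn = deceptive ads missed."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Illustrative only: 86 true positives with 14 false positives and
# 14 false negatives gives precision = recall = F1 = 0.86.
```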
The Effect of Programmable Tactile Displays on Spatial Learning Skills in Children and Adolescents of Different Visual Disability
Vision loss has severe impacts on physical, social and emotional well-being. The education of blind children poses issues, as many school subjects (e.g., geometry, mathematics) are normally taught by relying heavily on vision. Touch-based assistive technologies are potential tools to provide graphical contents to blind users, improving learning possibilities and social inclusion. Raised-line drawings are still the gold standard, but stimuli cannot be reconfigured or adapted, and the blind person constantly requires assistance. Although much research concerns technological development, little work has addressed the assessment of programmable tactile graphics in educational and rehabilitative contexts. Here we designed, on programmable tactile displays, tests aimed at assessing spatial memory skills and shape recognition abilities. Tests involved a group of blind and a group of low-vision children and adolescents in a four-week longitudinal schedule. After establishing subject-specific difficulty levels, we observed a significant enhancement of performance across sessions for both groups. Learning effects were comparable to raised-paper control tests; however, our setup required minimal external assistance. Overall, our results demonstrate that programmable maps are an effective way to display graphical contents in educational/rehabilitative contexts. They can be at least as effective as traditional paper tests while providing superior flexibility and versatility
Recommended from our members
A Haptic Surface Robot Interface for Large-Format Touchscreen Displays
This thesis presents the design for a novel haptic interface for large-format touchscreens. Techniques such as electrovibration, ultrasonic vibration, and external braked devices have been developed by other researchers to deliver haptic feedback to touchscreen users. However, these methods do not address the need for spatial constraints that only restrict user motion in the direction of the constraint. This technology gap contributes to the lack of haptic technology available for touchscreen-based upper-limb rehabilitation, despite the prevalent use of haptics in other forms of robotic rehabilitation. The goal of this thesis is to display kinesthetic haptic constraints to the touchscreen user in the form of boundaries and paths, which assist or challenge the user in interacting with the touchscreen. The presented prototype accomplishes this by steering a single wheel in contact with the display while remaining driven by the user. It employs a novel embedded force sensor, which it uses to measure the interaction force between the user and the touchscreen. The haptic response of the device is controlled using this force data to characterize user intent. The prototype can operate in a simulated free mode as well as simulate rigid and compliant obstacles and path constraints. A data architecture has been created to allow the prototype to be used as a peripheral add-on device which reacts to haptic environments created and modified on the touchscreen. The long-term goal of this work is to create a haptic system that enables a touchscreen-based rehabilitation platform for people with upper limb impairments
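Rigid and compliant obstacles of the kind the prototype simulates are commonly rendered in haptics with a penalty method: a virtual spring pushes back only when the user penetrates the constraint. The sketch below is a generic one-degree-of-freedom illustration of that standard technique, not the thesis's actual controller, which also uses the embedded force sensor to infer user intent.

```python
def wall_force(x: float, wall_x: float, k: float) -> float:
    """Penalty-method virtual wall in one dimension.

    x:       user's contact-point position
    wall_x:  wall location; the free region is x <= wall_x
    k:       simulated stiffness; a compliant obstacle uses a small k,
             a 'rigid' one a large k (bounded by the device's stability limit)
    Returns the restoring force, nonzero only during penetration.
    """
    penetration = x - wall_x
    return -k * penetration if penetration > 0 else 0.0
```

Path constraints work the same way, with the penetration measured perpendicular to the desired path rather than against a single wall.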
Making Spatial Information Accessible on Touchscreens for Users who are Blind and Visually Impaired
Touchscreens have become the de facto standard of input for mobile devices, as they make optimal use of the limited input and output space imposed by their form factor. In recent years, people who are blind and visually impaired have been increasing their usage of smartphones and touchscreens. Although basic access is available, many accessibility issues remain to be addressed in order to bring full inclusion to this population. One of the important challenges lies in accessing and creating spatial information on touchscreens. The work presented here provides three new techniques, using three different modalities, for accessing spatial information on touchscreens. The first system makes geometry and diagram creation accessible on a touchscreen through the use of text-to-speech and gestural input. This first study is informed by a qualitative study of how people who are blind and visually impaired currently access and create graphs and diagrams. The second system makes directions through maps accessible using multiple vibration motors without any sound or visual output. The third system investigates the use of binaural sound on a touchscreen to make accessible various types of applications, such as physics simulations, astronomy, and video games
Designing Haptic Clues for Touchscreen Kiosks
Most interactive touchscreen kiosks are a challenge to accessibility: if graphics and sound fail to communicate, the interaction process halts. In such a case, turning to the only remaining environmentally suited sense, touch, is an intuitive option.
To reinforce interaction with touchscreen kiosks, haptic (touchable) feedback can be added to the features of the device. The range of touchscreen-suited haptic technologies already enables some touch feedback from touchscreen surfaces, and significant leaps forward are still being made at a steady rate. Given this development, it is relevant to review the human-centred factors affecting the design of haptic touchscreens in public kiosks.
This thesis offers an overview of designing haptic clues for touchscreen kiosks. It emphasizes context sensitivity and the meaningfulness and communicability of different haptic design variants. As its main contribution, this thesis collects the important considerations for the conscious design of haptic features in interactive kiosks and offers points of multimodal design consideration for designers intending to enrich their touchscreen interaction with haptic features
A Haptic System for Depicting Mathematical Graphics for Students with Visual Impairments
When teaching students with visual impairments, educators generally rely on tactile tools to depict visual mathematical topics. Tactile media, such as embossed paper and simple manipulable materials, are typically used to convey graphical information. Although these tools are easy to use and relatively inexpensive, they are solely tactile and not modifiable. Dynamic and interactive technologies such as pin matrices and haptic pens are also commercially available, but tend to be more expensive and less intuitive. This study aims to bridge the gap between easy-to-use tactile tools and dynamic, interactive technologies in order to facilitate the haptic learning of mathematical concepts. We developed a haptic assistive device using a Tanvas electrostatic touchscreen that provides the user with multimodal (haptic, auditory, and visual) output. Three methodological steps comprise this research: 1) a systematic literature review of the state of the art in the design and testing of tactile and haptic assistive devices, 2) a user-centered system design, and 3) testing of the system's effectiveness via a usability study. The electrostatic touchscreen exhibits promise as an assistive device for displaying visual mathematical elements via the haptic modality