183 research outputs found

    Developing an interactive overview for non-visual exploration of tabular numerical information

    This thesis investigates the problem of obtaining overview information from complex tabular numerical data sets non-visually. Blind and visually impaired people need to access and analyse numerical data, both in education and in professional occupations. Obtaining an overview is a necessary first step in data analysis, for which current non-visual data accessibility methods offer little support. This thesis describes a new interactive parametric sonification technique called High-Density Sonification (HDS), which facilitates the process of extracting overview information from the data easily and efficiently by rendering multiple data points as single auditory events. Beyond obtaining an overview of the data, experimental studies showed that the capabilities of human auditory perception and cognition to extract meaning from HDS representations could be used to reliably estimate relative arithmetic mean values within large tabular data sets. Following a user-centred design methodology, HDS was implemented as the primary form of overview information display in a multimodal interface called TableVis. This interface supports the active process of interactive data exploration non-visually, making use of proprioception to maintain contextual information during exploration (non-visual focus+context), vibrotactile data annotations (EMA-Tactons) that can be used as external memory aids to prevent high mental workload levels, and speech synthesis to access detailed information on demand. A series of empirical studies was conducted to quantify the performance attained in the exploration of tabular data sets for overview information using TableVis. 
This was done by comparing HDS with the main current non-visual accessibility technique (speech synthesis), and by quantifying the effect of different sizes of data sets on user performance, which showed that HDS resulted in better performance than speech, and that this performance was not heavily dependent on the size of the data set. In addition, levels of subjective workload during exploration tasks using TableVis were investigated, resulting in the proposal of EMA-Tactons, vibrotactile annotations that the user can add to the data in order to prevent working memory saturation in the most demanding data exploration scenarios. An experimental evaluation found that EMA-Tactons significantly reduced mental workload in data exploration tasks. Thus, the work described in this thesis provides a basis for the interactive non-visual exploration of a broad range of sizes of numerical data tables by offering techniques to extract overview information quickly, performing perceptual estimations of data descriptors (relative arithmetic mean) and managing demands on mental workload through vibrotactile data annotations, while seamlessly linking with explorations at different levels of detail and preserving spatial data representation metaphors to support collaboration with sighted users
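The core HDS idea described above, collapsing many data cells into a single auditory event whose pitch tracks an aggregate such as the arithmetic mean, can be illustrated as a simple parametric mapping. The following sketch is illustrative only; the function and parameter names are not taken from TableVis itself:

```python
import math

def hds_pitch(values, f_min=120.0, f_max=5000.0, lo=0.0, hi=100.0):
    """Map the mean of a row/column of data to a single frequency.

    Many cells are rendered as one auditory event: the aggregate
    (here the arithmetic mean) is mapped logarithmically onto a
    pitch range, so a higher-mean row sounds as a higher tone.
    Assumed ranges (f_min/f_max, lo/hi) are illustrative.
    """
    mean = sum(values) / len(values)
    # Normalise the aggregate into [0, 1] against the table's value range.
    t = (mean - lo) / (hi - lo)
    t = min(max(t, 0.0), 1.0)
    # Logarithmic interpolation matches pitch perception better than linear.
    return f_min * (f_max / f_min) ** t

# One tone per table row: comparing two tones supports a fast
# relative-mean judgement without reading any individual cell.
row_a = [10, 20, 30]   # mean 20
row_b = [60, 70, 80]   # mean 70
tone_a = hds_pitch(row_a)
tone_b = hds_pitch(row_b)
```

Playing one such tone per row (or per column) is what lets a listener scan a large table in seconds, which is the overview step the thesis argues is missing from speech-only access.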

    Enabling audio-haptics

    This thesis deals with possible solutions to facilitate orientation, navigation and overview of non-visual interfaces and virtual environments with the help of sound in combination with force-feedback haptics. Applications with haptic force-feedback, s…

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 12th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2020, held in Leiden, The Netherlands, in September 2020. The 60 papers presented in this volume were carefully reviewed and selected from 111 submissions. They were organized in topical sections on haptic science, haptic technology, and haptic applications. This year's focus is on accessibility

    Multimodal interaction: developing an interaction concept for a touchscreen incorporating tactile feedback

    The touchscreen, as an alternative user interface for applications that normally require mice and keyboards, has become more and more commonplace, showing up on mobile devices, on vending machines, on ATMs and in the control panels of machines in industry, where conventional input devices cannot provide intuitive, rapid and accurate user interaction with the content of the display. The exponential growth in processing power on the PC, together with advances in understanding human communication channels, has had a significant effect on the design of usable, human-factored interfaces on touchscreens, and on the number and complexity of applications available on touchscreens. Although computer-driven touchscreen interfaces provide programmable and dynamic displays, the absence of the expected tactile cues on the hard and static surfaces of conventional touchscreens is challenging interface design and touchscreen usability, in particular for distracting, low-visibility environments. Current technology allows the human tactile modality to be used in touchscreens. While the visual channel converts graphics and text unidirectionally from the computer to the end user, tactile communication features a bidirectional information flow to and from the user as the user perceives and acts on the environment and the system responds to changing contextual information. Tactile sensations such as detents and pulses provide users with cues that make selecting and controlling a more intuitive process. Tactile features can compensate for deficiencies in some of the human senses, especially in tasks which carry a heavy visual or auditory burden. In this study, an interaction concept for tactile touchscreens is developed with a view to employing the key characteristics of the human sense of touch effectively and efficiently, especially in distracting environments where vision is impaired and hearing is overloaded. 
As a first step toward improving the usability of touchscreens through the integration of tactile effects, different mechanical solutions for producing motion in tactile touchscreens are investigated, to provide a basis for selecting suitable vibration directions when designing tactile displays. Building on these results, design know-how regarding tactile feedback patterns is further developed to enable dynamic simulation of UI controls, in order to give users a sense of perceiving real controls on a highly natural touch interface. To study the value of adding tactile properties to touchscreens, haptically enhanced UI controls are then further investigated with the aim of mapping haptic signals to different usage scenarios to perform primary and secondary tasks with touchscreens. The findings of the study are intended for consideration and discussion as a guide to further development of tactile stimuli, haptically enhanced user interfaces and touchscreen applications

    Proceedings of the 6th international conference on disability, virtual reality and associated technologies (ICDVRAT 2006)

    The proceedings of the conference

    Engineering data compendium. Human perception and performance. User's guide

    The concept underlying the Engineering Data Compendium was the product of a research and development program (Integrated Perceptual Information for Designers project) aimed at facilitating the application of basic research findings in human performance to the design of military crew systems. The principal objective was to develop a workable strategy for: (1) identifying and distilling information of potential value to system design from the existing research literature, and (2) presenting this technical information in a way that would aid its accessibility, interpretability, and applicability for systems designers. The present four volumes of the Engineering Data Compendium represent the first implementation of this strategy. This is the first volume, the User's Guide, containing a description of the program and instructions for its use

    Tactile Arrays for Virtual Textures

    This thesis describes the development of three new tactile stimulators for active touch, i.e. devices to deliver virtual touch stimuli to the fingertip in response to exploratory movements by the user. All three stimulators are designed to provide spatiotemporal patterns of mechanical input to the skin via an array of contactors, each under individual computer control. Drive mechanisms are based on piezoelectric bimorphs in a cantilever geometry. The first of these is a 25-contactor array (5 × 5 contactors at 2 mm spacing). It is a rugged design with a compact drive system and is capable of producing strong stimuli when running from low voltage supplies. Combined with a PC mouse, it can be used for active exploration tasks. Pilot studies were performed which demonstrated that subjects could successfully use the device for discrimination of line orientation, simple shape identification and line following tasks. A 24-contactor stimulator (6 × 4 contactors at 2 mm spacing) with improved bandwidth was then developed. This features control electronics designed to transmit arbitrary waveforms to each channel (generated on-the-fly, in real time) and software for rapid development of experiments. It is built around a graphics tablet, giving high precision position capability over a large 2D workspace. Experiments using two-component stimuli (components at 40 Hz and 320 Hz) indicate that spectral balance within active stimuli is discriminable independent of overall intensity, and that the spatial variation (texture) within the target is easier to detect at 320 Hz than at 40 Hz. The third system developed (again 6 × 4 contactors at 2 mm spacing) was a lightweight modular stimulator developed for fingertip and thumb grasping tasks; furthermore it was integrated with force-feedback on each digit and a complex graphical display, forming a multi-modal Virtual Reality device for the display of virtual textiles. 
It is capable of broadband stimulation with real-time generated outputs derived from a physical model of the fabric surface. In an evaluation study, virtual textiles generated from physical measurements of real textiles were ranked in categories reflecting key mechanical and textural properties. The results were compared with a similar study performed on the real fabrics from which the virtual textiles had been derived. There was good agreement between the ratings of the virtual textiles and the real textiles, indicating that the virtual textiles are a good representation of the real textiles and that the system is delivering appropriate cues to the user
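The two-component stimuli mentioned above amount to summing a 40 Hz and a 320 Hz sinusoid per contactor, with their relative weights setting the spectral balance. A minimal synthesis sketch follows; the sample rate, duration, and the simple sum-to-one weighting are assumptions for illustration, not the thesis's actual drive parameters:

```python
import math

def two_component_stimulus(balance, duration=0.05, sample_rate=8000,
                           f_low=40.0, f_high=320.0):
    """Synthesise one contactor's drive waveform as a list of samples.

    `balance` in [0, 1] sets the spectral balance between the low
    (40 Hz) and high (320 Hz) components. Weights sum to one, a crude
    way of holding overall drive level roughly constant so that only
    the balance varies between stimuli.
    """
    n = int(duration * sample_rate)
    a_low, a_high = 1.0 - balance, balance
    return [a_low * math.sin(2 * math.pi * f_low * t / sample_rate)
            + a_high * math.sin(2 * math.pi * f_high * t / sample_rate)
            for t in range(n)]

# A balanced stimulus: equal weight on both frequency components.
wave = two_component_stimulus(0.5)
```

In a real stimulator each of the 24 channels would receive its own such waveform, generated on the fly as the finger moves across the virtual surface.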

    Voice and Touch Diagrams (VATagrams) Diagrams for the Visually Impaired

    If a picture is worth a thousand words, would you rather read the two pages of text or simply view the image? Most would choose to view the image; however, for the visually impaired this isn’t always an option. Diagrams assist people in visualizing relationships between objects. Most often these diagrams act as a source for quickly referencing information about relationships. Diagrams are highly visual and as such, there are few tools to support diagram creation for visually impaired individuals. To allow the visually impaired to share the same advantages in school and work as sighted colleagues, an accessible diagram tool is needed. A suitable tool for the visually impaired to create diagrams should allow these individuals to: 1. easily define the type of relationship based diagram to be created, 2. easily create the components of a relationship based diagram, 3. easily modify the components of a relationship based diagram, 4. quickly understand the structure of a relationship based diagram, 5. create a visual representation which can be used by the sighted, and 6. easily access reference points for tracking diagram components. To do this, a series of prototypes of a tool was developed that allows visually impaired users to read, create, modify and share relationship based diagrams using sound and gestural touches. This was accomplished by creating a series of applications that could be run on an iPad using an overlay that restricts the areas in which a user can perform gestures. These prototypes were tested for usability using measures of efficiency, effectiveness and satisfaction. The prototypes were tested with visually impaired, blindfolded and sighted participants. The results of the evaluation indicate that the prototypes contain the main building blocks that can be used to complete a fully functioning application to be used on an iPad