Tac-tiles: multimodal pie charts for visually impaired users
Tac-tiles is an accessible interface that allows visually impaired users to browse graphical information using tactile and audio feedback. The system uses a graphics tablet augmented with a tangible overlay tile to guide user exploration. Dynamic feedback is provided by a tactile pin-array at the fingertips and through speech and non-speech audio cues. In designing the system, we seek to preserve the affordances and metaphors of traditional, low-tech teaching media for the blind, and to combine these with the benefits of a digital representation. Traditional tangible media allow rapid, non-sequential access to data, promote easy and unambiguous access to resources such as axes and gridlines, allow the use of external memory, and preserve visual conventions, thus promoting collaboration with sighted colleagues. A prototype system was evaluated with visually impaired users, and recommendations for multimodal design were derived.
Constructing sonified haptic line graphs for the blind student: first steps
Line graphs are an established information visualisation and analysis technique, taught at varying levels of difficulty according to standard Mathematics curricula. It has been argued that blind individuals cannot use line graphs as a visualisation and analysis tool because they currently exist primarily in the visual medium. The research described in this paper aims to make line graphs accessible to blind students through auditory and haptic media. We describe (1) our design space for representing line graphs, (2) the technology we use to develop our prototypes, and (3) the insights from our preliminary work.
SoundBar: exploiting multiple views in multimodal graph browsing
In this paper we discuss why access to mathematical graphs is problematic for visually impaired people. Through a review of graph understanding theory and interviews with visually impaired users, we explain why current non-visual representations are unlikely to provide effective access to graphs. We propose the use of multiple views of the graph, each providing quick access to specific information, as a way to improve graph usability. We then introduce SoundBar, a multiple-view system that improves access to bar graphs by providing an additional quick audio overview of the graph. An evaluation of SoundBar revealed that the additional views significantly increased accuracy and reduced the time taken in a question-answering task.
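The "quick audio overview" idea above can be sketched as a simple mapping from bar values to pitches played left to right. This is an illustrative sketch, not the authors' implementation; the function names and the MIDI note range are assumptions.

```python
def value_to_midi(value, v_min, v_max, note_lo=48, note_hi=84):
    """Linearly map a bar value onto a MIDI note range (here C3..C6).

    Higher bars map to higher pitches; a hypothetical mapping, not
    SoundBar's actual scheme.
    """
    if v_max == v_min:
        # Degenerate graph: all bars equal, use the middle of the range.
        return (note_lo + note_hi) // 2
    frac = (value - v_min) / (v_max - v_min)
    return round(note_lo + frac * (note_hi - note_lo))

def overview(bars):
    """Return one MIDI note per bar, played in sequence as an overview."""
    lo, hi = min(bars), max(bars)
    return [value_to_midi(v, lo, hi) for v in bars]
```

Played at a fixed tempo, such a sequence lets a listener judge the overall shape of the graph in a second or two before drilling into individual bars.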
Using Augmented Reality as a Medium to Assist Teaching in Higher Education
In this paper we describe the use of a high-level augmented reality (AR) interface for the construction of collaborative educational applications that can be used in practice to enhance current teaching methods. A combination of multimedia information, including spatial three-dimensional models, images, textual information, video, animations and sound, can be superimposed in a student-friendly manner into the learning environment. In several case studies, different learning scenarios have been carefully designed based on human-computer interaction principles so that meaningful virtual information is presented in an interactive and compelling way. Collaboration between the participants is achieved through the use of a tangible AR interface that uses marker cards, as well as an immersive AR environment based on software user interfaces (UIs) and hardware devices. The interactive AR interface has been piloted in the classroom at two UK universities, in departments of Informatics and Information Science.
Tactons: structured tactile messages for non-visual information display
Tactile displays are now becoming available in a form that can be easily used in a user interface. This paper describes a new form of tactile output. Tactons, or tactile icons, are structured, abstract messages that can be used to communicate information non-visually. A range of different parameters can be used for Tacton construction, including the frequency, amplitude and duration of a tactile pulse, plus other parameters such as rhythm and location. Tactons have the potential to improve interaction in a range of different areas, particularly where the visual display is overloaded, limited in size or not available, such as interfaces for blind people or in mobile and wearable devices. This paper describes Tactons, the parameters used to construct them and some possible ways to design them. Examples of where Tactons might prove useful in user interfaces are given.
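The construction parameters listed above (frequency, amplitude, duration, rhythm, location) lend themselves to a simple data model. The sketch below is a hypothetical encoding under those parameter names, not the paper's implementation.

```python
from dataclasses import dataclass

@dataclass
class TactilePulse:
    frequency_hz: float   # vibration frequency of this pulse
    amplitude: float      # relative intensity, 0.0..1.0
    duration_ms: int      # length of the pulse

@dataclass
class Tacton:
    """A structured tactile message: an ordered rhythm of pulses
    (gaps modelled as zero-amplitude pulses) at a body/device location.
    An illustrative sketch of the parameter space, not the authors' code."""
    rhythm: list          # sequence of TactilePulse
    location: str         # e.g. "left wrist", "phone back"

    def total_duration_ms(self):
        return sum(p.duration_ms for p in self.rhythm)
```

Because each parameter varies independently, distinct message families can share a rhythm but differ in frequency or location, which is what makes Tactons compact and learnable.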
Using Sound to Represent Uncertainty in Spatial Data
There is a limit to the amount of spatial data that can be shown visually in an effective manner, particularly when the data sets are extensive or complex. Using sound to represent some of these data (sonification) is a way of avoiding visual overload. This thesis creates a conceptual model showing how sonification can be used to represent spatial data and evaluates a number of elements within the conceptual model. These are examined in three different case studies to assess the effectiveness of the sonifications.
Current methods of using sonification to represent spatial data have been restricted by the technology available and have had very limited user testing. While existing research shows that sonification can be done, it does not show whether it is an effective and useful method of representing spatial data to the end user. A number of prototypes show how spatial data can be sonified, but only a small handful of these have performed any user testing beyond the authors' immediate colleagues (where n > 4). This thesis creates and evaluates sonification prototypes, which represent uncertainty using three different case studies of spatial data. Each case study is evaluated by a significant user group (between 45 and 71 individuals) who completed a task-based evaluation with the sonification tool, as well as reporting qualitatively their views on the effectiveness and usefulness of the sonification method.
For all three case studies, using sound to reinforce information shown visually enabled the majority of participants to perform more effectively than with traditional visual methods. Participants who were familiar with the dataset were much more effective at using the sonification than those who were not, and an interactive sonification requiring significant involvement from the user was much more effective than a static sonification that provided no significant user engagement. Using sounds with a clear and easily understood scale (such as piano notes) was important for achieving an effective sonification. These findings are used to improve the conceptual model developed earlier in this thesis and to highlight areas for future research.
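The finding about clear, easily understood scales can be illustrated by quantising an uncertainty value onto piano notes. The note choice (two octaves of C major, higher pitch for higher uncertainty) is an assumption for illustration, not the mapping used in the thesis.

```python
# MIDI numbers for two octaves of C major, C4 (60) up to C6 (84).
C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72, 74, 76, 77, 79, 81, 83, 84]

def uncertainty_to_note(u):
    """Quantise an uncertainty value in [0, 1] to a piano note on a
    familiar scale; out-of-range inputs are clamped.

    A hypothetical mapping in the spirit described, not the thesis' own.
    """
    u = min(max(u, 0.0), 1.0)
    idx = round(u * (len(C_MAJOR) - 1))
    return C_MAJOR[idx]
```

Restricting the output to a familiar diatonic scale, rather than a continuous pitch sweep, gives listeners discrete, nameable steps to compare, which is the property the evaluation found important.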
Integrating Haptic Feedback into Mobile Location Based Services
Haptics is a feedback technology that takes advantage of the human sense of touch by applying forces, vibrations, and/or motions to a haptic-enabled device such as a mobile phone. Historically, human-computer interaction has been visual: text and images on the screen. Haptic feedback can be an important additional channel, especially in Mobile Location Based Services such as knowledge discovery, pedestrian navigation and notification systems. A knowledge discovery system called the Haptic GeoWand is a low-interaction system that allows users to query geo-tagged data around them by using a point-and-scan technique with their mobile device. Haptic Pedestrian is a navigation system for walkers; four prototypes have been developed, classified according to the user's guidance requirements, the user type (based on spatial skills), and overall system complexity. Haptic Transit is a notification system that provides spatial information to the users of public transport. In all these systems, haptic feedback is used to convey information about location, orientation, density and distance, using the vibration alarm with varying frequencies and patterns to help users understand the physical environment. Trials elicited positive responses from users, who see benefit in a "heads-up" approach to mobile navigation. Results from a memory recall test show that users of haptic feedback for navigation had better recall of the region traversed than users of landmark images. Haptics integrated into a multimodal navigation system provides more usable, less distracting and more effective interaction than conventional systems. Enhancements to the current work could include the integration of contextual information, detailed large-scale user trials and the exploration of haptics within confined indoor spaces.
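Conveying distance through "varying frequencies and patterns" of the vibration alarm, as described above, can be sketched as a banded lookup. The bands, timings and function name below are illustrative assumptions, not the encoding used in these systems.

```python
def distance_to_pattern(distance_m):
    """Map a distance to a vibration pattern (on_ms, off_ms, repeats).

    Closer targets pulse faster and more often, so cadence alone tells
    the user roughly how far away the target is. A hypothetical encoding
    in the spirit of the Haptic Pedestrian / Haptic Transit prototypes.
    """
    if distance_m < 10:
        return (100, 100, 5)    # rapid pulses: target is very close
    if distance_m < 50:
        return (150, 300, 3)    # moderate cadence
    if distance_m < 200:
        return (200, 600, 2)    # slow pulses
    return (300, 1200, 1)       # single long buzz: target is far away
```

A banded encoding like this suits phone vibration motors, which render a handful of distinguishable cadences far more reliably than fine-grained intensity levels.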