Non-visual information display using tactons
This paper describes a novel form of display using tactile output. Tactons, or tactile icons, are structured tactile messages that can be used to communicate messages to users non-visually. A range of parameters can be used to construct Tactons, for example the frequency, amplitude, waveform and duration of a tactile pulse, plus body location. Tactons have the potential to improve interaction in a range of areas, particularly where the visual display is overloaded, limited in size or not available, such as interfaces for blind people or on mobile and wearable devices.
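As a concrete illustration of how these parameters might combine, here is a minimal Python sketch of a Tacton as a data structure. All names (TactilePulse, Tacton, the enums, the example values) are hypothetical: the paper defines the parameters, not any API.

```python
from dataclasses import dataclass
from enum import Enum

class Waveform(Enum):
    SINE = "sine"
    SQUARE = "square"
    SAWTOOTH = "sawtooth"

class BodyLocation(Enum):
    WRIST = "wrist"
    FOREARM = "forearm"
    WAIST = "waist"

@dataclass
class TactilePulse:
    frequency_hz: float   # vibrotactile perception spans roughly 20-400 Hz
    amplitude: float      # normalised drive level, 0.0-1.0
    waveform: Waveform
    duration_ms: int

@dataclass
class Tacton:
    """A structured tactile message: an ordered sequence of pulses
    delivered at one body location."""
    pulses: list[TactilePulse]
    location: BodyLocation

# Example: a two-pulse "new message" Tacton presented at the wrist.
new_message = Tacton(
    pulses=[
        TactilePulse(250.0, 0.8, Waveform.SINE, 120),
        TactilePulse(250.0, 0.8, Waveform.SINE, 120),
    ],
    location=BodyLocation.WRIST,
)
```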
Crossmodal audio and tactile interaction with mobile touchscreens
Touchscreen mobile devices often use cut-down versions of desktop user interfaces, placing high demands on the visual sense that may prove awkward in mobile settings. The research in this thesis addresses the problems encountered by situationally impaired mobile users by using crossmodal interaction to exploit the abundant similarities between the audio and tactile modalities. By making information available to both senses, users can receive the information in the most suitable way, without having to abandon their primary task to look at the device.
This thesis begins with a literature review of related work, followed by a definition of crossmodal icons: two icons are crossmodal if and only if they provide a common representation of data that is accessible interchangeably via different modalities (a minimal code sketch of this idea follows the abstract). Two experiments investigated candidate parameters for crossmodal icons, with results showing that rhythm, texture and spatial location are effective.
A third experiment focused on learning multi-dimensional crossmodal icons and the extent to which this learning transfers between modalities. The results showed identification rates of 92% for three-dimensional audio crossmodal icons after training on the tactile equivalents, and 89% for tactile crossmodal icons after training on the audio equivalents.
Crossmodal icons were then incorporated into a mobile touchscreen QWERTY keyboard. Experiments showed that keyboards with audio or tactile feedback produce fewer errors and faster text entry than standard touchscreen keyboards. The next study examined how environmental variables affect user performance with the same keyboard. The data showed that each modality performs differently under varying levels of background noise or vibration, and the study established the levels at which these performance decreases occur.
The final study was a longitudinal evaluation of a touchscreen application, CrossTrainer, focusing on long-term effects on performance with audio and tactile feedback, the impact of context on performance, and personal modality preference. The results show that crossmodal audio and tactile icons are a valid method of presenting information to situationally impaired mobile touchscreen users, with recognition rates of 100% over time. This thesis concludes with a set of guidelines on the design and application of crossmodal audio and tactile feedback, enabling application and interface designers to employ such feedback in their systems.
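The crossmodal icon definition above lends itself to a small sketch: one abstract description shared by two renderers, one per modality. This is a minimal sketch under assumed names (CrossmodalIcon, render_audio, render_tactile), not the thesis's implementation.

```python
from dataclasses import dataclass

@dataclass
class CrossmodalIcon:
    rhythm: list[int]   # inter-onset intervals in ms, shared by both senses
    texture: str        # e.g. "smooth" or "rough"
    location: str       # spatial position, e.g. "left", "centre", "right"

def render_audio(icon: CrossmodalIcon) -> str:
    # Map texture to timbre and spatial location to stereo pan (sketch only).
    return f"audio: rhythm={icon.rhythm}, timbre={icon.texture}, pan={icon.location}"

def render_tactile(icon: CrossmodalIcon) -> str:
    # Map texture to waveform roughness and location to an actuator position.
    return f"tactile: rhythm={icon.rhythm}, roughness={icon.texture}, actuator={icon.location}"

# The same abstract icon is presented interchangeably in either modality.
icon = CrossmodalIcon(rhythm=[120, 120, 240], texture="rough", location="left")
print(render_audio(icon))
print(render_tactile(icon))
```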
Bimodal Feedback for In-car Mid-air Gesture Interaction
This demonstration showcases novel multimodal feedback designs for in-car mid-air gesture interaction. It explores the potential of multimodal feedback types for mid-air gestures in cars and how these can reduce eyes-off-the-road time and thus make driving safer. We will show four bimodal feedback combinations that provide effective information about interaction with in-car systems: visual-auditory, auditory-ambient (peripheral vision), ambient-tactile, and tactile-auditory. Users can interact with the system after a short introduction, creating an exciting opportunity to deploy these displays in cars in the future.
Design and User Satisfaction of Interactive Maps for Visually Impaired People
Multimodal interactive maps are a solution for presenting spatial information to visually impaired people. In this paper, we present an interactive multimodal map prototype based on a tactile paper map, a multi-touch screen and audio output. We first describe the steps in designing an interactive map: drawing and printing the tactile paper map, the choice of multi-touch technology, the interaction techniques and the software architecture. We then describe the method used to assess user satisfaction. We provide data showing that an interactive map, although based on a single, elementary double-tap interaction (see the sketch after this abstract), has been met with a high level of user satisfaction. Interestingly, satisfaction is independent of a user's age, previous visual experience or Braille experience. This prototype will be used as a platform to design advanced interactions for spatial learning.
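The double-tap interaction described above can be sketched as a hit-test against regions of the tactile overlay followed by speech output. Everything here (MapRegion, on_double_tap, the speak callback) is a hypothetical illustration, not the authors' software architecture.

```python
from dataclasses import dataclass

@dataclass
class MapRegion:
    name: str      # label spoken on double tap, e.g. "train station"
    x: float       # bounding box in screen coordinates
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)

def on_double_tap(px: float, py: float, regions: list[MapRegion], speak) -> None:
    # Find the map element under the tap and announce it via text-to-speech.
    for region in regions:
        if region.contains(px, py):
            speak(region.name)
            return
    speak("no map element here")

# Usage with print standing in for a real text-to-speech engine:
regions = [MapRegion("train station", 100, 80, 40, 40)]
on_double_tap(115, 95, regions, speak=print)
```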
"Touch me": workshop on tactile user experience evaluation methods
In this workshop we plan to explore the possibilities and challenges of physical objects and materials for evaluating the User Experience (UX) of interactive systems. These objects should address shortfalls of current UX evaluation methods and allow for a qualitative (or even quantitative), playful and holistic evaluation of UX, without interfering with users' personal experiences during interaction. This provides a tactile enhancement to the solely visual stimulation used in classical evaluation methods. The workshop serves as a basis for networking and community building among interested HCI researchers, designers and practitioners, and should encourage further development of the field of tactile UX evaluation.
Two-handed navigation in a haptic virtual environment
This paper describes the initial results from a study of a two-handed interaction paradigm for tactile navigation by blind and visually impaired users. Participants were set the task of navigating a virtual maze environment using their dominant hand to move the cursor while receiving contextual information in the form of tactile cues presented to their non-dominant hand. Results suggest that most participants were comfortable with the two-handed style of interaction even with little training. Two sets of contextual cues were examined, with information presented either as static patterns or as tactile flow of raised pins. The initial results suggest that while both sets of cues were usable, participants performed significantly better and faster with the static cues.
Tactons: structured tactile messages for non-visual information display
Tactile displays are now becoming available in a form that can easily be used in a user interface. This paper describes a new form of tactile output. Tactons, or tactile icons, are structured, abstract messages that can be used to communicate messages non-visually. A range of parameters can be used to construct Tactons, including the frequency, amplitude and duration of a tactile pulse, plus other parameters such as rhythm and location. Tactons have the potential to improve interaction in a range of areas, particularly where the visual display is overloaded, limited in size or not available, such as interfaces for blind people or on mobile and wearable devices. This paper describes Tactons, the parameters used to construct them and some possible ways to design them. Examples of where Tactons might prove useful in user interfaces are given.
