Personalising Vibrotactile Displays through Perceptual Sensitivity Adjustment
Haptic displays are commonly limited to transmitting a discrete set of tactile motives. In this paper, we explore the transmission of real-valued information through vibrotactile displays. We simulate spatial continuity with three perceptual models commonly used to create phantom sensations: the linear, logarithmic and power model. We show that these generic models lead to limited decoding precision, and propose a method for model personalisation adjusting to idiosyncratic and spatial variations in perceptual sensitivity. We evaluate this approach using two haptic display layouts: circular, worn around the wrist and the upper arm, and straight, worn along the forearm. Results of a user study measuring continuous value decoding precision show that users were able to decode continuous values with relatively high accuracy (4.4% mean error), circular layouts performed particularly well, and personalisation through sensitivity adjustment increased decoding precision
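As context for the three phantom-sensation models named in the abstract, a minimal sketch follows of how a target position between two adjacent vibrotactile actuators is typically split into drive amplitudes. The linear and power (energy-based) formulations are standard in the phantom-sensation literature; the logarithmic weighting shown here is one plausible illustration, not the paper's exact formulation, and the function name and parameters are hypothetical.

```python
import math

def actuator_amplitudes(beta, av, model="linear"):
    """Return (a1, a2) drive amplitudes for two adjacent actuators so that
    a phantom sensation appears at normalised position beta in [0, 1]
    (0 = at actuator 1, 1 = at actuator 2), at overall intensity av."""
    if not 0.0 <= beta <= 1.0:
        raise ValueError("beta must lie in [0, 1]")
    if model == "linear":
        # linear model: amplitudes interpolate directly with position
        a1, a2 = (1.0 - beta) * av, beta * av
    elif model == "power":
        # power (energy) model: a1^2 + a2^2 == av^2, keeping summed
        # vibration energy constant as the phantom position moves
        a1, a2 = math.sqrt(1.0 - beta) * av, math.sqrt(beta) * av
    elif model == "log":
        # illustrative logarithmic weighting: log10 maps the weight range
        # 1..10 onto 0..1, compressing differences near the actuators
        a1 = math.log10(1.0 + 9.0 * (1.0 - beta)) * av
        a2 = math.log10(1.0 + 9.0 * beta) * av
    else:
        raise ValueError(f"unknown model: {model}")
    return a1, a2
```

Under the power model the summed energy a1² + a2² stays constant across positions, which is one common rationale for that formulation over the linear one.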
Instructional eLearning technologies for the vision impaired
The principal sensory modality employed in learning is vision, and this raises a barrier for vision impaired students not only to existing educational media but also to the new and mostly visiocentric learning materials being offered through on-line delivery mechanisms. Using the Cisco Certified Network Associate (CCNA) and IT Essentials courses as a reference, a study has been made of tools that can access such on-line systems and transcribe the materials into a form suitable for vision impaired learning. Modalities employed included haptic, tactile, audio and descriptive text. The study demonstrates how such a multi-modal approach can achieve equivalent success for the vision impaired. However, it also shows the limits of the current understanding of human perception, especially with respect to comprehending two- and three-dimensional objects and spaces when there is no recourse to vision
Wayfinding and Navigation for People with Disabilities Using Social Navigation Networks
To achieve safe and independent mobility, people usually depend on published information, prior experience, the knowledge of others, and/or technology to navigate unfamiliar outdoor and indoor environments. Today, due to advances in various technologies, wayfinding and navigation systems and services are commonplace and are accessible on desktop, laptop, and mobile devices. However, despite their popularity and widespread use, current wayfinding and navigation solutions often fail to address the needs of people with disabilities (PWDs). We argue that these shortcomings stem primarily from the ubiquity of the compute-centric approach adopted in these systems and services, which leaves them unable to benefit from an experience-centric approach. We propose that a hybrid approach combining experience-centric and compute-centric methods will overcome the shortcomings of current wayfinding and navigation solutions for PWDs
Multimodality with Eye tracking and Haptics: A New Horizon for Serious Games?
The goal of this review is to illustrate the emerging use of multimodal virtual reality that can benefit learning-based games. The review begins with an introduction to multimodal virtual reality in serious games and a brief discussion of why cognitive processes involved in learning and training are enhanced under immersive virtual environments. We first outline studies that have used eye tracking and haptic feedback independently in serious games, and then review some innovative applications that have already combined eye tracking and haptic devices in order to provide applicable multimodal frameworks for learning-based games. Finally, some general conclusions are identified and clarified in order to advance current understanding in multimodal serious game production and to explore possible areas for new applications
Include 2011: The role of inclusive design in making social innovation happen.
Include is the biennial conference held at the RCA and hosted by the Helen Hamlyn Centre for Design. The event is directed by Jo-Anne Bichard and attracts an international delegation
Analysis domain model for shared virtual environments
The field of shared virtual environments, which also encompasses online games and social 3D environments, has a system landscape consisting of multiple solutions that share great functional overlap. However, there is little system interoperability between the different solutions. A shared virtual environment has an associated problem domain that is highly complex, raising difficult challenges for the development process, starting with the architectural design of the underlying system. This paper has two main contributions. The first contribution is a broad domain analysis of shared virtual environments, which enables developers to have a better understanding of the whole rather than the part(s). The second contribution is a reference domain model for discussing and describing solutions - the Analysis Domain Model
Integrating Haptic Feedback into Mobile Location Based Services
Haptics is a feedback technology that takes advantage of the human sense of touch by applying forces, vibrations, and/or motions to a haptic-enabled device such as a mobile phone. Historically, human-computer interaction has been visual - text and images on the screen. Haptic feedback can be an important additional method, especially in Mobile Location Based Services such as knowledge discovery, pedestrian navigation and notification systems. A knowledge discovery system called the Haptic GeoWand is a low interaction system that allows users to query geo-tagged data around them by using a point-and-scan technique with their mobile device. Haptic Pedestrian is a navigation system for walkers. Four prototypes have been developed, classified according to the user's guidance requirements, the user type (based on spatial skills), and overall system complexity. Haptic Transit is a notification system that provides spatial information to the users of public transport. In all these systems, haptic feedback is used to convey information about location, orientation, density and distance by using the vibration alarm with varying frequencies and patterns to help users understand the physical environment. Trials elicited positive responses from the users, who see benefit in being provided with a 'heads up' approach to mobile navigation. Results from a memory recall test show that the users of haptic feedback for navigation had better memory recall of the region traversed than the users of landmark images. Haptics integrated into a multi-modal navigation system provides more usable, less distracting and more effective interaction than conventional systems. Enhancements to the current work could include integration of contextual information, detailed large-scale user trials and the exploration of using haptics within confined indoor spaces
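The abstract above describes conveying distance through vibration patterns of varying frequency but gives no concrete mapping. The following is a hypothetical sketch of one such mapping, in which closer targets produce longer pulses separated by shorter gaps; the function name, range, and all timing constants are illustrative assumptions, not values taken from the work itself.

```python
def vibration_pattern(distance_m, max_range_m=50.0):
    """Map the distance to a point of interest into a simple vibration
    pattern: (pulse_ms, gap_ms), where pulse_ms is the length of each
    vibration burst and gap_ms the pause between bursts.
    Returns None when the target is beyond max_range_m (stay silent)."""
    if distance_m > max_range_m:
        return None
    # proximity: 0.0 at the edge of range, 1.0 when on top of the target
    proximity = 1.0 - distance_m / max_range_m
    pulse_ms = int(50 + 150 * proximity)          # 50-200 ms bursts
    gap_ms = int(100 + 900 * (1.0 - proximity))   # 100-1000 ms pauses
    return pulse_ms, gap_ms
```

A scheme like this lets a user scan the device across the environment and judge proximity from pulse cadence alone, without looking at the screen.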
A survey of haptics in serious gaming
Serious gaming often requires a high level of realism for training and learning purposes. Haptic technology has proved useful in many applications as an additional perceptual modality complementary to audio and vision. It provides a novel user experience that enhances the immersion of virtual reality with a physical control layer. This survey focuses on haptic technology and its applications in serious gaming. Several categories of related applications are listed and discussed in detail, primarily on how haptics acts as a cognitive aid and as a main component in serious game design. We categorize haptic devices into tactile, force feedback and hybrid ones to suit different haptic interfaces, followed by a description of common haptic gadgets in gaming. Haptic modeling methods, in particular available SDKs and libraries for commercial or academic use, are summarized. We also analyze the existing research difficulties and technology bottlenecks of haptics and foresee future research directions