Integrating 2D Mouse Emulation with 3D Manipulation for Visualizations on a Multi-Touch Table
We present the Rizzo, a multi-touch virtual mouse designed to provide the fine-grained interaction needed for information visualization on a multi-touch table. Our solution enables touch interaction for existing mouse-based visualizations. Previously, this transition to a multi-touch environment was difficult because the mouse emulation offered by touch surfaces is often insufficient to provide full information visualization functionality. We present a unified design in which multiple Rizzos not only provide mouse capabilities but also act as zoomable lenses that make precise information access feasible. The Rizzos and the information visualizations all exist within a touch-enabled 3D window management system. Our approach permits touch interaction both with the 3D windowing environment and with the contents of the individual windows contained therein. We describe an implementation of our technique that augments the VisLink 3D visualization environment to demonstrate how to enable multi-touch capabilities on all visualizations written with the popular prefuse visualization toolkit.
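The abstract does not include implementation details; the Python sketch below only illustrates the general idea of touch-to-mouse emulation that such a virtual mouse relies on. The names TouchEvent, MouseEvent and VirtualMouse are hypothetical and are not taken from the Rizzo, VisLink or prefuse code.

```python
from dataclasses import dataclass

# Hypothetical event types; a real system would receive touch samples from
# the touch framework and forward synthetic mouse events to the toolkit.
@dataclass
class TouchEvent:
    x: float
    y: float
    phase: str  # "down", "move", or "up"

@dataclass
class MouseEvent:
    x: float
    y: float
    button: str
    action: str  # "press", "drag", or "release"

class VirtualMouse:
    """Maps touches on a virtual-mouse widget to mouse events at a cursor
    hotspot, so an unmodified mouse-based visualization can respond."""

    def __init__(self, offset_x: float = 0.0, offset_y: float = -40.0):
        # The hotspot is offset from the finger so the target under the
        # cursor is not occluded by the hand (offset values are illustrative).
        self.offset = (offset_x, offset_y)

    def translate(self, touch: TouchEvent, button: str = "left") -> MouseEvent:
        action = {"down": "press", "move": "drag", "up": "release"}[touch.phase]
        return MouseEvent(touch.x + self.offset[0],
                          touch.y + self.offset[1],
                          button, action)

# Usage: feed each touch sample through the translator and dispatch the
# resulting mouse event to the legacy visualization.
vm = VirtualMouse()
print(vm.translate(TouchEvent(120.0, 300.0, "down")))
```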
Design and User Satisfaction of Interactive Maps for Visually Impaired People
Multimodal interactive maps are a solution for presenting spatial information to visually impaired people. In this paper, we present an interactive multimodal map prototype that is based on a tactile paper map, a multi-touch screen and audio output. We first describe the different steps for designing an interactive map: drawing and printing the tactile paper map, choice of multi-touch technology, interaction technologies and the software architecture. Then we describe the method used to assess user satisfaction. We provide data showing that an interactive map - although based on a unique, elementary, double tap interaction - has been met with a high level of user satisfaction. Interestingly, satisfaction is independent of a user's age, previous visual experience or Braille experience. This prototype will be used as a platform to design advanced interactions for spatial learning.
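As a rough illustration of the elementary double-tap interaction described above, the following Python sketch hit-tests a double tap against named map regions and hands the label to a text-to-speech placeholder. The region table, thresholds and the speak() helper are illustrative assumptions, not part of the prototype.

```python
import time

# Hypothetical map data: named regions of the tactile overlay, given as
# axis-aligned bounding boxes (x_min, y_min, x_max, y_max) in screen units.
REGIONS = {
    "train station": (100, 80, 180, 140),
    "town hall": (220, 60, 300, 130),
}

DOUBLE_TAP_WINDOW = 0.4   # seconds allowed between the two taps
_last_tap = {"time": 0.0, "pos": (0.0, 0.0)}

def speak(text: str) -> None:
    # Placeholder for the audio output described in the paper.
    print(f"[TTS] {text}")

def on_tap(x: float, y: float) -> None:
    """Detect an elementary double tap and announce the map element under it."""
    now = time.monotonic()
    lx, ly = _last_tap["pos"]
    is_double = (now - _last_tap["time"] < DOUBLE_TAP_WINDOW
                 and abs(x - lx) < 20 and abs(y - ly) < 20)
    _last_tap["time"], _last_tap["pos"] = now, (x, y)
    if not is_double:
        return
    for name, (x0, y0, x1, y1) in REGIONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            speak(name)
            return
    speak("no map element here")
```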
Making touch-based kiosks accessible to blind users through simple gestures
Touch-based interaction is becoming increasingly popular and is commonly used as the main interaction paradigm for self-service kiosks in public spaces. Touch-based interaction is known to be visually intensive, and current non-haptic touch-display technologies are often criticized for excluding blind users. This study set out to demonstrate that touch-based kiosks can be designed to include blind users without compromising the user experience for non-blind users. Most touch-based kiosks rely on absolutely positioned virtual buttons, which are difficult to locate without any tactile, audible or visual cues. Simple stroke gestures, by contrast, rely on relative movements, so the user does not need to hit a target at a specific location on the display. In this study, a touch-based train ticket sales kiosk based on simple stroke gestures was developed and tested with a panel of blind and visually impaired users, a panel of blindfolded non-visually impaired users and a control group of non-visually impaired users. The tests demonstrate that all the participants managed to discover, learn and use the touch-based self-service terminal and complete a ticket purchasing task, and the majority completed the task in less than four minutes on the first attempt.
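Because the stroke gestures described above depend only on relative movement, a stroke can be classified from its start and end points alone. The Python sketch below shows one way such a classifier might look; the distance threshold and the implied direction-to-command mapping are assumptions for illustration, not the kiosk's actual design.

```python
from typing import List, Tuple

MIN_STROKE_LENGTH = 50.0  # pixels; shorter movements are treated as taps

def classify_stroke(points: List[Tuple[float, float]]) -> str:
    """Reduce a stroke to the vector from its first to its last touch point,
    so no absolute on-screen target has to be hit."""
    if len(points) < 2:
        return "tap"
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if (dx * dx + dy * dy) ** 0.5 < MIN_STROKE_LENGTH:
        return "tap"
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

# In a kiosk menu, each direction could map to commands such as "next option",
# "previous option", "select" and "back", each confirmed by speech output.
print(classify_stroke([(10, 200), (190, 205)]))  # -> "right"
```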
Mid-Air Haptics for Control Interfaces
Control interfaces and interactions based on touch-less gesture tracking devices have become a prevalent research topic in both industry and academia. Touch-less devices offer a unique immediacy of interaction that makes them ideal for applications where direct contact with a physical controller is not desirable. On the other hand, these controllers inherently lack active or passive haptic feedback to inform users about the results of their interaction. Mid-air haptic interfaces, such as those using focused ultrasound waves, can close the feedback loop and provide new tools for the design of touch-less, un-instrumented control interactions. The goal of this workshop is to bring together the growing mid-air haptic research community to identify and discuss future challenges in control interfaces and their application in AR/VR, automotive, music, robotics and teleoperation.
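For readers unfamiliar with how focused ultrasound produces a mid-air pressure point, the Python sketch below computes the standard phased-array focusing delays: each transducer is driven with a phase offset proportional to its distance from the focal point so that all waves arrive there in phase. The array geometry and operating frequency are illustrative and not tied to any particular device discussed at the workshop.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air
FREQUENCY = 40_000.0     # Hz, a common frequency for ultrasonic haptic arrays
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY

def focus_phases(transducers, focal_point):
    """Per-transducer phase offsets (radians) so that all emitted waves
    arrive in phase at the focal point, creating a localized pressure spot."""
    phases = []
    for position in transducers:
        d = math.dist(position, focal_point)
        # Advancing the drive signal by the extra travel time aligns arrivals.
        phases.append((2.0 * math.pi * d / WAVELENGTH) % (2.0 * math.pi))
    return phases

# A toy 2x2 array (positions in metres) focusing 20 cm above its centre.
array = [(x, y, 0.0) for x in (-0.005, 0.005) for y in (-0.005, 0.005)]
print(focus_phases(array, (0.0, 0.0, 0.2)))
```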
Touch Screen Avatar English Learning System For University Students Learning Simplicity
This paper discusses a touch-screen avatar for an English language learning application. The system combines an avatar acting as an Animated Pedagogical Agent (APA) with a touch-screen application that adopts current gesture-based computing, which has the potential to change the way we learn by reducing the number of Information and Communication Technology (ICT) devices used during teaching and learning. The key is the interaction between university students and the touch-screen avatar application, together with learning resources that can be accessed anytime and anywhere (24/7) according to each student's preferred study time, so that they can learn comfortably outside the traditional classroom. Students are provided with a learning tool that helps them learn interactively, following current trends and personalized to their own interests. In addition, their performance is monitored and evaluated remotely, so that the learning process is not disturbed and students do not feel controlled. As a result, students are expected to have a lower affective filter, which may enhance the way they learn unconsciously. Keywords: Gesture-Based Computing, Avatar, Portable Learning Tool, Interactivity, Language Learning
