Ambient Gestures
We present Ambient Gestures, a novel gesture-based system designed to support ubiquitous 'in the environment' interactions with everyday computing technology. Hand gestures and audio feedback allow users to control computer applications without reliance on a graphical user interface, and without having to switch from the context of a non-computer task to the context of the computer. The Ambient Gestures system is composed of a vision recognition software application, a set of gestures to be processed by a scripting application, and a navigation and selection application that is controlled by the gestures. This system allows us to explore gestures as the primary means of interaction within a multimodal, multimedia environment. In this paper we describe the Ambient Gestures system, define the gestures and the interactions that can be achieved in this environment, and present a formative study of the system. We conclude with a discussion of our findings and future applications of Ambient Gestures in ubiquitous computing.
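The abstract describes a pipeline in which a vision recognizer emits gesture events and a scripting layer maps them to navigation and selection commands. A minimal sketch of that mapping step is shown below; all names (`GESTURE_ACTIONS`, `handle_gesture`, the gesture labels) are hypothetical illustrations, not the authors' actual API.

```python
# Hypothetical sketch of the scripting layer in a system like Ambient
# Gestures: recognized gesture names are mapped to application commands.
# Gesture and command names here are illustrative only.

GESTURE_ACTIONS = {
    "swipe_left": "previous_item",
    "swipe_right": "next_item",
    "hold": "select_item",
}

def handle_gesture(gesture):
    """Map a recognized hand gesture to an application command."""
    action = GESTURE_ACTIONS.get(gesture)
    if action is None:
        return None  # unrecognized gestures are simply ignored
    # In the paper's system, audio feedback would confirm the action here.
    return action
```

A dispatch table like this keeps the recognizer decoupled from the applications it controls, which matches the paper's separation into recognition, scripting, and navigation components.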
Interaction With Tilting Gestures In Ubiquitous Environments
In this paper, we introduce a tilting interface that controls direction-based applications in ubiquitous environments. A tilt interface is useful in situations that require remote and quick interactions or that take place in public spaces. We explored the proposed tilting interface with different application types and classified the tilting interaction techniques. Augmenting objects with sensors can potentially address the lack of intuitive and natural input devices in ubiquitous environments. We conducted an experiment to test the usability of the proposed tilting interface and to compare it with conventional input devices and hand gestures. The results showed that tilt gestures outperformed hand gestures in terms of speed, accuracy, and user satisfaction.
Comment: 13 pages, 10 figures
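A tilt interface like the one described typically classifies accelerometer readings into discrete directions. The sketch below shows one plausible way to do this, assuming the device reports gravity-normalized readings and that the device is roughly flat (vertical axis near 1 g); the function name, axes convention, and 15-degree threshold are assumptions for illustration, not the paper's method.

```python
import math

def tilt_direction(ax, ay, threshold_deg=15.0):
    """Classify a device tilt into a discrete direction.

    ax, ay: accelerometer readings (in g) along the device's x/y axes;
    as the device tilts, gravity projects onto these axes.
    Assumes the vertical-axis reading stays close to 1 g.
    """
    roll = math.degrees(math.atan2(ax, 1.0))   # left/right tilt angle
    pitch = math.degrees(math.atan2(ay, 1.0))  # forward/back tilt angle
    if abs(roll) < threshold_deg and abs(pitch) < threshold_deg:
        return "neutral"  # dead zone to suppress hand tremor
    if abs(roll) >= abs(pitch):
        return "right" if roll > 0 else "left"
    return "forward" if pitch > 0 else "back"
```

The dead zone around neutral is a common design choice for tilt input: it prevents small, unintentional hand movements from triggering actions during quick, remote interactions.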
Human-display interaction technology: Emerging remote interfaces for pervasive display environments
This is the author's accepted manuscript; the final published article is available from the link below. Copyright © 2010 IEEE.
We're living in a world where information processing isn't confined to desktop computers: it's being integrated into everyday objects and activities. Pervasive computation is human-centered: it permeates our physical world, helping us achieve goals and fulfill our needs with minimum effort by exploiting natural interaction styles. Remote interaction with screen displays requires a sensor-based, multimodal, touchless approach. For example, by processing user hand gestures, this paradigm removes constraints requiring physical contact and permits natural interaction with tangible digital information. Such touchless interaction can be multimodal, exploiting the visual, auditory, and olfactory senses.
Funding: Ministerio de Educación y Ciencia and Amper Sistemas, SA
Levitating Particle Displays with Interactive Voxels
Levitating objects can be used as the primitives in a new type of display. We present levitating particle displays and show how research into object levitation is enabling a new way of presenting and interacting with information. We identify novel properties of levitating particle displays and give examples of the interaction techniques and applications they allow. We then discuss design challenges for these displays, potential solutions, and promising areas for future research.
Rhythmic Micro-Gestures: Discreet Interaction On-the-Go
We present rhythmic micro-gestures, micro-movements of the hand that are repeated in time with a rhythm. We present a user study that investigated how well users can perform rhythmic micro-gestures and whether they can use them eyes-free with non-visual feedback. We found that users could successfully use our interaction technique (97% success rate across all gestures) with short interaction times, and they rated the gestures as low in difficulty. Simple audio cues that only convey the rhythm outperformed animations showing the hand movements, supporting rhythmic micro-gestures as an eyes-free input technique.
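Recognizing a rhythmic micro-gesture amounts to checking whether a series of detected micro-movements follows a target rhythm. A minimal sketch of that check is below, comparing inter-onset intervals against an expected pattern; the function name, interval representation, and tolerance value are assumptions for illustration, not the study's actual recognizer.

```python
def matches_rhythm(timestamps, pattern, tolerance=0.1):
    """Check whether gesture event times follow a target rhythm.

    timestamps: times (seconds) at which micro-movements were detected.
    pattern: expected inter-onset intervals, e.g. [0.5, 0.5, 1.0].
    tolerance: allowed deviation per interval, in seconds.
    """
    # A pattern of N intervals requires exactly N + 1 events.
    if len(timestamps) != len(pattern) + 1:
        return False
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return all(abs(i - p) <= tolerance for i, p in zip(intervals, pattern))
```

Matching on relative timing rather than absolute gesture shape is what makes such input plausible eyes-free: the rhythm can be conveyed by a simple audio cue, as the study found.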