GART: The Gesture and Activity Recognition Toolkit
Presented at the 12th International Conference on Human-Computer Interaction, Beijing, China, July 2007. The original publication is available at www.springerlink.com. The Gesture and Activity Recognition Toolkit (GART) is a user interface toolkit designed to enable the development of gesture-based applications. GART provides an abstraction to machine learning algorithms suitable for modeling and recognizing different types of gestures. The toolkit also provides support for data collection and the training process. In this paper, we present GART and its machine learning abstractions. Furthermore, we detail the components of the toolkit and present two example gesture recognition applications.
MOCA: A Low-Power, Low-Cost Motion Capture System Based on Integrated Accelerometers
Human-computer interaction (HCI) and virtual reality applications pose the challenge of enabling real-time interfaces for natural interaction. Gesture recognition based on body-mounted accelerometers has been proposed as a viable solution for translating patterns of movement associated with user commands, thus substituting for point-and-click methods and other cumbersome input devices. On the other hand, cost and power constraints make the implementation of a natural and efficient interface suitable for consumer applications a critical task. Even though several gesture recognition solutions exist, their use in the HCI context has been poorly characterized. For this reason, in this paper we consider a low-cost, low-power wearable motion tracking system based on integrated accelerometers, called motion capture with accelerometers (MOCA), which we evaluated for navigation in virtual spaces. Recognition is based on a geometric algorithm that enables efficient and robust detection of rotational movements. Our objective is to demonstrate that such a low-cost, low-power implementation is suitable for HCI applications. To this purpose, we characterized the system both quantitatively and qualitatively. First, we performed static and dynamic assessments of movement recognition accuracy. Second, we evaluated the effectiveness of the user experience using a 3D game application as a test bed.
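The abstract does not detail MOCA's geometric algorithm, but the general idea of detecting rotational movement from a body-mounted accelerometer can be sketched by tracking how the gravity vector tilts between readings. The following is a minimal illustration under that assumption; the function names and threshold are hypothetical, not taken from the paper:

```python
import math

def tilt_angles(ax, ay, az):
    """Estimate pitch and roll (radians) from a 3-axis accelerometer
    reading, using gravity as the reference vector (valid when the
    sensor is quasi-static)."""
    pitch = math.atan2(ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, math.sqrt(ax * ax + az * az))
    return pitch, roll

def detect_rotation(samples, threshold_deg=30.0):
    """Flag a rotational gesture when the tilt change between the first
    and last sample of a window exceeds the threshold (degrees)."""
    p0, r0 = tilt_angles(*samples[0])
    p1, r1 = tilt_angles(*samples[-1])
    delta = math.degrees(max(abs(p1 - p0), abs(r1 - r0)))
    return delta >= threshold_deg

# A 45-degree pitch from flat is detected; a 3-degree wobble is not.
detect_rotation([(0.0, 0.0, 1.0), (0.7, 0.0, 0.7)])   # True
detect_rotation([(0.0, 0.0, 1.0), (0.05, 0.0, 1.0)])  # False
```

A purely geometric rule like this needs no per-user training and is cheap enough for the low-power setting the paper targets, which is presumably why such approaches are attractive here.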
Gesture recognition using mobile phone's inertial sensors
The availability of inertial sensors embedded in mobile devices has enabled a new type of interaction based on the movements or “gestures” users make while holding the device. In this paper we propose a gesture recognition system for mobile devices based on accelerometer and gyroscope measurements. The system is capable of recognizing a set of predefined gestures in a user-independent way, without the need for a training phase. Furthermore, it was designed to execute in real time on resource-constrained devices, and therefore has low computational complexity. The performance of the system is evaluated offline using a dataset of gestures, and also online, through user tests with the system running on a smartphone.
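The abstract does not specify the recognition method, but one way a system can stay user-independent and training-free is to classify gestures with fixed geometric rules rather than learned models. As a hedged illustration (not the authors' algorithm), a flick gesture could be labeled by integrating gyroscope rates and picking the dominant rotation axis:

```python
def classify_flick(gyro_samples, dt=0.01, min_angle=0.5):
    """Classify a flick by integrating gyroscope rates (rad/s) over a
    window sampled every dt seconds, then picking the dominant axis.
    Returns a signed axis label, or None if the motion is too small.
    Hypothetical sketch; threshold and labels are illustrative."""
    angles = [0.0, 0.0, 0.0]
    for gx, gy, gz in gyro_samples:
        angles[0] += gx * dt
        angles[1] += gy * dt
        angles[2] += gz * dt
    axis = max(range(3), key=lambda i: abs(angles[i]))
    if abs(angles[axis]) < min_angle:
        return None  # too little rotation to count as a gesture
    names = ["roll", "pitch", "yaw"]
    sign = "+" if angles[axis] > 0 else "-"
    return sign + names[axis]

# One second of strong positive x-axis rotation classifies as "+roll";
# a brief, weak rotation is rejected as noise.
classify_flick([(1.0, 0.1, 0.0)] * 100)  # "+roll"
classify_flick([(0.1, 0.0, 0.0)] * 10)   # None
```

Because the rule involves only one pass over the window and no model evaluation, it matches the paper's stated goal of low computational complexity on resource-constrained devices.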
Universal Device Access with FreeMote
With the FreeMote system, we present a solution for easy and intuitive universal device control. Based on the Wii Remote Controller, real-world device functions can be accessed and controlled via gestures. Gestures include predefined sets suited to standard functions present in many devices, as well as user-definable gestures. By offering both possibilities, users are free to choose and adapt gestures to their needs while reducing mental load. Along with the gesture interface, FreeMote offers a simple way of enabling everyday devices for FreeMote control.
FreeMote makes use of the spatial nature of real-world environments in device selection and in function control via gestures. Thus, the FreeMote system supports the user in finding enabled devices, makes selecting devices easy, and provides an intuitive way of controlling devices via gestures.