Decoding Complex Imagery Hand Gestures
Brain-computer interfaces (BCIs) offer individuals suffering from major
disabilities an alternative method to interact with their environment.
Sensorimotor rhythm (SMR)-based BCIs can successfully perform control tasks;
however, traditional SMR paradigms decouple the control action from the
actual task, making them non-ideal for complex control scenarios. In this study,
we design a new, intuitively connected motor imagery (MI) paradigm using
hierarchical common spatial patterns (HCSP) and context information to
effectively predict intended hand grasps from electroencephalogram (EEG) data.
Experiments with 5 participants yielded an aggregate classification
accuracy (intended grasp prediction probability) of 64.5% for 8 different hand
gestures, more than 5 times the chance level.
Comment: This work has been submitted to EMBC 201
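The abstract names hierarchical common spatial patterns (HCSP) but gives no implementation details. Below is a minimal sketch of the standard binary CSP step that such a hierarchy would stack into a tree of binary decisions; the 2-class setup, function names, and number of filter pairs are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_pairs=3):
    """trials_*: arrays of shape (n_trials, n_channels, n_samples)."""
    def mean_cov(trials):
        # Average the channel covariance matrices over the trials of one class.
        return np.mean([np.cov(t) for t in trials], axis=0)
    ca, cb = mean_cov(trials_a), mean_cov(trials_b)
    # Generalised eigenvalue problem: the extreme eigenvectors maximise
    # variance for one class while minimising it for the other.
    vals, vecs = eigh(ca, ca + cb)
    order = np.argsort(vals)
    picks = np.r_[order[:n_pairs], order[-n_pairs:]]
    return vecs[:, picks].T            # (2 * n_pairs, n_channels) spatial filters

def csp_features(trial, filters):
    """Log-variance features of one (n_channels, n_samples) EEG trial."""
    z = filters @ trial                # spatially filtered signals
    var = z.var(axis=1)
    return np.log(var / var.sum())
```

A hierarchical scheme would train one such filter set per node of a gesture tree and route each trial through successive binary classifiers.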
Vision-based hand gesture interaction using particle filter, principal component analysis and transition network
Vision-based human-computer interaction is becoming increasingly important. It offers natural interaction with computers and frees users from mechanical input devices, which is especially favourable for wearable computers. This paper presents a human-computer interaction system based on a conventional webcam and hand gesture recognition. The system works in real time and enables users to control a computer cursor with hand motions and gestures instead of a mouse. Five hand gestures are designed to represent five mouse operations: moving, left click, left double-click, right click, and no action. A particle-filter-based algorithm is used for tracking the hand position, PCA-based feature selection is used for recognizing the hand gestures, and a transition network is employed to improve the accuracy and reliability of the interaction system. The system shows good performance in the recognition and interaction tests.
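The transition network is the part of this pipeline that turns noisy per-frame labels into reliable mouse events. Below is a minimal sketch of such a network: a gesture label (e.g. from the PCA-based recognizer) only triggers a mouse operation once it has been stable for a few frames and the transition from the previous state is allowed. The state names, allowed-transition table, and stability window are illustrative assumptions, not the paper's actual network.

```python
GESTURES = {"move", "left_click", "left_double_click", "right_click", "no_action"}
ALLOWED = {                      # which gesture may follow which
    "no_action": GESTURES,
    "move": {"move", "left_click", "right_click", "no_action"},
    "left_click": {"move", "left_double_click", "no_action"},
    "left_double_click": {"move", "no_action"},
    "right_click": {"move", "no_action"},
}

class TransitionNetwork:
    def __init__(self, stable_frames=5):
        self.state = "no_action"
        self.candidate = None
        self.count = 0
        self.stable_frames = stable_frames

    def update(self, label):
        """Feed one per-frame gesture label; return a mouse event or None."""
        if label == self.candidate:
            self.count += 1
        else:
            self.candidate, self.count = label, 1
        if (self.count >= self.stable_frames
                and label != self.state
                and label in ALLOWED[self.state]):
            self.state = label
            return label             # emit the newly confirmed mouse operation
        return None
```

In use, the tracker and recognizer produce one label per webcam frame and the network's `update` output, when not `None`, is forwarded to the cursor controller.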
Simultaneous Localization and Recognition of Dynamic Hand Gestures
A framework for the simultaneous localization and recognition of dynamic hand gestures is proposed. At the core of this framework is a dynamic space-time warping (DSTW) algorithm that aligns a pair of query and model gestures in both space and time. For every frame of the query sequence, feature detectors generate multiple hand region candidates. Dynamic programming is then used to compute both a global matching cost, which is used to recognize the query gesture, and a warping path, which aligns the query and model sequences in time and also finds the best hand candidate region in every query frame. The proposed framework includes translation-invariant recognition of gestures, a desirable property for many HCI systems. The performance of the approach is evaluated on a dataset of hand-signed digits gestured by people wearing short-sleeved shirts, in front of a background containing other non-hand skin-colored objects. The algorithm simultaneously localizes the gesturing hand and recognizes the hand-signed digit. Although DSTW is illustrated in a gesture recognition setting, the proposed algorithm is a general method for matching time series that allows multiple candidate feature vectors to be extracted at each time step.
National Science Foundation (CNS-0202067, IIS-0308213, IIS-0329009); Office of Naval Research (N00014-03-1-0108
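The core idea is a dynamic-programming recursion over model frames, query frames, and hand candidates. Below is a minimal sketch of that recursion under simplifying assumptions: the local cost is a Euclidean distance between feature vectors, and the step pattern is the basic (diagonal, horizontal, vertical) set. It illustrates the technique, not the paper's exact formulation.

```python
import numpy as np

def dstw(model, query_candidates):
    """model: (M, d) feature vectors; query_candidates: (N, K, d),
    i.e. K candidate hand regions per query frame."""
    M = len(model)
    N, K, _ = query_candidates.shape
    D = np.full((M, N, K), np.inf)
    for k in range(K):
        D[0, 0, k] = np.linalg.norm(model[0] - query_candidates[0, k])
    for i in range(M):
        for j in range(N):
            for k in range(K):
                if i == 0 and j == 0:
                    continue
                local = np.linalg.norm(model[i] - query_candidates[j, k])
                prev = np.inf
                for pi, pj in ((i - 1, j - 1), (i, j - 1), (i - 1, j)):
                    if pi < 0 or pj < 0:
                        continue
                    if pj == j:                    # same query frame: keep candidate k
                        prev = min(prev, D[pi, pj, k])
                    else:                          # new query frame: any previous candidate
                        prev = min(prev, D[pi, pj].min())
                D[i, j, k] = local + prev
    return D[M - 1, N - 1].min()                   # global matching cost
```

Recognition then picks the model gesture with the smallest cost, and backtracking through `D` recovers both the temporal alignment and the chosen hand candidate in each query frame.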
Dynamic Hand Gesture Classification Based on Radar Micro-Doppler Signatures
Dynamic hand gesture recognition is of great importance for human-computer interaction. In this paper, we present a method to discriminate four kinds of dynamic hand gestures (snapping fingers, flipping fingers, hand rotation, and calling) using a radar micro-Doppler sensor. Two micro-Doppler features are extracted from the time-frequency spectrum, and a support vector machine is used to classify the four gestures. Experimental results on measured data demonstrate that the proposed method achieves a classification accuracy higher than 88.56%.
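A minimal sketch of this kind of pipeline is shown below: compute a time-frequency spectrum of the radar return, reduce it to a small feature vector, and classify with an SVM. The two features used here (Doppler bandwidth and signature duration) are illustrative guesses, since the abstract does not specify which two features are extracted.

```python
import numpy as np
from scipy.signal import spectrogram
from sklearn.svm import SVC

def micro_doppler_features(iq, fs):
    """iq: complex baseband radar samples; fs: sampling rate in Hz."""
    f, t, sxx = spectrogram(iq, fs=fs, nperseg=128, noverlap=96,
                            return_onesided=False)
    power = np.abs(sxx)
    mask = power > 0.1 * power.max()          # keep strong time-frequency cells
    active_f = f[mask.any(axis=1)]            # Doppler bins with activity
    active_t = t[mask.any(axis=0)]            # time bins with activity
    bandwidth = active_f.max() - active_f.min()
    duration = active_t.max() - active_t.min()
    return np.array([bandwidth, duration])

# Training on labelled gesture recordings (X: feature vectors, y: labels):
# clf = SVC(kernel="rbf").fit(X, y)
# pred = clf.predict(micro_doppler_features(new_iq, fs).reshape(1, -1))
```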
Interaction With Tilting Gestures In Ubiquitous Environments
In this paper, we introduce a tilting interface that controls direction-based
applications in ubiquitous environments. A tilt interface is useful for
situations that require remote and quick interactions or that are executed in
public spaces. We explored the proposed tilting interface with different
application types and classified the tilting interaction techniques. Augmenting
objects with sensors can potentially address the problem of the lack of
intuitive and natural input devices in ubiquitous environments. We have
conducted an experiment to test the usability of the proposed tilting interface
and to compare it with conventional input devices and hand gestures. The
experimental results showed that tilt gestures outperformed hand gestures in
terms of speed, accuracy, and user satisfaction.
Comment: 13 pages, 10 figures
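As a rough illustration of how a tilt gesture can drive a direction-based application, the sketch below thresholds accelerometer pitch and roll with a dead zone so small, unintentional tilts are ignored. The axis convention, threshold, and command names are assumptions for illustration, not taken from the paper.

```python
import math

DEAD_ZONE_DEG = 15.0    # tilts smaller than this are ignored

def tilt_to_command(ax, ay, az):
    """ax, ay, az: accelerometer readings in g; returns a direction or None."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    if abs(pitch) < DEAD_ZONE_DEG and abs(roll) < DEAD_ZONE_DEG:
        return None                              # device held roughly level
    if abs(pitch) >= abs(roll):
        return "forward" if pitch > 0 else "backward"
    return "right" if roll > 0 else "left"
```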
