1,095 research outputs found
Combining inertial and visual sensing for human action recognition in tennis
In this paper, we present a framework for the automatic extraction of the temporal locations of tennis strokes within a match and the subsequent classification of each stroke as a serve, forehand, or backhand. We employ low-cost visual sensing and low-cost inertial sensing to achieve these aims: a single modality can be used on its own, or the two classification strategies can be fused when both modalities are available in a given capture scenario. This flexibility allows the framework to be applied across a variety of user scenarios and hardware infrastructures. Our proposed approach is quantitatively evaluated on data captured from elite tennis players. Results show highly accurate performance irrespective of the input modality configuration.
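The dual-modality design described above amounts to late score fusion: each modality produces per-class probabilities, which are combined when both are available and used alone otherwise. A minimal sketch, assuming hypothetical classifier outputs and an illustrative equal-weight fusion rule (the function names, weights, and class order are not from the paper):

```python
import numpy as np

CLASSES = ["serve", "forehand", "backhand"]

def fuse_scores(visual_probs, inertial_probs, w_visual=0.5):
    """Weighted late fusion of per-class probabilities from two modalities.

    If one modality is unavailable (None), fall back to the other,
    mirroring the single- or dual-modality operation described above.
    """
    if visual_probs is None:
        return np.asarray(inertial_probs, dtype=float)
    if inertial_probs is None:
        return np.asarray(visual_probs, dtype=float)
    v = np.asarray(visual_probs, dtype=float)
    i = np.asarray(inertial_probs, dtype=float)
    return w_visual * v + (1.0 - w_visual) * i

def classify_stroke(visual_probs, inertial_probs):
    """Return the most likely stroke label from the fused scores."""
    fused = fuse_scores(visual_probs, inertial_probs)
    return CLASSES[int(np.argmax(fused))]

# Example: both modalities agree on "serve"; inertial-only falls back cleanly.
print(classify_stroke([0.7, 0.2, 0.1], [0.6, 0.3, 0.1]))  # serve
print(classify_stroke(None, [0.1, 0.8, 0.1]))             # forehand
```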
Recognition of elementary arm movements using orientation of a tri-axial accelerometer located near the wrist
In this paper we present a method for recognising three fundamental movements of the human arm (reach and retrieve, lift cup to mouth, rotation of the arm) by determining the orientation of a tri-axial accelerometer located near the wrist. Our objective is to detect the occurrence of such movements performed with the impaired arm of a stroke patient during normal daily activities as a means to assess their rehabilitation. The method relies on accurately mapping transitions between predefined, standard orientations of the accelerometer to the corresponding elementary arm movements. To evaluate the technique, kinematic data were collected from four healthy subjects and four stroke patients as they performed a number of activities involved in a representative activity of daily living, 'making a cup of tea'. Our experimental results show that the proposed method can independently recognise all three of the elementary upper-limb movements investigated, with accuracies in the range 91–99% for healthy subjects and 70–85% for stroke patients.
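The core idea above is to quantize the accelerometer's gravity vector into coarse orientation labels and then match the sequence of orientation transitions against per-movement templates. A minimal sketch of that pipeline, assuming samples in units of g; the quantization scheme and the template sequences are illustrative assumptions, not the paper's actual mapping:

```python
import numpy as np

def orientation_label(accel):
    """Quantize a tri-axial accelerometer sample (in g) to a coarse
    orientation label based on the dominant gravity component."""
    a = np.asarray(accel, dtype=float)
    axis = int(np.argmax(np.abs(a)))
    sign = "+" if a[axis] >= 0 else "-"
    return sign + "xyz"[axis]

def orientation_transitions(samples):
    """Collapse a stream of samples into the sequence of orientation
    changes; this sequence is then matched against movement templates."""
    labels = [orientation_label(s) for s in samples]
    seq = [labels[0]]
    for lab in labels[1:]:
        if lab != seq[-1]:
            seq.append(lab)
    return seq

# Hypothetical templates: each elementary movement as a sequence of
# orientations (illustrative values only, not taken from the paper).
TEMPLATES = {
    "lift_cup_to_mouth": ["-z", "+x"],
    "reach_and_retrieve": ["-z", "-y", "-z"],
}

print(orientation_transitions([[0, 0, -1], [0, 0, -0.9], [1, 0.1, 0]]))
# ['-z', '+x'] -- matches the hypothetical "lift_cup_to_mouth" template
```

A real system would add tolerance to sensor noise (e.g. hysteresis on the dominant-axis decision) before matching against templates.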
Active User Authentication for Smartphones: A Challenge Data Set and Benchmark Results
In this paper, automated user verification techniques for smartphones are investigated. A unique non-commercial dataset, the University of Maryland Active Authentication Dataset 02 (UMDAA-02) for multi-modal user authentication research, is introduced. This paper focuses on three sensors - front camera, touch sensor and location service - while providing a general description of other modalities. Benchmark results for face detection, face verification, touch-based user identification and location-based next-place prediction are presented, which indicate that more robust methods fine-tuned to the mobile platform are needed to achieve satisfactory verification accuracy. The dataset will be made available to the research community for promoting additional research.
Comment: 8 pages, 12 figures, 6 tables. Best poster award at BTAS 201
On the role of gestures in human-robot interaction
This thesis investigates the gestural interaction problem and, in particular, the use of gestures for human-robot interaction. The lack of a clear problem statement and a common terminology has resulted in a fragmented field of research where building upon prior work is rare. The scope of the research presented in this thesis, therefore, is to lay the foundation that helps the community build a more homogeneous research field.
The main contributions of this thesis are twofold: (i) a taxonomy for defining gestures; and (ii) an engineering-oriented definition of the gestural interaction problem. These contributions resulted in a schema for representing the existing literature in a more organic way, helping future researchers identify existing technologies and applications, supported by an extensive literature review.
Furthermore, the defined problem has been studied in two of its specializations: (i) direct control and (ii) teaching of a robotic manipulator, which led to the development of technological solutions for gesture sensing, detection and classification that can potentially be applied to other contexts.
Towards Stroke Patients' Upper-limb Automatic Motor Assessment Using Smartwatches
Assessing physical condition in rehabilitation scenarios is a challenging problem, since it involves Human Activity Recognition (HAR) and kinematic analysis methods. The difficulties increase further in unconstrained rehabilitation scenarios, which are much closer to real use cases. Our aim is to design an upper-limb assessment pipeline for stroke patients using smartwatches. We focus on the HAR task, as it is the first stage of the assessment pipeline. Our main target is to automatically detect and recognize four key movements inspired by the Fugl-Meyer assessment scale, performed in both constrained and unconstrained scenarios. In addition to the application protocol and dataset, we propose two detection and classification baseline methods. We believe that the proposed framework, dataset and baseline results will serve to foster this research field.
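A common starting point for the detection stage of such a HAR pipeline is a sliding-window detector that separates movement from rest before classification. A minimal sketch under assumed parameters (the sampling rate, window length, and variance threshold below are illustrative, not the paper's baseline values):

```python
import numpy as np

def detect_movement_windows(signal, fs=50, win_s=2.0, thresh=0.05):
    """Flag non-overlapping windows where the variance of the
    acceleration magnitude exceeds a threshold (movement vs. rest).

    signal: sequence of (x, y, z) accelerometer samples in g.
    fs, win_s, thresh: assumed values for illustration only.
    """
    mag = np.linalg.norm(np.asarray(signal, dtype=float), axis=1)
    win = int(fs * win_s)
    flags = []
    for start in range(0, len(mag) - win + 1, win):
        flags.append(bool(np.var(mag[start:start + win]) > thresh))
    return flags

# Example: 2 s at rest (constant 1 g) followed by 2 s of varying magnitude.
stream = [[0, 0, 1]] * 100 + [[0, 0, 1], [0, 0, 2]] * 50
print(detect_movement_windows(stream))  # [False, True]
```

Windows flagged as movement would then be passed to a classifier for the four Fugl-Meyer-inspired movements; using the magnitude rather than raw axes makes the detector insensitive to watch orientation.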
- …