
    A color hand gesture database for evaluating and improving algorithms on hand gesture and posture recognition

    With the increase of research activity in vision-based hand posture and gesture recognition, new methods and algorithms are being developed, although less attention has been paid to establishing a standard platform for this purpose. Developing a database of hand gesture images is a necessary first step towards standardizing research on hand gesture recognition. To this end, we have developed an image database of hand posture and gesture images. The database contains hand images collected with a digital camera under different lighting conditions. Details of the automatic segmentation and clipping of the hands are also discussed in this paper.
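
    As an illustration of what an automatic segmentation and clipping step might look like, the sketch below crops the largest skin-coloured region from an image using OpenCV. The HSV thresholds and the assumption that the hand is the largest skin-coloured blob are hypothetical and are not taken from the paper's actual algorithm.

```python
import cv2
import numpy as np

# Hypothetical HSV skin-colour range; the thresholds used for the
# database are described in the paper and not reproduced here.
SKIN_LOW = np.array([0, 40, 60], dtype=np.uint8)
SKIN_HIGH = np.array([25, 255, 255], dtype=np.uint8)

def segment_and_clip_hand(image_bgr: np.ndarray) -> np.ndarray:
    """Return a tight crop around the largest skin-coloured region."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, SKIN_LOW, SKIN_HIGH)
    # Remove small speckles before looking for the hand blob.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    # findContours returns (contours, hierarchy) in OpenCV 4.x.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return image_bgr                       # nothing detected; keep full frame
    hand = max(contours, key=cv2.contourArea)  # assume the hand is the largest blob
    x, y, w, h = cv2.boundingRect(hand)
    return image_bgr[y:y + h, x:x + w]
```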

    A Study of Dance Movement Capture and Posture Recognition Method Based on Vision Sensors

    With the development of technology, posture recognition methods have been applied in more and more fields. However, there is relatively little research on posture recognition in dance. Therefore, this paper studied the capture and posture recognition of dance movements to understand the usability of the proposed method in dance posture recognition. Firstly, the Kinect V2 visual sensor was used to capture dance movements and obtain human skeletal joint data. Then, a three-dimensional convolutional neural network (3D CNN) model was designed by fusing joint coordinate features with joint velocity features as general features for recognizing different dance postures. Through experiments on NTU60 and self-built dance datasets, it was found that the 3D CNN performed best with a dropout rate of 0.4, a ReLU activation function, and fusion features. Compared to other posture recognition methods, the recognition rates of the 3D CNN on CS and CV in NTU60 were 88.8% and 95.3%, respectively, while the average recognition rate on the dance dataset reached 98.72%, which was higher than that of the other methods. The experimental results demonstrate the effectiveness of the proposed method for dance posture recognition, providing a new approach for posture recognition research and contributing to the inheritance of folk dances. DOI: 10.28991/HIJ-2023-04-02-03
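
    A minimal sketch of the kind of model the abstract describes is given below, assuming a PyTorch implementation: skeleton sequences with joint coordinates and frame-to-frame velocities fused along the channel dimension, a small 3D CNN with ReLU activations, and a dropout rate of 0.4 before the classifier. The layer sizes and the (N, C, T, V, M) tensor layout are illustrative assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class Pose3DCNN(nn.Module):
    """Illustrative 3D CNN over skeleton sequences.

    Assumed input layout: (N, C, T, V, M) with C = 6 channels
    (x, y, z joint coordinates plus their frame-to-frame velocities),
    T = frames, V = joints, M = bodies.
    """
    def __init__(self, num_classes: int, in_channels: int = 6, dropout: float = 0.4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool3d(kernel_size=(2, 2, 1)),
            nn.Conv3d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool3d(1),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(dropout),            # dropout rate of 0.4, as in the abstract
            nn.Linear(64, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


def fuse_coordinates_and_velocity(joints: torch.Tensor) -> torch.Tensor:
    """Concatenate joint coordinates with per-frame velocities.

    joints: (N, 3, T, V, M) -> returns (N, 6, T, V, M)
    """
    velocity = torch.zeros_like(joints)
    velocity[:, :, 1:] = joints[:, :, 1:] - joints[:, :, :-1]
    return torch.cat([joints, velocity], dim=1)
```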

    Do you see what I see? Co-actor posture modulates visual processing in joint tasks

    Interacting with other people is a ubiquitous part of daily life. A complex set of processes enables our successful interactions with others. The present research was conducted to investigate how the processing of visual stimuli may be affected by the presence and the hand posture of a co-actor. Experiments conducted with participants acting alone have revealed that the distance from the stimulus to the hand of a participant can alter visual processing. In the main experiment of the present paper, we asked whether this posture-related source of visual bias persists when participants share the task with another person. The effect of personal and co-actor hand-proximity on visual processing was assessed through object-specific benefits to visual recognition in a task performed by two co-actors. Pairs of participants completed a joint visual recognition task and, across different blocks of trials, the position of their own hands and of their partner's hands varied relative to the stimuli. In contrast to control studies conducted with participants acting alone, an object-specific recognition benefit was found across all hand location conditions. These data suggest that visual processing is, in some cases, sensitive to the posture of a co-actor.

    Human posture recognition: literature review

    Human posture recognition, giving machines the ability to detect, track and identify people and their actions from video, has become a central topic in computer vision research. Recognition of human posture is a very challenging problem. The importance of human posture recognition or classification is evident from the increasing requirement for machines that are able to interact intelligently and effortlessly with a human-inhabited environment. Recognizing human posture in images and video is an important task in many multimedia applications, such as multimedia information retrieval, human-computer interaction, and surveillance. Posture is a snapshot of the human body configuration. Much research work focuses on human action recognition, which corresponds to the analysis of human motion; there, both spatial and temporal characteristics of an object need to be considered. The estimation of the human body posture and the localization of the body parts is one way to analyze the spatial part.

    Detection of postural transitions using machine learning

    The purpose of this project is to study the nature of human activity recognition and to prepare a dataset from volunteers performing various activities, which can be used to construct the parts of a machine learning model that identifies each volunteer's posture transitions accurately. This report presents the problem definition, the equipment used, previous work in the area of human activity recognition, and the resolution of the problem along with results. It also sheds light on the steps taken to undertake this endeavour: building a dataset, pre-processing the data by applying filters and various windowing-length techniques, splitting the data into training and testing sets, performing feature selection and feature extraction, and finally selecting the model for training and testing that provides the maximum accuracy and the lowest misclassification rate. The tools used for this project include a laptop equipped with MATLAB, Excel and Media Player Classic, which were used for data processing, model training and feature selection, and labelling, respectively. The data were collected using an Inertial Measurement Unit containing three tri-axial accelerometers, a gyroscope, a magnetometer and a pressure sensor; for this project only the accelerometers, the gyroscope and the pressure sensor were used. The sensor was made by members of the Technical Research Centre for Dependency Care and Autonomous Living (CETpD) at the UPC-ETSEIB campus. The results obtained have been satisfactory, and the objectives set have been fulfilled. There is room for improvement by expanding the scope of the project, such as detecting chronic disorders, providing posture-based statistics to the end user, or achieving a higher sensitivity to posture transitions by using better features and enlarging the dataset with more volunteers.
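
    A sketch of such a pipeline in Python with scikit-learn is shown below (the project itself used MATLAB): low-pass filtering of the accelerometer signal, fixed-length windowing with overlap, simple per-axis statistical features, a train/test split, and a classifier. The filter cutoff, sampling rate, window length, feature set, and random-forest model are illustrative assumptions rather than the project's actual choices, and the signal and labels here are placeholders.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

def lowpass(signal, cutoff_hz=5.0, fs=50.0, order=4):
    """Low-pass filter one axis of accelerometer data (hypothetical cutoff/rate)."""
    b, a = butter(order, cutoff_hz / (fs / 2), btype="low")
    return filtfilt(b, a, signal)

def window_features(acc, fs=50.0, win_s=2.56, overlap=0.5):
    """Slide a fixed-length window over (n_samples, 3) accelerometer data
    and compute simple per-axis statistics as features."""
    win = int(win_s * fs)
    step = int(win * (1 - overlap))
    feats = []
    for start in range(0, len(acc) - win + 1, step):
        seg = acc[start:start + win]
        feats.append(np.concatenate([seg.mean(0), seg.std(0),
                                     seg.min(0), seg.max(0)]))
    return np.array(feats)

# Placeholder signal and labels; in practice these come from the IMU
# recordings and the manually labelled posture transitions.
acc = np.random.randn(10000, 3)
acc = np.column_stack([lowpass(acc[:, i]) for i in range(3)])
X = window_features(acc)
y = np.random.randint(0, 3, size=len(X))
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```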