
    Extraction of bodily features for gait recognition and gait attractiveness evaluation

    This is the author's accepted manuscript. The final publication is available at Springer via http://dx.doi.org/10.1007/s11042-012-1319-2. Copyright © 2012 Springer.
    Although there has been much previous research on which bodily features are most important in gait analysis, the questions of which features should be extracted from gait, and why these features in particular should be extracted, have not been convincingly answered. The primary goal of the study reported here was to take an analytical approach to answering these questions, in the context of identifying the features that are most important for gait recognition and gait attractiveness evaluation. Using precise 3D gait motion data obtained from motion capture, we analyzed the motions of different body segments relative to a root marker (located on the lower back) for 30 males using the fixed root method, and compared them with the original motions without fixing the root. Salient features were then obtained by principal component analysis (PCA). The left lower arm, lower legs and hips were identified as important features for gait recognition. For gait attractiveness evaluation, the lower legs were recognized as important features.
    Funding: Dorothy Hodgkin Postgraduate Award and HEFCE.
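
    As a rough, hedged sketch of this kind of analysis (not the study's actual pipeline), the following Python snippet expresses every marker relative to a lower-back root marker and then applies PCA; the array layout, function name, and component count are assumptions for illustration.

```python
# Hypothetical sketch of "fixed root" relative motion followed by PCA.
# Assumes `markers` is a (frames, n_markers, 3) array of 3D marker
# trajectories and `root_idx` indexes the lower-back root marker.
import numpy as np
from sklearn.decomposition import PCA

def relative_motion_features(markers: np.ndarray, root_idx: int, n_components: int = 10):
    # Express every marker relative to the root, i.e. "fix" the root at the origin.
    relative = markers - markers[:, root_idx:root_idx + 1, :]
    # Flatten each frame into one observation vector for PCA.
    frames = relative.reshape(relative.shape[0], -1)
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(frames)
    # Principal components indicate which marker motions dominate the variance.
    return scores, pca.components_, pca.explained_variance_ratio_
```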

    Gait Recognition from Motion Capture Data

    Gait recognition from motion capture data, as a pattern classification discipline, can be improved by the use of machine learning. This paper contributes to the state of the art with a statistical approach for extracting robust gait features directly from raw data by a modification of Linear Discriminant Analysis with the Maximum Margin Criterion. Experiments on the CMU MoCap database show that the suggested method outperforms thirteen relevant methods based on geometric features, as well as a method that learns the features by a combination of Principal Component Analysis and Linear Discriminant Analysis. The methods are evaluated in terms of the distribution of biometric templates in their respective feature spaces, expressed through a number of class separability coefficients and classification metrics. Results also indicate high portability of the learned features; that is, we can learn which aspects of walking people generally differ in and extract those as general gait features. Recognizing people without needing group-specific features is convenient, as particular people might not always provide annotated learning data. As a contribution to reproducible research, our evaluation framework and database have been made publicly available. This research makes motion capture technology directly applicable to human recognition.
    Comment: Preprint. Full paper accepted at the ACM Transactions on Multimedia Computing, Communications, and Applications (TOMM), special issue on Representation, Analysis and Recognition of 3D Humans. 18 pages. arXiv admin note: substantial text overlap with arXiv:1701.00995, arXiv:1609.04392, arXiv:1609.0693
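
    As a loose illustration of the Maximum Margin Criterion idea the paper builds on, the sketch below projects raw gait vectors onto the leading eigenvectors of the between-class minus within-class scatter; the paper's specific LDA/MMC modification, preprocessing and template construction are not reproduced, and all names here are illustrative.

```python
# Hypothetical Maximum Margin Criterion (MMC) projection sketch.
import numpy as np

def mmc_projection(X: np.ndarray, y: np.ndarray, n_features: int) -> np.ndarray:
    # X: (n_samples, n_dims) raw gait vectors; y: (n_samples,) identity labels.
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    s_b = np.zeros((d, d))  # between-class scatter
    s_w = np.zeros((d, d))  # within-class scatter
    for label in np.unique(y):
        Xc = X[y == label]
        mean_c = Xc.mean(axis=0)
        diff = (mean_c - mean_all)[:, None]
        s_b += Xc.shape[0] * diff @ diff.T
        s_w += (Xc - mean_c).T @ (Xc - mean_c)
    # MMC keeps the directions that maximise tr(S_b - S_w): the eigenvectors
    # of the symmetric margin matrix with the largest eigenvalues.
    eigvals, eigvecs = np.linalg.eigh(s_b - s_w)
    order = np.argsort(eigvals)[::-1][:n_features]
    return eigvecs[:, order]  # (n_dims, n_features) projection matrix
```

    New samples would then be projected with the returned matrix before template matching; high portability of such features would mean that a projection learned on one group of subjects still separates subjects it has never seen.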

    Human gait identification and analysis

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University.
    Human gait identification has become an active area of research due to increased security requirements. Human gait identification is a potential new tool for identifying individuals beyond traditional methods. The emergence of motion capture techniques has offered the chance of high identification accuracy, because complete gait information can be recorded, in contrast to security cameras. The aim of this research was to build a practical method of gait identification and to investigate the individual characteristics of gait. For this purpose, a gait identification approach was proposed, identification results were compared across different methods, and several studies of the individual characteristics of gait were performed. This research included the following: (1) a novel, effective set of gait features was proposed; (2) gait signatures were extracted by three different methods: a statistical method, principal component analysis, and a Fourier expansion method; (3) gait identification results from these methods were compared; (4) two indicators were proposed to evaluate gait features for identification; (5) novel and clear definitions of gait phases and the gait cycle were proposed; (6) gait features were investigated by gait phase; (7) principal component analysis and the fixed root method were used to elucidate which features represent gait and why; (8) gait similarity was investigated; (9) gait attractiveness was investigated. This research proposed an efficient framework for identifying individuals from gait via a novel feature set based on 3D motion capture data. A novel method of evaluating gait signatures for identification was proposed. Three different gait signature extraction methods were applied and compared; the average identification rate was over 93%, with the best result close to 100%. This research also proposed a novel method of dividing gait phases, and the appearances of gait features in the eight gait phases were investigated. Based on the proposed gait phase division, this research identified the similarities and asymmetries between left-side and right-side body movement in gait. This research also initiated an analysis method for gait feature extraction by the fixed root method. A prediction model of gait attractiveness was built with reasonable accuracy by principal component analysis and linear regression on the natural logarithm of parameters. A systematic relationship was observed between the motions of individual markers and the attractiveness ratings. The lower legs and feet were extracted as features of attractiveness by the fixed root method. As an extension of the gait research, human seated motion was also investigated.
    This study was funded by the Dorothy Hodgkin Postgraduate Awards and Beijing East Gallery Co. Ltd.
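
    One of the three signature extraction routes mentioned above is a Fourier expansion; the following is only a hedged sketch of what such a signature could look like for a single periodic joint signal, with the harmonic count and normalisation chosen for illustration rather than taken from the thesis.

```python
# Hypothetical Fourier-expansion gait signature for one joint-angle signal.
import numpy as np

def fourier_gait_signature(angle_cycle: np.ndarray, n_harmonics: int = 5) -> np.ndarray:
    # angle_cycle: a joint angle (or marker coordinate) resampled over one gait cycle.
    coeffs = np.fft.rfft(angle_cycle)
    kept = coeffs[1:n_harmonics + 1]  # skip the DC (mean posture) term
    # Concatenate harmonic amplitudes and phases as the signature vector;
    # signatures from several joints could be stacked into one feature set.
    return np.concatenate([np.abs(kept), np.angle(kept)])
```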

    Body Motion Cues Drive First Impressions: Consensus, Truth and the Origins of Personality Trait Judgements based on Targets' Whole-Body Motion.

    Personality trait attribution is automatic, and first impressions can be lasting and lead to important social decisions. Research on how facial cues impact person perception is plentiful, but less is known about how whole-body motion contributes to first impressions. This thesis presents results from experiments in which participants rated the traits of target individuals based solely on short, silent movie clips of those individuals performing actions or expressing emotions with their bodies, or simply walking. To isolate the contribution of body motion cues to trait attribution, the static form information of the body stimuli was degraded. Consensus at zero acquaintance is replicated throughout the thesis, manifested by strong inter-rater agreement in all rating experiments and for all displayed behaviours, indicating that body motion may contain visual cues that drive trait impressions. Further experiments identified motion parameters that predict personality trait impressions, and an experimental paradigm showed that computational manipulation of motion data can indeed change observer judgements of computerised models based on human motion data. No accuracy was found in the trait judgements, in that there was no link between how a target was judged and that target individual's scores on a five-factor personality questionnaire. Judgements underlying personality trait impressions were also found: impressions of emotions, attractiveness and masculinity appear to be intertwined with personality trait judgements. Finally, patterns in personality trait judgements based on body motion were consistent with findings from studies on face perception, reflecting a two-step judgement of a target person's intention and ability to cause harm. Differences were found depending on the display format of the stimuli, and interpretations for these discrepancies are offered. The thesis shows that people go beyond the information available to them when forming personality trait impressions of strangers, and offers evidence that changes in body motion may indeed have an impact on such trait impressions.

    Male Movements as Honest Cues to Reproductive Quality

    Background: Research concerning sexual selection suggests that ornaments and traits convey information that is valuable to observers when making decisions based on adaptive problems. In the animal kingdom, males perform dynamic courtship displays and females assess such displays when choosing a mate. In humans, however, this avenue of research is in its infancy, but an emerging field of study has sought to find out whether dance movements, which are thought to be courtship displays, provide observers with condition-dependent information. Objectives: i) To create a methodology that records dance movements with high accuracy whilst eliminating structural cues known to influence mate choice decisions, while maintaining a highly realistic human form. ii) To use this methodology to assess whether traits of interest (health, fitness, strength and age) can be detected by observers. iii) To establish whether particular movements mediate perceptions of dance quality and of the dancers' condition. Methods: A cutting-edge motion capture system and professional animation software were used to record dances. Each male dancer provided information on either his health status, physical fitness, strength or age. Dance animations were shown to observers, and their perceptions were correlated against the traits of interest. These were also correlated against basic biomechanical characteristics to establish possible mediators. Results: Whilst health measures were not related to dance ratings, strength measures were, and these perceptions were mediated by movements of the upper body. A final study found that age was detectable by male participants and was related to female raters' masculinity ratings, but no biomechanical mediators were found. Conclusion: Men and women are able to derive certain quality cues from observing male dance, and in some instances biomechanical characteristics mediated this relationship. This provides evidence that dance may be used in the assessment of males in the context of sexual selection.

    Inferring Facial and Body Language

    Machine analysis of human facial and body language is a challenging topic in computer vision, impacting important applications such as human-computer interaction and visual surveillance. In this thesis, we present research building towards computational frameworks capable of automatically understanding facial expression and behavioural body language. The thesis work commences with a thorough examination of issues surrounding facial representation based on Local Binary Patterns (LBP). Extensive experiments with different machine learning techniques demonstrate that LBP features are efficient and effective for person-independent facial expression recognition, even in low-resolution settings. We then present and evaluate a conditional mutual information based algorithm to efficiently learn the most discriminative LBP features, and show that the best recognition performance is obtained by using SVM classifiers with the selected LBP features. However, this recognition is performed on static images without exploiting the temporal behaviour of facial expression. Subsequently we present a method to capture and represent temporal dynamics of facial expression by discovering the underlying low-dimensional manifold. Locality Preserving Projections (LPP) is exploited to learn the expression manifold in the LBP-based appearance feature space. By deriving a universal discriminant expression subspace using a supervised LPP, we can effectively align manifolds of different subjects on a generalised expression manifold. Different linear subspace methods are comprehensively evaluated in expression subspace learning. We formulate and evaluate a Bayesian framework for dynamic facial expression recognition employing the derived manifold representation. However, the manifold representation only addresses temporal correlations of the whole face image and does not consider spatio-temporal correlations among different facial regions. We therefore employ Canonical Correlation Analysis (CCA) to capture correlations among face parts. To overcome the inherent limitations of classical CCA for image data, we introduce and formalise a novel Matrix-based CCA (MCCA), which can better measure correlations in 2D image data. We show this technique can provide superior performance in regression and recognition tasks, whilst requiring significantly fewer canonical factors. All the above work focuses on facial expressions. However, the face is usually perceived not as an isolated object but as an integrated part of the whole body, and the visual channel combining facial and bodily expressions is most informative. Finally, we investigate two understudied problems in body language analysis: gait-based gender discrimination and affective body gesture recognition. To effectively combine face and body cues, CCA is adopted to establish the relationship between the two modalities and derive a semantic joint feature space for feature-level fusion. Experiments on large data sets demonstrate that our multimodal systems achieve superior performance in gender discrimination and affective state analysis.
    Funding: Research studentship of Queen Mary, the International Travel Grant of the Royal Academy of Engineering, and the Royal Society International Joint Project.
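
    For the feature-level fusion step described above, a minimal sketch using scikit-learn's CCA is given below; the feature dimensions, component count and any downstream classifier are assumptions, not the thesis's exact setup (which also introduces a matrix-based CCA variant not shown here).

```python
# Hypothetical CCA-based feature-level fusion of face and body features.
import numpy as np
from sklearn.cross_decomposition import CCA

def fuse_face_body(face_feats: np.ndarray, body_feats: np.ndarray, n_components: int = 20) -> np.ndarray:
    # face_feats: (n_samples, n_face_dims); body_feats: (n_samples, n_body_dims).
    cca = CCA(n_components=n_components)
    face_c, body_c = cca.fit_transform(face_feats, body_feats)
    # Concatenate the correlated projections to form a joint semantic feature space.
    return np.hstack([face_c, body_c])
```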

    Inter- and Intrapersonal Body Perception in Schizophrenia


    KEER2022

    Pre-title: KEER2022. Diversities
    Resource description: 25 July 202