
    STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits

    We present a novel classifier network called STEP, to classify perceived human emotion from gaits, based on a Spatial Temporal Graph Convolutional Network (ST-GCN) architecture. Given an RGB video of an individual walking, our formulation implicitly exploits the gait features to classify the emotional state of the human into one of four emotions: happy, sad, angry, or neutral. We use hundreds of annotated real-world gait videos and augment them with thousands of annotated synthetic gaits generated using a novel generative network called STEP-Gen, built on an ST-GCN-based Conditional Variational Autoencoder (CVAE). We incorporate a novel push-pull regularization loss in the CVAE formulation of STEP-Gen to generate realistic gaits and improve the classification accuracy of STEP. We also release a novel dataset (E-Gait), which consists of 2,177 human gaits annotated with perceived emotions along with thousands of synthetic gaits. In practice, STEP can learn the affective features and exhibits a classification accuracy of 89% on E-Gait, which is 14-30% more accurate than prior methods.
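    The core building block behind ST-GCN style models like STEP is a spatial graph convolution over the skeleton: each joint's features are averaged with those of its connected joints, then linearly projected. A minimal sketch of one such step (the 3-joint chain skeleton, identity weights, and toy features below are illustrative, not the paper's configuration):

    ```python
    import numpy as np

    def spatial_graph_conv(x, adj, weight):
        """One spatial graph-convolution step.

        x: (joints, in_feats) node features
        adj: (joints, joints) skeleton adjacency matrix
        weight: (in_feats, out_feats) learned projection
        """
        a_hat = adj + np.eye(adj.shape[0])   # add self-loops so a joint keeps its own features
        deg = a_hat.sum(axis=1)
        a_norm = a_hat / deg[:, None]        # row-normalize: average over the neighborhood
        return a_norm @ x @ weight           # aggregate neighbors, then project

    # Toy 3-joint chain skeleton: joint 0 - joint 1 - joint 2
    adj = np.array([[0, 1, 0],
                    [1, 0, 1],
                    [0, 1, 0]], dtype=float)
    x = np.arange(6, dtype=float).reshape(3, 2)  # 2 features per joint
    w = np.eye(2)                                # identity projection for illustration
    out = spatial_graph_conv(x, adj, w)
    print(out)  # each row is the mean of the joint's neighborhood features
    ```

    A full ST-GCN stacks such spatial layers with temporal (per-joint, across-frame) convolutions and learns `weight` by backpropagation.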

    Volitional Control of Lower-limb Prosthesis with Vision-assisted Environmental Awareness

    Early and reliable prediction of a user’s intention to change locomotion mode or speed is critical for a smooth and natural lower-limb prosthesis. Meanwhile, incorporating explicit environmental feedback can facilitate a context-aware intelligent prosthesis that allows seamless operation across a variety of gait demands. This dissertation introduces environmental awareness through computer vision and enables early and accurate prediction of the intention to start, stop, or change speed while walking. Electromyography (EMG), Electroencephalography (EEG), Inertial Measurement Unit (IMU), and Ground Reaction Force (GRF) sensors were used to predict the intention to start, stop, or increase walking speed. Furthermore, it was investigated whether external emotional music stimuli could enhance the predictive capability of intention prediction methodologies. Application of advanced machine learning and signal processing techniques to pre-movement EEG resulted in an intention prediction system with low latency, high sensitivity, and low false-positive detection. Affective analysis of EEG suggested that happy music stimuli significantly (
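    One common ingredient in pre-movement EEG analysis of the kind described above is extracting band-power features from short signal windows before feeding them to a classifier. A hedged sketch with a synthetic signal (the 250 Hz sampling rate, the frequency bands, and the 10 Hz test tone are illustrative assumptions, not details from the dissertation):

    ```python
    import numpy as np

    def band_power(signal, fs, band):
        """Mean spectral power of `signal` within frequency `band` = (lo, hi) Hz."""
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].mean()

    fs = 250.0                               # assumed EEG sampling rate (Hz)
    t = np.arange(0, 1.0, 1.0 / fs)
    eeg = np.sin(2 * np.pi * 10 * t)         # synthetic 10 Hz "mu rhythm" oscillation
    mu = band_power(eeg, fs, (8, 12))        # power in the mu band
    beta = band_power(eeg, fs, (18, 26))     # power in a beta band
    print(mu > beta)  # True: the synthetic signal's energy sits in the mu band
    ```

    In a real pipeline these per-band powers, computed over sliding pre-movement windows, would form the feature vector passed to the intention classifier.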

    Analysis of Affective State as Covariate in Human Gait Identification

    There is increased interest in the need for a noninvasive and nonintrusive biometric identification and recognition system such as Automatic Gait Identification (AGI), due to the rise in crime rates in the US, physical assaults, and global terrorism in public places. AGI, a biometric system based on human gait, can recognize people from a distance, and current literature shows that AGI has a 95.75% success rate in a closely controlled laboratory environment. However, this success rate does not take into consideration the effect of covariate factors such as affective state (mood state), and the literature shows that there is a lack of understanding of the effect of affective state on gait biometrics. The purpose of this study was to determine the success rate of AGI in an uncontrolled outdoor environment with affective state as the main variable. Affective state was measured using the Profile of Mood State (POMS) scales. Other covariate factors such as footwear or clothing were not considered in this study. The theoretical framework that grounded this study was Murray's theory of the total walking cycle. This study included the gait signatures of 24 participants from a population of 62 individuals, selected by simple random sampling. This quantitative research used empirical methods and a Fourier Series Analysis. Results showed that AGI has a 75% success rate in an uncontrolled outdoor environment when affective state is present. This study contributes to social change by enhancing the understanding of the effect of affective state on gait biometrics for positive identification during and after a crime, such as a bank robbery, when facial identification from a surveillance camera is either not clear or not possible. This may also be used in other countries to detect suicide bombers from a distance.
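    A Fourier-series gait signature of the kind the study's analysis relies on represents one periodic gait cycle by a few harmonic coefficients. A minimal sketch (the synthetic "knee-angle" cycle and the choice of five harmonics are illustrative assumptions):

    ```python
    import numpy as np

    def fourier_signature(cycle, n_harmonics=5):
        """Return (a_k, b_k) Fourier coefficients for k = 1..n_harmonics of one gait cycle."""
        n = len(cycle)
        t = np.arange(n) / n
        a = [2 / n * np.sum(cycle * np.cos(2 * np.pi * k * t))
             for k in range(1, n_harmonics + 1)]
        b = [2 / n * np.sum(cycle * np.sin(2 * np.pi * k * t))
             for k in range(1, n_harmonics + 1)]
        return np.array(a), np.array(b)

    # Synthetic gait cycle with energy in the first two harmonics only
    n = 200
    t = np.arange(n) / n
    cycle = 1.0 * np.cos(2 * np.pi * t) + 0.5 * np.sin(4 * np.pi * t)
    a, b = fourier_signature(cycle)
    print(np.round(a, 3), np.round(b, 3))  # a_1 = 1.0 and b_2 = 0.5 dominate
    ```

    Comparing such coefficient vectors across recordings is one way a gait signature can be matched from a distance; covariates like mood would show up as perturbations of these coefficients.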

    From pixels to affect : a study on games and player experience

    Is it possible to predict the affect of a user just by observing her behavioral interaction through a video? How can we, for instance, predict a user’s arousal in games by merely looking at the screen during play? In this paper we address these questions by employing three dissimilar deep convolutional neural network architectures in our attempt to learn the underlying mapping between video streams of gameplay and the player’s arousal. We test the algorithms on an annotated dataset of 50 gameplay videos of a survival shooter game and evaluate the deep learned models’ capacity to classify high vs. low arousal levels. Our key findings with the demanding leave-one-video-out validation method reveal accuracies of over 78% on average and 98% at best. While this study focuses on games and player experience as a test domain, the findings and methodology are directly relevant to any affective computing area, introducing a general and user-agnostic approach for modeling affect. This paper is funded, in part, by the H2020 project Com N Play Science (project no: 787476).
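    Leave-one-video-out validation, as used above, holds out all frames from one video as the test fold so that frames from the same video never leak between training and testing. A minimal sketch of the protocol (the nearest-centroid classifier and the toy separable data stand in for the paper's deep CNNs):

    ```python
    import numpy as np

    def leave_one_video_out(features, labels, video_ids):
        """Average test accuracy, holding out one whole video per fold."""
        accuracies = []
        for vid in np.unique(video_ids):
            test = video_ids == vid
            train = ~test
            # Fit a nearest-centroid classifier on the training folds only
            centroids = {c: features[train & (labels == c)].mean(axis=0)
                         for c in np.unique(labels[train])}
            classes = sorted(centroids)
            preds = [min(classes, key=lambda c: np.linalg.norm(x - centroids[c]))
                     for x in features[test]]
            accuracies.append(np.mean(np.array(preds) == labels[test]))
        return float(np.mean(accuracies))

    rng = np.random.default_rng(0)
    # Toy "frame features": two well-separated arousal classes across 4 videos
    features = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(1, 0.1, (20, 2))])
    labels = np.array([0] * 20 + [1] * 20)
    video_ids = np.array([0] * 10 + [1] * 10 + [2] * 10 + [3] * 10)
    acc = leave_one_video_out(features, labels, video_ids)
    print(acc)  # 1.0 on this cleanly separable toy data
    ```

    Grouping folds by video rather than by frame is what makes the protocol "demanding": per-video appearance cues cannot inflate the reported accuracy.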